initial commit

Jörg Prante 2016-11-25 00:06:17 +01:00
commit b41479bf7f
86 changed files with 7211 additions and 0 deletions

12
.gitignore vendored Normal file

@@ -0,0 +1,12 @@
/data
/work
/logs
/.idea
/target
.DS_Store
*.iml
/.settings
/.classpath
/.project
/.gradle
/build

7
.travis.yml Normal file

@@ -0,0 +1,7 @@
sudo: false
language: java
jdk:
- oraclejdk8
cache:
directories:
- $HOME/.m2

202
LICENSE.txt Normal file

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

8
README.adoc Normal file

@@ -0,0 +1,8 @@
= xbib Contextual Query Language Compiler
image:https://api.travis-ci.org/xbib/cql.svg[title="Build status", link="https://travis-ci.org/xbib/cql/"]
image:https://img.shields.io/sonar/http/nemo.sonarqube.com/org.xbib%3Acql/coverage.svg?style=flat-square[title="Coverage", link="https://sonarqube.com/dashboard/index?id=org.xbib%3Acql"]
image:https://maven-badges.herokuapp.com/maven-central/org.xbib/cql/badge.svg[title="Maven Central", link="http://search.maven.org/#search%7Cga%7C1%7Cxbib%20cql"]
image:https://img.shields.io/badge/License-Apache%202.0-blue.svg[title="Apache License 2.0", link="https://opensource.org/licenses/Apache-2.0"]
image:https://img.shields.io/twitter/url/https/twitter.com/xbib.svg?style=social&label=Follow%20%40xbib[title="Twitter", link="https://twitter.com/xbib"]
image:https://www.paypalobjects.com/en_US/i/btn/btn_donateCC_LG.gif[title="PayPal", link="https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=GVHFQYZ9WZ8HG"]

65
build.gradle Normal file

@@ -0,0 +1,65 @@
plugins {
id 'org.xbib.gradle.plugin.jflex' version '1.1.0'
id 'org.xbib.gradle.plugin.jacc' version '1.1.3'
id "org.sonarqube" version '2.2'
}
group = 'org.xbib'
version = '1.0.0'
apply plugin: 'java'
apply plugin: 'maven'
apply plugin: 'signing'
apply plugin: 'findbugs'
apply plugin: 'pmd'
apply plugin: 'checkstyle'
apply plugin: 'jacoco'
repositories {
mavenCentral()
}
configurations {
wagon
}
dependencies {
compile 'org.xbib:content-core:1.0.6'
testCompile 'junit:junit:4.12'
wagon 'org.apache.maven.wagon:wagon-ssh-external:2.10'
}
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = JavaVersion.VERSION_1_8
[compileJava, compileTestJava]*.options*.encoding = 'UTF-8'
tasks.withType(JavaCompile) {
options.compilerArgs << "-Xlint:all" << "-profile" << "compact1"
}
test {
testLogging {
showStandardStreams = false
exceptionFormat = 'full'
}
}
task sourcesJar(type: Jar, dependsOn: classes) {
classifier 'sources'
from sourceSets.main.allSource
}
task javadocJar(type: Jar, dependsOn: javadoc) {
classifier 'javadoc'
from javadoc.destinationDir
}
artifacts {
archives sourcesJar, javadocJar
}
if (project.hasProperty('signing.keyId')) {
signing {
sign configurations.archives
}
}
apply from: 'gradle/ext.gradle'
apply from: 'gradle/publish.gradle'
apply from: 'gradle/sonarqube.gradle'


@@ -0,0 +1,323 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE module PUBLIC
"-//Puppy Crawl//DTD Check Configuration 1.3//EN"
"http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
<!-- This is a checkstyle configuration file. For descriptions of
what the following rules do, please see the checkstyle configuration
page at http://checkstyle.sourceforge.net/config.html -->
<module name="Checker">
<module name="FileTabCharacter">
<!-- Checks that there are no tab characters in the file.
-->
</module>
<module name="NewlineAtEndOfFile">
<property name="lineSeparator" value="lf"/>
</module>
<module name="RegexpSingleline">
<!-- Checks that FIXME is not used in comments. TODO is preferred.
-->
<property name="format" value="((//.*)|(\*.*))FIXME" />
<property name="message" value='TODO is preferred to FIXME. e.g. "TODO(johndoe): Refactor when v2 is released."' />
</module>
<module name="RegexpSingleline">
<!-- Checks that TODOs are named. (Actually, just that they are followed
by an open paren.)
-->
<property name="format" value="((//.*)|(\*.*))TODO[^(]" />
<property name="message" value='All TODOs should be named. e.g. "TODO(johndoe): Refactor when v2 is released."' />
</module>
<module name="JavadocPackage">
<!-- Checks that each Java package has a Javadoc file used for commenting.
Only allows a package-info.java, not package.html. -->
</module>
<!-- All Java AST specific tests live under TreeWalker module. -->
<module name="TreeWalker">
<!--
IMPORT CHECKS
-->
<module name="RedundantImport">
<!-- Checks for redundant import statements. -->
<property name="severity" value="error"/>
</module>
<module name="ImportOrder">
<!-- Checks for out of order import statements. -->
<property name="severity" value="warning"/>
<property name="groups" value="com,junit,net,org,java,javax"/>
<!-- This ensures that static imports go first. -->
<property name="option" value="top"/>
<property name="tokens" value="STATIC_IMPORT, IMPORT"/>
</module>
<!--
JAVADOC CHECKS
-->
<!-- Checks for Javadoc comments. -->
<!-- See http://checkstyle.sf.net/config_javadoc.html -->
<module name="JavadocMethod">
<property name="scope" value="protected"/>
<property name="severity" value="warning"/>
<property name="allowMissingJavadoc" value="true"/>
<property name="allowMissingParamTags" value="true"/>
<property name="allowMissingReturnTag" value="true"/>
<property name="allowMissingThrowsTags" value="true"/>
<property name="allowThrowsTagsForSubclasses" value="true"/>
<property name="allowUndeclaredRTE" value="true"/>
</module>
<module name="JavadocType">
<property name="scope" value="protected"/>
<property name="severity" value="error"/>
</module>
<module name="JavadocStyle">
<property name="severity" value="warning"/>
</module>
<!--
NAMING CHECKS
-->
<!-- Item 38 - Adhere to generally accepted naming conventions -->
<module name="PackageName">
<!-- Validates identifiers for package names against the
supplied expression. -->
<!-- Here the default checkstyle rule restricts package name parts to
seven characters, this is not in line with common practice at Google.
-->
<property name="format" value="^[a-z]+(\.[a-z][a-z0-9]{1,})*$"/>
<property name="severity" value="warning"/>
</module>
<module name="TypeNameCheck">
<!-- Validates identifiers for class, interface, and enum type names
against the expression "^[A-Z][a-zA-Z0-9]*$". -->
<metadata name="altname" value="TypeName"/>
<property name="severity" value="warning"/>
</module>
<module name="ConstantNameCheck">
<!-- Validates non-private, static, final fields against the supplied
expression "^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$". -->
<metadata name="altname" value="ConstantName"/>
<property name="applyToPublic" value="true"/>
<property name="applyToProtected" value="true"/>
<property name="applyToPackage" value="true"/>
<property name="applyToPrivate" value="false"/>
<property name="format" value="^([A-Z][A-Z0-9]*(_[A-Z0-9]+)*|FLAG_.*)$"/>
<message key="name.invalidPattern"
value="Variable ''{0}'' should be in ALL_CAPS (if it is a constant) or be private (otherwise)."/>
<property name="severity" value="warning"/>
</module>
<module name="StaticVariableNameCheck">
<!-- Validates static, non-final fields against the supplied
expression "^[a-z][a-zA-Z0-9]*_?$". -->
<metadata name="altname" value="StaticVariableName"/>
<property name="applyToPublic" value="true"/>
<property name="applyToProtected" value="true"/>
<property name="applyToPackage" value="true"/>
<property name="applyToPrivate" value="true"/>
<property name="format" value="^[a-z][a-zA-Z0-9]*_?$"/>
<property name="severity" value="warning"/>
</module>
<module name="MemberNameCheck">
<!-- Validates non-static members against the supplied expression. -->
<metadata name="altname" value="MemberName"/>
<property name="applyToPublic" value="true"/>
<property name="applyToProtected" value="true"/>
<property name="applyToPackage" value="true"/>
<property name="applyToPrivate" value="true"/>
<property name="format" value="^[a-z][a-zA-Z0-9]*$"/>
<property name="severity" value="warning"/>
</module>
<module name="MethodNameCheck">
<!-- Validates identifiers for method names. -->
<metadata name="altname" value="MethodName"/>
<property name="format" value="^[a-z][a-zA-Z0-9]*(_[a-zA-Z0-9]+)*$"/>
<property name="severity" value="warning"/>
</module>
<module name="ParameterName">
<!-- Validates identifiers for method parameters against the
expression "^[a-z][a-zA-Z0-9]*$". -->
<property name="severity" value="warning"/>
</module>
<module name="LocalFinalVariableName">
<!-- Validates identifiers for local final variables against the
expression "^[a-z][a-zA-Z0-9]*$". -->
<property name="severity" value="warning"/>
</module>
<module name="LocalVariableName">
<!-- Validates identifiers for local variables against the
expression "^[a-z][a-zA-Z0-9]*$". -->
<property name="severity" value="warning"/>
</module>
<!--
LENGTH and CODING CHECKS
-->
<module name="LineLength">
<!-- Checks if a line is too long. -->
<property name="max" value="${com.puppycrawl.tools.checkstyle.checks.sizes.LineLength.max}" default="128"/>
<property name="severity" value="error"/>
<!--
The default ignore pattern exempts the following elements:
- import statements
- long URLs inside comments
-->
<property name="ignorePattern"
value="${com.puppycrawl.tools.checkstyle.checks.sizes.LineLength.ignorePattern}"
default="^(package .*;\s*)|(import .*;\s*)|( *(\*|//).*https?://.*)$"/>
</module>
<module name="LeftCurly">
<!-- Checks for placement of the left curly brace ('{'). -->
<property name="severity" value="warning"/>
</module>
<module name="RightCurly">
<!-- Checks right curlies on CATCH, ELSE, and TRY blocks are on
the same line. e.g., the following example is fine:
<pre>
if {
...
} else
</pre>
-->
<!-- This next example is not fine:
<pre>
if {
...
}
else
</pre>
-->
<property name="option" value="same"/>
<property name="severity" value="warning"/>
</module>
<!-- Checks for braces around if and else blocks -->
<module name="NeedBraces">
<property name="severity" value="warning"/>
<property name="tokens" value="LITERAL_IF, LITERAL_ELSE, LITERAL_FOR, LITERAL_WHILE, LITERAL_DO"/>
</module>
<module name="UpperEll">
<!-- Checks that long constants are defined with an upper ell.-->
<property name="severity" value="error"/>
</module>
<module name="FallThrough">
<!-- Warn about falling through to the next case statement. Similar to
javac -Xlint:fallthrough, but the check is suppressed if a single-line comment
on the last non-blank line preceding the fallen-into case contains 'fall through' (or
some other variants which we don't publicize, to promote consistency).
-->
<property name="reliefPattern"
value="fall through|Fall through|fallthru|Fallthru|falls through|Falls through|fallthrough|Fallthrough|No break|NO break|no break|continue on"/>
<property name="severity" value="error"/>
</module>
<!--
MODIFIERS CHECKS
-->
<module name="ModifierOrder">
<!-- Warn if modifier order is inconsistent with JLS3 8.1.1, 8.3.1, and
8.4.3. The prescribed order is:
public, protected, private, abstract, static, final, transient, volatile,
synchronized, native, strictfp
-->
</module>
<!--
WHITESPACE CHECKS
-->
<module name="WhitespaceAround">
<!-- Checks that various tokens are surrounded by whitespace.
This includes most binary operators and keywords followed
by regular or curly braces.
-->
<property name="tokens" value="ASSIGN, BAND, BAND_ASSIGN, BOR,
BOR_ASSIGN, BSR, BSR_ASSIGN, BXOR, BXOR_ASSIGN, COLON, DIV, DIV_ASSIGN,
EQUAL, GE, GT, LAND, LE, LITERAL_CATCH, LITERAL_DO, LITERAL_ELSE,
LITERAL_FINALLY, LITERAL_FOR, LITERAL_IF, LITERAL_RETURN,
LITERAL_SYNCHRONIZED, LITERAL_TRY, LITERAL_WHILE, LOR, LT, MINUS,
MINUS_ASSIGN, MOD, MOD_ASSIGN, NOT_EQUAL, PLUS, PLUS_ASSIGN, QUESTION,
SL, SL_ASSIGN, SR_ASSIGN, STAR, STAR_ASSIGN"/>
<property name="severity" value="error"/>
</module>
<module name="WhitespaceAfter">
<!-- Checks that commas, semicolons and typecasts are followed by
whitespace.
-->
<property name="tokens" value="COMMA, SEMI, TYPECAST"/>
</module>
<module name="NoWhitespaceAfter">
<!-- Checks that there is no whitespace after various unary operators.
Linebreaks are allowed.
-->
<property name="tokens" value="BNOT, DEC, DOT, INC, LNOT, UNARY_MINUS,
UNARY_PLUS"/>
<property name="allowLineBreaks" value="true"/>
<property name="severity" value="error"/>
</module>
<module name="NoWhitespaceBefore">
<!-- Checks that there is no whitespace before various unary operators.
Linebreaks are allowed.
-->
<property name="tokens" value="SEMI, DOT, POST_DEC, POST_INC"/>
<property name="allowLineBreaks" value="true"/>
<property name="severity" value="error"/>
</module>
<module name="ParenPad">
<!-- Checks that there is no whitespace before close parens or after
open parens.
-->
<property name="severity" value="warning"/>
</module>
</module>
</module>
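For illustration, a hypothetical switch that the FallThrough check configured above would accept: the single-line "fall through" comment on the last line before the fallen-into case matches the configured `reliefPattern`, so no violation is reported.

```java
// Hypothetical example (not part of this commit). The "fall through"
// comment below matches the FallThrough check's reliefPattern, so the
// intentional continuation from case 1 into case 2 is not flagged.
class FallThroughExample {
    static int classify(int code) {
        int score = 0;
        switch (code) {
            case 1:
                score += 1;
                // fall through
            case 2:
                score += 2;
                break;
            default:
                score = -1;
        }
        return score;
    }
}
```

Without the relief comment, the same code would trigger an error-severity violation under this configuration.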

8
gradle/ext.gradle Normal file

@@ -0,0 +1,8 @@
ext {
user = 'xbib'
projectName = 'cql'
projectDescription = 'Contextual Query Language compiler for Java'
scmUrl = 'https://github.com/xbib/cql'
scmConnection = 'scm:git:git://github.com/xbib/cql.git'
scmDeveloperConnection = 'scm:git:git://github.com/xbib/cql.git'
}

66
gradle/publish.gradle Normal file

@@ -0,0 +1,66 @@
task xbibUpload(type: Upload, dependsOn: build) {
configuration = configurations.archives
uploadDescriptor = true
repositories {
if (project.hasProperty('xbibUsername')) {
mavenDeployer {
configuration = configurations.wagon
repository(url: uri('scpexe://xbib.org/repository')) {
authentication(userName: xbibUsername, privateKey: xbibPrivateKey)
}
}
}
}
}
task sonatypeUpload(type: Upload, dependsOn: build) {
configuration = configurations.archives
uploadDescriptor = true
repositories {
if (project.hasProperty('ossrhUsername')) {
mavenDeployer {
beforeDeployment { MavenDeployment deployment -> signing.signPom(deployment) }
repository(url: uri(ossrhReleaseUrl)) {
authentication(userName: ossrhUsername, password: ossrhPassword)
}
snapshotRepository(url: uri(ossrhSnapshotUrl)) {
authentication(userName: ossrhUsername, password: ossrhPassword)
}
pom.project {
groupId project.group
artifactId project.name
version project.version
name project.name
description projectDescription
packaging 'jar'
inceptionYear '2016'
url scmUrl
organization {
name 'xbib'
url 'http://xbib.org'
}
developers {
developer {
id user
name 'Jörg Prante'
email 'joergprante@gmail.com'
url 'https://github.com/jprante'
}
}
scm {
url scmUrl
connection scmConnection
developerConnection scmDeveloperConnection
}
licenses {
license {
name 'The Apache License, Version 2.0'
url 'http://www.apache.org/licenses/LICENSE-2.0.txt'
}
}
}
}
}
}
}

41
gradle/sonarqube.gradle Normal file

@@ -0,0 +1,41 @@
tasks.withType(FindBugs) {
ignoreFailures = true
reports {
xml.enabled = true
html.enabled = false
}
}
tasks.withType(Pmd) {
ignoreFailures = true
reports {
xml.enabled = true
html.enabled = true
}
}
tasks.withType(Checkstyle) {
ignoreFailures = true
reports {
xml.enabled = true
html.enabled = true
}
}
jacocoTestReport {
reports {
xml.enabled true
csv.enabled false
xml.destination "${buildDir}/reports/jacoco-xml"
html.destination "${buildDir}/reports/jacoco-html"
}
}
sonarqube {
properties {
property "sonar.projectName", "${project.group} ${project.name}"
property "sonar.sourceEncoding", "UTF-8"
property "sonar.tests", "src/test/java"
property "sonar.scm.provider", "git"
property "sonar.java.coveragePlugin", "jacoco"
property "sonar.junit.reportsPath", "build/test-results/test/"
}
}

BIN
gradle/wrapper/gradle-wrapper.jar vendored Normal file

Binary file not shown.


@@ -0,0 +1,6 @@
#Thu Nov 24 21:44:19 CET 2016
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-2.13-bin.zip

164
gradlew vendored Executable file

@@ -0,0 +1,164 @@
#!/usr/bin/env bash
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn ( ) {
echo "$*"
}
die ( ) {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Split up the JVM_OPTS and GRADLE_OPTS values into an array, following the shell quoting and substitution rules
function splitJvmOpts() {
JVM_OPTS=("$@")
}
eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"

90
gradlew.bat vendored Normal file

@@ -0,0 +1,90 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

settings.gradle Normal file

@@ -0,0 +1 @@
rootProject.name = 'cql'

File diff suppressed because one or more lines are too long

@@ -0,0 +1,26 @@
package org.xbib.cql;
/**
* This abstract node class is the base class for the CQL abstract syntax tree.
*/
public abstract class AbstractNode implements Node {
/**
* Accept a visitor on this node.
*
* @param visitor the visitor
*/
@Override
public abstract void accept(Visitor visitor);
/**
* Compare this node to another node.
*/
@Override
public int compareTo(Node object) {
if (this == object) {
return 0;
}
return toString().compareTo(object.toString());
}
}

@@ -0,0 +1,38 @@
package org.xbib.cql;
/**
* Abstract syntax tree of CQL - Boolean Group.
*/
public class BooleanGroup extends AbstractNode {
private BooleanOperator op;
private ModifierList modifiers;
BooleanGroup(BooleanOperator op, ModifierList modifiers) {
this.op = op;
this.modifiers = modifiers;
}
BooleanGroup(BooleanOperator op) {
this.op = op;
}
public BooleanOperator getOperator() {
return op;
}
public ModifierList getModifierList() {
return modifiers;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return op != null && modifiers != null ? op + modifiers.toString()
: op != null ? op.toString() : null;
}
}

@@ -0,0 +1,80 @@
package org.xbib.cql;
import java.util.HashMap;
import java.util.Map;
/**
* Abstract syntax tree of CQL - boolean operator enumeration.
*/
public enum BooleanOperator {
AND("and"),
OR("or"),
NOT("not"),
PROX("prox");
/**
* Token/operator map.
*/
private static Map<String, BooleanOperator> tokenMap;
/**
* Operator/token map.
*/
private static Map<BooleanOperator, String> opMap;
private String token;
/**
* Creates a new Operator object.
*
* @param token the operator token
*/
BooleanOperator(String token) {
this.token = token;
map(token, this);
}
/**
* Map token to operator.
*
* @param token the token
* @param op the operator
*/
private static void map(String token, BooleanOperator op) {
if (tokenMap == null) {
tokenMap = new HashMap<>();
}
tokenMap.put(token, op);
if (opMap == null) {
opMap = new HashMap<>();
}
opMap.put(op, token);
}
/**
* Get token.
*
* @return the token
*/
public String getToken() {
return token;
}
/**
* Get operator for token.
*
* @param token the token
* @return the operator
*/
static BooleanOperator forToken(Object token) {
return tokenMap.get(token.toString().toLowerCase());
}
/**
* Write operator representation.
*
* @return the operator token
*/
@Override
public String toString() {
return token;
}
}
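A subtlety in the enum above: the token map is created lazily inside a static helper called from the constructor. Enum constants are constructed before the enum's static initializers run, so a field initializer like `tokenMap = new HashMap<>()` would execute after the constants and wipe the already-populated map; the JLS also forbids enum constructors from referencing the enum's non-constant static fields directly, hence the helper method. A minimal standalone sketch of the same pattern, with a hypothetical enum name `Op`:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch (hypothetical enum name) mirroring BooleanOperator's
// lazy token-map initialization pattern.
enum Op {
    AND("and"), OR("or");

    private static Map<String, Op> tokenMap;
    private final String token;

    Op(String token) {
        this.token = token;
        map(token, this);
    }

    // Called from the constructor: constants are constructed before static
    // initializers run, so the map must be created lazily here; constructors
    // may not touch the static field directly, hence this static helper.
    private static void map(String token, Op op) {
        if (tokenMap == null) {
            tokenMap = new HashMap<>();
        }
        tokenMap.put(token, op);
    }

    static Op forToken(String token) {
        return tokenMap.get(token.toLowerCase());
    }

    String getToken() {
        return token;
    }
}
```

`Comparitor` below relies on the same static-helper trick for its token map.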

@@ -0,0 +1,224 @@
package org.xbib.cql;
import org.xbib.cql.model.CQLQueryModel;
import org.xbib.cql.model.Facet;
import org.xbib.cql.model.Filter;
import org.xbib.cql.model.Option;
/**
* This is a CQL abstract syntax tree generator useful for normalizing CQL queries.
*/
public final class CQLGenerator implements Visitor {
/**
* Helper for managing the CQL query model (facet/filter/option contexts, breadcrumb trails, etc.).
*/
private CQLQueryModel model;
/**
* A replacement string.
*/
private String replacementString;
/**
* String to be replaced.
*/
private String stringToBeReplaced;
public CQLGenerator() {
this.replacementString = null;
this.stringToBeReplaced = null;
this.model = new CQLQueryModel();
}
public CQLGenerator model(CQLQueryModel model) {
this.model = model;
return this;
}
public CQLQueryModel getModel() {
return model;
}
public String getResult() {
return model.getQuery();
}
@Override
public void visit(SortedQuery node) {
if (node.getSortSpec() != null) {
node.getSortSpec().accept(this);
}
if (node.getQuery() != null) {
node.getQuery().accept(this);
}
model.setQuery(node.toString());
}
@Override
public void visit(Query node) {
if (node.getPrefixAssignments() != null) {
for (PrefixAssignment assignment : node.getPrefixAssignments()) {
assignment.accept(this);
}
}
if (node.getQuery() != null) {
node.getQuery().accept(this);
}
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
}
@Override
public void visit(SortSpec node) {
if (node.getSingleSpec() != null) {
node.getSingleSpec().accept(this);
}
if (node.getSortSpec() != null) {
node.getSortSpec().accept(this);
}
}
@Override
public void visit(SingleSpec node) {
if (node.getIndex() != null) {
node.getIndex().accept(this);
}
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
}
@Override
public void visit(PrefixAssignment node) {
node.getPrefix().accept(this);
node.getURI().accept(this);
}
@Override
public void visit(ScopedClause node) {
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
node.getSearchClause().accept(this);
if (node.getBooleanGroup() != null) {
node.getBooleanGroup().accept(this);
BooleanOperator op = node.getBooleanGroup().getOperator();
checkFilter(op, node);
checkFilter(op, node.getScopedClause());
}
}
@Override
public void visit(BooleanGroup node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
}
@Override
public void visit(SearchClause node) {
if (node.getQuery() != null) {
node.getQuery().accept(this);
}
if (node.getTerm() != null) {
node.getTerm().accept(this);
}
if (node.getIndex() != null) {
node.getIndex().accept(this);
String context = node.getIndex().getContext();
if (CQLQueryModel.FACET_INDEX_NAME.equals(context)) {
Facet<Term> facet = new Facet<>(node.getIndex().getName());
facet.setValue(node.getTerm());
model.addFacet(facet);
} else if (CQLQueryModel.OPTION_INDEX_NAME.equals(context)) {
Option<Term> option = new Option<>();
option.setName(node.getIndex().getName());
option.setValue(node.getTerm());
model.addOption(option);
}
}
if (node.getRelation() != null) {
node.getRelation().accept(this);
}
}
@Override
public void visit(Relation node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
}
@Override
public void visit(Modifier node) {
if (node.getTerm() != null) {
node.getTerm().accept(this);
}
if (node.getName() != null) {
node.getName().accept(this);
}
}
@Override
public void visit(ModifierList node) {
for (Modifier modifier : node.getModifierList()) {
modifier.accept(this);
}
}
@Override
public void visit(Term node) {
if (replacementString != null && stringToBeReplaced.equals(node.getValue())) {
node.setValue(replacementString);
}
}
@Override
public void visit(Identifier node) {
}
@Override
public void visit(SimpleName node) {
}
@Override
public void visit(Index node) {
}
/**
* Write a substitution query, for example when a term has been
* suggested to be replaced by another term.
*
* @param oldTerm the term to be replaced
* @param newTerm the replacement term
* @return the new query with the term replaced
*/
public synchronized String writeSubstitutedForm(String oldTerm, String newTerm) {
this.stringToBeReplaced = oldTerm;
this.replacementString = newTerm;
CQLParser parser = new CQLParser(model.getQuery());
parser.parse();
parser.getCQLQuery().accept(this);
String result = model.getQuery();
this.stringToBeReplaced = null;
this.replacementString = null;
return result;
}
public String withBreadcrumbs() {
return model.toCQL();
}
private void checkFilter(BooleanOperator op, ScopedClause node) {
if (node.getSearchClause().getIndex() != null
&& CQLQueryModel.FILTER_INDEX_NAME.equals(node.getSearchClause().getIndex().getContext())) {
String filtername = node.getSearchClause().getIndex().getName();
Comparitor filterop = node.getSearchClause().getRelation().getComparitor();
Term filterterm = node.getSearchClause().getTerm();
Filter<AbstractNode> filter2 = new Filter<>(filtername, filterterm, filterop);
model.addFilter(op, filter2);
}
}
}

@@ -0,0 +1,80 @@
package org.xbib.cql;
import java.util.HashMap;
/**
* CQL operators.
*/
public enum Comparitor {
EQUALS("="),
GREATER(">"),
GREATER_EQUALS(">="),
LESS("<"),
LESS_EQUALS("<="),
NOT_EQUALS("<>"),
WITHIN("within"),
CQLWITHIN("cql.within"),
ENCLOSES("encloses"),
CQLENCLOSES("cql.encloses"),
ADJ("adj"),
CQLADJ("cql.adj"),
ALL("all"),
CQLALL("cql.all"),
ANY("any"),
CQLANY("cql.any");
private static HashMap<String, Comparitor> tokenMap;
private String token;
/**
* Creates a new Operator object.
*
* @param token the operator token
*/
Comparitor(String token) {
this.token = token;
map(token, this);
}
/**
* Map token to operator.
*
* @param token the token
* @param op the operator
*/
private static void map(String token, Comparitor op) {
if (tokenMap == null) {
tokenMap = new HashMap<>();
}
tokenMap.put(token, op);
}
/**
* Get token.
*
* @return the token
*/
public String getToken() {
return token;
}
/**
* Get operator for token.
*
* @param token the token
* @return the operator
*/
static Comparitor forToken(Object token) {
return tokenMap.get(token.toString());
}
/**
* Write operator representation.
*
* @return the operator token
*/
@Override
public String toString() {
return token;
}
}

@@ -0,0 +1,33 @@
package org.xbib.cql;
/**
* An Identifier is a SimpleName or a String in double quotes.
*/
public class Identifier extends AbstractNode {
private String value;
private boolean quoted;
public Identifier(String value) {
this.value = value;
this.quoted = true;
}
public Identifier(SimpleName name) {
this.value = name.getName();
this.quoted = false;
}
public String getValue() {
return value;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return value != null && quoted ? "\"" + value.replaceAll("\"", "\\\\\"") + "\"" : value;
}
}
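The quoting rule in `toString()` above can be shown in isolation. A minimal standalone sketch (hypothetical class name, mirroring the `replaceAll` escaping in `Identifier`):

```java
// Minimal sketch (hypothetical class name) of Identifier's rendering rule:
// quoted values are wrapped in double quotes, with any embedded double
// quotes backslash-escaped via replaceAll.
class IdentifierSketch {
    static String render(String value, boolean quoted) {
        return value != null && quoted
                ? "\"" + value.replaceAll("\"", "\\\\\"") + "\""
                : value;
    }
}
```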

@@ -0,0 +1,51 @@
package org.xbib.cql;
/**
* Abstract syntax tree of CQL - Index.
* The Index consists of <b>context</b> and <b>name</b>.
* The default context is "cql"; a context is similar in concept to a namespace.
*/
public class Index extends AbstractNode {
private String context;
private String name;
public Index(String name) {
this.name = name;
int pos = name.indexOf('.');
if (pos > 0) {
this.context = name.substring(0, pos);
this.name = name.substring(pos + 1);
}
}
public Index(SimpleName name) {
this(name.getName());
}
/**
* @return the context of the index
*/
public String getContext() {
return context;
}
/**
* Get the name of the index.
*
* @return the name of the index
*/
public String getName() {
return name;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return context != null ? context + "." + name : name;
}
}
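The context/name split in the `Index(String)` constructor can be exercised with a minimal standalone sketch (hypothetical class name; it mirrors the constructor and `toString()` above): everything before the first dot becomes the context, the rest the name.

```java
// Minimal sketch of the Index constructor's context/name split
// (hypothetical class name; mirrors org.xbib.cql.Index).
class IndexSketch {
    final String context; // null when the raw name has no "<context>." prefix
    final String name;

    IndexSketch(String raw) {
        int pos = raw.indexOf('.');
        if (pos > 0) {
            this.context = raw.substring(0, pos);
            this.name = raw.substring(pos + 1);
        } else {
            this.context = null;
            this.name = raw;
        }
    }

    @Override
    public String toString() {
        return context != null ? context + "." + name : name;
    }
}
```

Note that only the first dot splits, so `"dc.title.sub"` yields context `"dc"` and name `"title.sub"`.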

@@ -0,0 +1,45 @@
package org.xbib.cql;
/**
* Modifier.
*/
public class Modifier extends AbstractNode {
private SimpleName name;
private Comparitor op;
private Term term;
public Modifier(SimpleName name, Comparitor op, Term term) {
this.name = name;
this.op = op;
this.term = term;
}
public Modifier(SimpleName name) {
this.name = name;
}
public SimpleName getName() {
return name;
}
public Comparitor getOperator() {
return op;
}
public Term getTerm() {
return term;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return "/" + (term != null ? name.toString() + op + term : name.toString());
}
}

@@ -0,0 +1,39 @@
package org.xbib.cql;
import java.util.LinkedList;
import java.util.List;
/**
* Modifier list. This is a recursive data structure with a Modifier and optionally a ModifierList.
*/
public class ModifierList extends AbstractNode {
private List<Modifier> modifierList = new LinkedList<>();
public ModifierList(ModifierList modifiers, Modifier modifier) {
modifierList.addAll(modifiers.modifierList);
modifierList.add(modifier);
}
public ModifierList(Modifier modifier) {
modifierList.add(modifier);
}
public List<Modifier> getModifierList() {
return modifierList;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
for (Modifier m : modifierList) {
sb.append(m.toString());
}
return sb.toString();
}
}

@@ -0,0 +1,14 @@
package org.xbib.cql;
/**
* This is a node interface for the CQL abstract syntax tree.
*/
public interface Node extends Comparable<Node> {
/**
* Accept a visitor on this node.
*
* @param visitor the visitor
*/
void accept(Visitor visitor);
}

@@ -0,0 +1,38 @@
package org.xbib.cql;
/**
* Prefix assignment.
*/
public class PrefixAssignment extends AbstractNode {
private Term prefix;
private Term uri;
public PrefixAssignment(Term prefix, Term uri) {
this.prefix = prefix;
this.uri = uri;
}
public PrefixAssignment(Term uri) {
this.uri = uri;
}
public Term getPrefix() {
return prefix;
}
public Term getURI() {
return uri;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return "> " + prefix + " = " + uri;
}
}

@@ -0,0 +1,57 @@
package org.xbib.cql;
import java.util.LinkedList;
import java.util.List;
/**
* CQL query.
*/
public class Query extends AbstractNode {
private List<PrefixAssignment> prefixes = new LinkedList<>();
private Query query;
private ScopedClause clause;
Query(PrefixAssignment assignment, Query query) {
prefixes.add(assignment);
this.query = query;
}
Query(ScopedClause clause) {
this.clause = clause;
}
public List<PrefixAssignment> getPrefixAssignments() {
return prefixes;
}
public Query getQuery() {
return query;
}
public ScopedClause getScopedClause() {
return clause;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
for (PrefixAssignment assignment : prefixes) {
sb.append(assignment.toString()).append(' ');
}
if (query != null) {
sb.append(query);
}
if (clause != null) {
sb.append(clause);
}
return sb.toString();
}
}

@@ -0,0 +1,21 @@
package org.xbib.cql;
/**
* Query facet.
*/
public interface QueryFacet<V> extends QueryOption<V> {
/**
* The size of the facet.
*
* @return the facet size
*/
int getSize();
/**
* Get the filter name which must be used for filtering facet entries.
*
* @return the filter name
*/
String getFilterName();
}

@@ -0,0 +1,7 @@
package org.xbib.cql;
/**
* A Filter for a query.
*/
public interface QueryFilter<V> extends QueryOption<V> {
}

@@ -0,0 +1,16 @@
package org.xbib.cql;
/**
* Query option.
* @param <V> parameter type
*/
public interface QueryOption<V> {
void setName(String name);
String getName();
void setValue(V value);
V getValue();
}

@@ -0,0 +1,39 @@
package org.xbib.cql;
/**
* Relation to a ModifierList.
*/
public class Relation extends AbstractNode {
private Comparitor comparitor;
private ModifierList modifiers;
public Relation(Comparitor comparitor, ModifierList modifiers) {
this.comparitor = comparitor;
this.modifiers = modifiers;
}
public Relation(Comparitor comparitor) {
this.comparitor = comparitor;
}
public Comparitor getComparitor() {
return comparitor;
}
public ModifierList getModifierList() {
return modifiers;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return modifiers != null ? comparitor + modifiers.toString()
: comparitor.toString();
}
}

@@ -0,0 +1,50 @@
package org.xbib.cql;
/**
* Scoped clause. This is a recursive data structure with a SearchClause and
* optionally a ScopedClause.
* SearchClause and ScopedClause are connected through a BooleanGroup.
*/
public class ScopedClause extends AbstractNode {
private ScopedClause clause;
private BooleanGroup booleangroup;
private SearchClause search;
ScopedClause(ScopedClause clause, BooleanGroup bg, SearchClause search) {
this.clause = clause;
this.booleangroup = bg;
this.search = search;
}
ScopedClause(SearchClause search) {
this.search = search;
}
public ScopedClause getScopedClause() {
return clause;
}
public BooleanGroup getBooleanGroup() {
return booleangroup;
}
public SearchClause getSearchClause() {
return search;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
String s = search.toString();
boolean hasQuery = s.length() > 0;
return clause != null && hasQuery ? clause + " " + booleangroup + " " + search
: clause != null ? clause.toString()
: hasQuery ? search.toString()
: "";
}
}

@@ -0,0 +1,62 @@
package org.xbib.cql;
import org.xbib.cql.model.CQLQueryModel;
/**
* Search clause.
*/
public class SearchClause extends AbstractNode {
private Query query;
private Index index;
private Relation relation;
private Term term;
SearchClause(Query query) {
this.query = query;
}
SearchClause(Index index, Relation relation, Term term) {
this.index = index;
this.relation = relation;
this.term = term;
}
SearchClause(Term term) {
this.term = term;
}
public Query getQuery() {
return query;
}
public Index getIndex() {
return index;
}
public Term getTerm() {
return term;
}
public Relation getRelation() {
return relation;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
/**
* @return CQL string
*/
@Override
public String toString() {
return query != null && query.toString().length() > 0 ? "(" + query + ")"
: query != null ? ""
: index != null && !CQLQueryModel.isVisible(index.getContext()) ? ""
: index != null ? index + " " + relation + " " + term
: term != null ? term.toString()
: null;
}
}

@@ -0,0 +1,28 @@
package org.xbib.cql;
/**
* A SimpleName consists of a String which is not surrounded by double quotes.
*/
public class SimpleName extends AbstractNode {
private String name;
public SimpleName(String name) {
this.name = name;
}
public String getName() {
return name;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return name;
}
}

@@ -0,0 +1,37 @@
package org.xbib.cql;
/**
* Single spec.
*/
public class SingleSpec extends AbstractNode {
private Index index;
private ModifierList modifiers;
public SingleSpec(Index index, ModifierList modifiers) {
this.index = index;
this.modifiers = modifiers;
}
public SingleSpec(Index index) {
this.index = index;
}
public Index getIndex() {
return index;
}
public ModifierList getModifierList() {
return modifiers;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return index + (modifiers != null ? modifiers.toString() : "");
}
}

@@ -0,0 +1,38 @@
package org.xbib.cql;
/**
* Abstract syntax tree of CQL, the sort specification.
*/
public class SortSpec extends AbstractNode {
private SortSpec sortspec;
private SingleSpec spec;
public SortSpec(SortSpec sortspec, SingleSpec spec) {
this.sortspec = sortspec;
this.spec = spec;
}
public SortSpec(SingleSpec spec) {
this.spec = spec;
}
public SortSpec getSortSpec() {
return sortspec;
}
public SingleSpec getSingleSpec() {
return spec;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return (sortspec != null ? sortspec + " " : "") + spec;
}
}

@@ -0,0 +1,39 @@
package org.xbib.cql;
/**
* Sorted query.
*/
public class SortedQuery extends AbstractNode {
private Query query;
private SortSpec spec;
SortedQuery(Query query, SortSpec spec) {
this.query = query;
this.spec = spec;
}
SortedQuery(Query query) {
this.query = query;
}
public Query getQuery() {
return query;
}
public SortSpec getSortSpec() {
return spec;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return query != null && spec != null ? query + " sortby " + spec
: query != null ? query.toString() : "";
}
}

@@ -0,0 +1,25 @@
package org.xbib.cql;
/**
* CQL Syntax exception.
*/
public class SyntaxException extends RuntimeException {
/**
* Creates a new SyntaxException object.
*
* @param msg the message for this syntax exception
*/
public SyntaxException(String msg) {
super(msg);
}
/**
* Creates a new SyntaxException object.
*
* @param msg the message for this syntax exception
* @param t the throwable for this syntax exception
*/
public SyntaxException(String msg, Throwable t) {
super(msg, t);
}
}

@@ -0,0 +1,147 @@
package org.xbib.cql;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;
/**
* A CQL Term.
*/
public class Term extends AbstractNode {
private static final TimeZone tz = TimeZone.getTimeZone("GMT");
private static final String ISO_FORMAT_SECONDS = "yyyy-MM-dd'T'HH:mm:ss'Z'";
private static final String ISO_FORMAT_DAYS = "yyyy-MM-dd";
private String value;
private Long longvalue;
private Double doublevalue;
private Identifier identifier;
private Date datevalue;
private SimpleName name;
public Term(String value) {
this.value = value;
// check for hidden dates; CQL itself does not support ISO dates.
// parseDateISO returns null instead of throwing when the value is not a date,
// so only clear the string value when a date was actually recognized
this.datevalue = parseDateISO(value);
if (this.datevalue != null) {
this.value = null;
}
}
public Term(Identifier identifier) {
this.identifier = identifier;
}
public Term(SimpleName name) {
this.name = name;
}
public Term(Long value) {
this.longvalue = value;
}
public Term(Double value) {
this.doublevalue = value;
}
/**
* Set the value, useful for inline replacements
* in spellcheck suggestions.
*
* @param value the value
*/
public void setValue(String value) {
this.value = value;
}
/**
* Get the value as a String. Unlike {@link #toString()}, String values are
* returned without surrounding quotation marks; Long and Double values are
* converted to their decimal representation.
*
* @return the value as String
*/
public String getValue() {
return longvalue != null ? Long.toString(longvalue)
: doublevalue != null ? Double.toString(doublevalue)
: value != null ? value
: identifier != null ? identifier.toString()
: name != null ? name.toString()
: null;
}
public boolean isLong() {
return longvalue != null;
}
public boolean isFloat() {
return doublevalue != null;
}
public boolean isString() {
return value != null;
}
public boolean isName() {
return name != null;
}
public boolean isIdentifier() {
return identifier != null;
}
public boolean isDate() {
return datevalue != null;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
private Date parseDateISO(String value) {
if (value == null) {
return null;
}
SimpleDateFormat sdf = new SimpleDateFormat();
sdf.applyPattern(ISO_FORMAT_SECONDS);
sdf.setTimeZone(tz);
sdf.setLenient(true);
try {
return sdf.parse(value);
} catch (ParseException pe) {
// skip
}
sdf.applyPattern(ISO_FORMAT_DAYS);
try {
return sdf.parse(value);
} catch (ParseException pe) {
return null;
}
}
private String formatDateISO(Date date) {
if (date == null) {
return null;
}
SimpleDateFormat sdf = new SimpleDateFormat();
sdf.applyPattern(ISO_FORMAT_SECONDS);
sdf.setTimeZone(tz);
return sdf.format(date);
}
@Override
public String toString() {
return longvalue != null ? Long.toString(longvalue)
: doublevalue != null ? Double.toString(doublevalue)
: datevalue != null ? formatDateISO(datevalue)
: value != null ? value.startsWith("\"") && value.endsWith("\"") ? value
: "\"" + value.replaceAll("\"", "\\\\\"") + "\""
: identifier != null ? identifier.toString()
: name != null ? name.toString()
: null;
}
}
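The hidden-date handling above can be exercised in isolation. A minimal standalone sketch (hypothetical class name `DateSketch`) mirroring `parseDateISO`/`formatDateISO`: try the full ISO seconds pattern first, fall back to a plain date, with both patterns pinned to GMT.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

// Minimal sketch (hypothetical class name) of Term's date detection:
// a value is treated as a date only if one of the two ISO patterns parses it.
class DateSketch {
    static final TimeZone GMT = TimeZone.getTimeZone("GMT");

    static Date parseDateISO(String value) {
        if (value == null) {
            return null;
        }
        for (String pattern : new String[]{"yyyy-MM-dd'T'HH:mm:ss'Z'", "yyyy-MM-dd"}) {
            SimpleDateFormat sdf = new SimpleDateFormat(pattern);
            sdf.setTimeZone(GMT);
            try {
                return sdf.parse(value);
            } catch (ParseException e) {
                // not this pattern, try the next one
            }
        }
        return null;
    }

    static String formatDateISO(Date date) {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
        sdf.setTimeZone(GMT);
        return sdf.format(date);
    }
}
```

A plain date round-trips to midnight GMT, which is why `Term.toString()` can always emit the seconds pattern.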

@@ -0,0 +1,38 @@
package org.xbib.cql;
/**
* CQL abstract syntax tree visitor.
*/
public interface Visitor {
void visit(SortedQuery node);
void visit(Query node);
void visit(PrefixAssignment node);
void visit(ScopedClause node);
void visit(BooleanGroup node);
void visit(SearchClause node);
void visit(Relation node);
void visit(Modifier node);
void visit(ModifierList node);
void visit(Term node);
void visit(Identifier node);
void visit(Index node);
void visit(SimpleName node);
void visit(SortSpec node);
void visit(SingleSpec node);
}

@@ -0,0 +1,349 @@
package org.xbib.cql.elasticsearch;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.util.DateUtil;
import org.xbib.cql.BooleanGroup;
import org.xbib.cql.BooleanOperator;
import org.xbib.cql.Comparitor;
import org.xbib.cql.Identifier;
import org.xbib.cql.Index;
import org.xbib.cql.ModifierList;
import org.xbib.cql.PrefixAssignment;
import org.xbib.cql.Query;
import org.xbib.cql.Relation;
import org.xbib.cql.ScopedClause;
import org.xbib.cql.SearchClause;
import org.xbib.cql.SimpleName;
import org.xbib.cql.SingleSpec;
import org.xbib.cql.SortSpec;
import org.xbib.cql.SortedQuery;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.Term;
import org.xbib.cql.Visitor;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import org.xbib.cql.elasticsearch.ast.TokenType;
import org.xbib.cql.elasticsearch.model.ElasticsearchQueryModel;
import java.io.IOException;
import java.util.Collection;
import java.util.Stack;
/**
* Generates an Elasticsearch filter query from a CQL abstract syntax tree.
*/
public class ElasticsearchFilterGenerator implements Visitor {
private final ElasticsearchQueryModel model;
private Stack<Node> stack;
private FilterGenerator filterGen;
public ElasticsearchFilterGenerator() {
this(new ElasticsearchQueryModel());
}
public ElasticsearchFilterGenerator(ElasticsearchQueryModel model) {
this.model = model;
this.stack = new Stack<>();
try {
this.filterGen = new FilterGenerator();
} catch (IOException e) {
// ignore
}
}
public void addOrFilter(String filterKey, Collection<String> filterValues) {
for (String value : filterValues) {
model.addDisjunctiveFilter(filterKey, new Expression(Operator.OR_FILTER, new Name(filterKey), new Token(value)), Operator.OR);
}
}
public void addAndFilter(String filterKey, Collection<String> filterValues) {
for (String value : filterValues) {
model.addConjunctiveFilter(filterKey, new Expression(Operator.AND_FILTER, new Name(filterKey), new Token(value)), Operator.AND);
}
}
public XContentBuilder getResult() throws IOException {
return filterGen.getResult();
}
@Override
public void visit(SortedQuery node) {
try {
filterGen.start();
node.getQuery().accept(this);
Node querynode = stack.pop();
if (querynode instanceof Token) {
filterGen.visit(new Expression(Operator.TERM_FILTER, new Name("cql.allIndexes"), querynode));
} else if (querynode instanceof Expression) {
filterGen.visit(new Expression(Operator.QUERY_FILTER, (Expression) querynode));
}
if (model.hasFilter()) {
filterGen.visit(model.getFilterExpression());
}
filterGen.end();
} catch (IOException e) {
throw new SyntaxException("unable to build a valid query from " + node + ", reason: " + e.getMessage(), e);
}
}
@Override
public void visit(SortSpec node) {
if (node.getSingleSpec() != null) {
node.getSingleSpec().accept(this);
}
if (node.getSortSpec() != null) {
node.getSortSpec().accept(this);
}
}
@Override
public void visit(SingleSpec node) {
if (node.getIndex() != null) {
node.getIndex().accept(this);
}
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
if (!stack.isEmpty()) {
model.setSort(stack);
}
}
@Override
public void visit(Query node) {
for (PrefixAssignment assignment : node.getPrefixAssignments()) {
assignment.accept(this);
}
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
}
@Override
public void visit(PrefixAssignment node) {
node.getPrefix().accept(this);
node.getURI().accept(this);
}
@Override
public void visit(ScopedClause node) {
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
node.getSearchClause().accept(this);
if (node.getBooleanGroup() != null) {
node.getBooleanGroup().accept(this);
}
// evaluate expression
if (!stack.isEmpty() && stack.peek() instanceof Operator) {
Operator op = (Operator) stack.pop();
if (!stack.isEmpty()) {
Node esnode = stack.pop();
// add default context if node is a literal without a context
if (esnode instanceof Token && TokenType.STRING.equals(esnode.getType())) {
esnode = new Expression(Operator.ALL, new Name("cql.allIndexes"), esnode);
}
if (stack.isEmpty()) {
// unary expression
throw new IllegalArgumentException("unary expression not allowed, op=" + op + " node=" + esnode);
} else {
// binary expression
Node esnode2 = stack.pop();
// add default context if node is a literal without context
if (esnode2 instanceof Token && TokenType.STRING.equals(esnode2.getType())) {
esnode2 = new Expression(Operator.ALL, new Name("cql.allIndexes"), esnode2);
}
esnode = new Expression(op, esnode2, esnode);
}
stack.push(esnode);
}
}
}
@Override
public void visit(SearchClause node) {
if (node.getQuery() != null) {
// CQL query in parenthesis
node.getQuery().accept(this);
}
if (node.getTerm() != null) {
node.getTerm().accept(this);
}
if (node.getIndex() != null) {
node.getIndex().accept(this);
}
if (node.getRelation() != null) {
node.getRelation().accept(this);
if (node.getRelation().getModifierList() != null && node.getIndex() != null) {
// stack layout: op, list of modifiers, modifiable index
Node op = stack.pop();
StringBuilder sb = new StringBuilder();
Node modifier = stack.pop();
while (modifier instanceof Modifier) {
if (sb.length() > 0) {
sb.append('.');
}
sb.append(modifier.toString());
modifier = stack.pop();
}
String modifiable = sb.toString();
stack.push(new Name(modifiable));
stack.push(op);
}
}
// evaluate expression
if (!stack.isEmpty() && stack.peek() instanceof Operator) {
Operator op = (Operator) stack.pop();
Node arg1 = stack.pop();
Node arg2 = stack.pop();
// fold two expressions if they have the same operator
boolean fold = arg1.isVisible() && arg2.isVisible()
&& arg2 instanceof Expression
&& ((Expression) arg2).getOperator().equals(op);
Expression expression = fold ? new Expression((Expression) arg2, arg1) : new Expression(op, arg1, arg2);
stack.push(expression);
}
}
@Override
public void visit(BooleanGroup node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
stack.push(booleanToES(node.getOperator()));
}
@Override
public void visit(Relation node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
stack.push(comparitorToES(node.getComparitor()));
}
@Override
public void visit(ModifierList node) {
for (org.xbib.cql.Modifier modifier : node.getModifierList()) {
modifier.accept(this);
}
}
@Override
public void visit(org.xbib.cql.Modifier node) {
Node term = null;
if (node.getTerm() != null) {
node.getTerm().accept(this);
term = stack.pop();
}
node.getName().accept(this);
Node name = stack.pop();
stack.push(new Modifier(name, term));
}
@Override
public void visit(Term node) {
stack.push(termToES(node));
}
@Override
public void visit(Identifier node) {
stack.push(new Name(node.getValue()));
}
@Override
public void visit(Index node) {
String context = node.getContext();
String name = context != null ? context + "." + node.getName() : node.getName();
Name esname = new Name(name, model.getVisibility(context));
esname.setType(model.getElasticsearchType(name));
stack.push(esname);
}
@Override
public void visit(SimpleName node) {
stack.push(new Name(node.getName()));
}
private Node termToES(Term node) {
if (node.isLong()) {
return new Token(Long.parseLong(node.getValue()));
} else if (node.isFloat()) {
return new Token(Double.parseDouble(node.getValue()));
} else if (node.isIdentifier()) {
return new Token(node.getValue());
} else if (node.isDate()) {
return new Token(DateUtil.parseDateISO(node.getValue()));
} else if (node.isString()) {
return new Token(node.getValue());
}
return null;
}
private Operator booleanToES(BooleanOperator bop) {
Operator op;
switch (bop) {
case AND:
op = Operator.AND;
break;
case OR:
op = Operator.OR;
break;
case NOT:
op = Operator.ANDNOT;
break;
case PROX:
op = Operator.PROX;
break;
default:
throw new IllegalArgumentException("unknown CQL operator: " + bop);
}
return op;
}
private Operator comparitorToES(Comparitor op) {
Operator esop;
switch (op) {
case EQUALS:
esop = Operator.EQUALS;
break;
case GREATER:
esop = Operator.RANGE_GREATER_THAN;
break;
case GREATER_EQUALS:
esop = Operator.RANGE_GREATER_OR_EQUAL;
break;
case LESS:
esop = Operator.RANGE_LESS_THAN;
break;
case LESS_EQUALS:
esop = Operator.RANGE_LESS_OR_EQUALS;
break;
case NOT_EQUALS:
esop = Operator.NOT_EQUALS;
break;
case WITHIN:
esop = Operator.RANGE_WITHIN;
break;
case ADJ:
esop = Operator.PHRASE;
break;
case ALL:
esop = Operator.ALL;
break;
case ANY:
esop = Operator.ANY;
break;
default:
throw new IllegalArgumentException("unknown CQL comparitor: " + op);
}
return esop;
}
}


@@ -0,0 +1,492 @@
package org.xbib.cql.elasticsearch;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.BooleanGroup;
import org.xbib.cql.BooleanOperator;
import org.xbib.cql.CQLParser;
import org.xbib.cql.Comparitor;
import org.xbib.cql.Identifier;
import org.xbib.cql.Index;
import org.xbib.cql.ModifierList;
import org.xbib.cql.PrefixAssignment;
import org.xbib.cql.Query;
import org.xbib.cql.Relation;
import org.xbib.cql.ScopedClause;
import org.xbib.cql.SearchClause;
import org.xbib.cql.SimpleName;
import org.xbib.cql.SingleSpec;
import org.xbib.cql.SortSpec;
import org.xbib.cql.SortedQuery;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.Term;
import org.xbib.cql.Visitor;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import org.xbib.cql.elasticsearch.ast.TokenType;
import org.xbib.cql.elasticsearch.model.ElasticsearchQueryModel;
import org.xbib.cql.util.DateUtil;
import java.io.IOException;
import java.util.Collection;
import java.util.Stack;
/**
 * Generates Elasticsearch Query DSL from a CQL abstract syntax tree.
*/
public class ElasticsearchQueryGenerator implements Visitor {
private ElasticsearchQueryModel model;
private ElasticsearchFilterGenerator filterGenerator;
private Stack<Node> stack;
private int from;
private int size;
private String boostField;
private String modifier;
private Float factor;
private String boostMode;
private SourceGenerator sourceGen;
private QueryGenerator queryGen;
private FilterGenerator filterGen;
private FacetsGenerator facetGen;
private XContentBuilder sort;
public ElasticsearchQueryGenerator() {
this.from = 0;
this.size = 10;
this.model = new ElasticsearchQueryModel();
this.filterGenerator = new ElasticsearchFilterGenerator(model);
this.stack = new Stack<>();
try {
this.sourceGen = new SourceGenerator();
this.queryGen = new QueryGenerator();
this.filterGen = new FilterGenerator();
this.facetGen = new FacetsGenerator();
} catch (IOException e) {
// ignore
}
}
public ElasticsearchQueryModel getModel() {
return model;
}
public ElasticsearchQueryGenerator setFrom(int from) {
this.from = from;
return this;
}
public ElasticsearchQueryGenerator setSize(int size) {
this.size = size;
return this;
}
public ElasticsearchQueryGenerator setSort(XContentBuilder sort) {
this.sort = sort;
return this;
}
public ElasticsearchQueryGenerator setBoostParams(String boostField, String modifier, Float factor, String boostMode) {
this.boostField = boostField;
this.modifier = modifier;
this.factor = factor;
this.boostMode = boostMode;
return this;
}
public ElasticsearchQueryGenerator filter(String filter) {
CQLParser parser = new CQLParser(filter);
parser.parse();
parser.getCQLQuery().accept(filterGenerator);
return this;
}
public ElasticsearchQueryGenerator andfilter(String filterKey, Collection<String> filterValues) {
filterGenerator.addAndFilter(filterKey, filterValues);
return this;
}
public ElasticsearchQueryGenerator orfilter(String filterKey, Collection<String> filterValues) {
filterGenerator.addOrFilter(filterKey, filterValues);
return this;
}
public ElasticsearchQueryGenerator facet(String facetLimit, String facetSort) {
try {
facetGen.facet(facetLimit, facetSort);
} catch (IOException e) {
// ignore
}
return this;
}
public String getQueryResult() {
return queryGen.getResult().string();
}
public String getFacetResult() {
try {
return facetGen.getResult().string();
} catch (IOException e) {
return e.getMessage();
}
}
public String getSourceResult() {
return sourceGen.getResult().string();
}
@Override
public void visit(SortedQuery node) {
try {
if (node.getSortSpec() != null) {
node.getSortSpec().accept(this);
}
queryGen.start();
node.getQuery().accept(this);
if (boostField != null) {
queryGen.startBoost(boostField, modifier, factor, boostMode);
}
if (model.hasFilter()) {
queryGen.startFiltered();
} else if (filterGenerator.getResult().bytes().length() > 0) {
queryGen.startFiltered();
}
Node querynode = stack.pop();
if (querynode instanceof Token) {
Token token = (Token) querynode;
querynode = ".".equals(token.getString()) ?
new Expression(Operator.MATCH_ALL) :
new Expression(Operator.EQUALS, new Name("cql.allIndexes"), querynode);
}
queryGen.visit((Expression) querynode);
if (model.hasFilter()) {
queryGen.end();
filterGen = new FilterGenerator(queryGen);
filterGen.startFilter();
filterGen.visit(model.getFilterExpression());
filterGen.endFilter();
queryGen.end();
} else if (filterGenerator.getResult().bytes().length() > 0) {
queryGen.end();
queryGen.getResult().rawField("filter", filterGenerator.getResult().bytes().toBytes());
queryGen.endFiltered();
}
if (boostField != null) {
queryGen.endBoost();
}
if (model.hasFacets()) {
facetGen = new FacetsGenerator();
facetGen.visit(model.getFacetExpression());
}
queryGen.end();
Expression sortnode = model.getSort();
SortGenerator sortGen = new SortGenerator();
if (sortnode != null) {
sortGen.start();
sortGen.visit(sortnode);
sortGen.end();
sort = sortGen.getResult();
}
sourceGen.build(queryGen, from, size, sort, facetGen.getResult());
} catch (IOException e) {
throw new SyntaxException("unable to build a valid query from " + node + ", reason: " + e.getMessage(), e);
}
}
@Override
public void visit(SortSpec node) {
if (node.getSingleSpec() != null) {
node.getSingleSpec().accept(this);
}
if (node.getSortSpec() != null) {
node.getSortSpec().accept(this);
}
}
@Override
public void visit(SingleSpec node) {
if (node.getIndex() != null) {
node.getIndex().accept(this);
}
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
if (!stack.isEmpty()) {
model.setSort(stack);
}
}
@Override
public void visit(Query node) {
for (PrefixAssignment assignment : node.getPrefixAssignments()) {
assignment.accept(this);
}
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
}
@Override
public void visit(PrefixAssignment node) {
node.getPrefix().accept(this);
node.getURI().accept(this);
}
@Override
public void visit(ScopedClause node) {
if (node.getScopedClause() != null) {
node.getScopedClause().accept(this);
}
node.getSearchClause().accept(this);
if (node.getBooleanGroup() != null) {
node.getBooleanGroup().accept(this);
}
// format disjunctive or conjunctive filters
if (node.getSearchClause().getIndex() != null
&& model.isFilterContext(node.getSearchClause().getIndex().getContext())) {
// assume that each operator-less filter is a conjunctive filter
BooleanOperator op = node.getBooleanGroup() != null
? node.getBooleanGroup().getOperator() : BooleanOperator.AND;
String filtername = node.getSearchClause().getIndex().getName();
Operator filterop = comparitorToES(node.getSearchClause().getRelation().getComparitor());
Node filterterm = termToESwithoutWildCard(node.getSearchClause().getTerm());
if (op == BooleanOperator.AND) {
model.addConjunctiveFilter(filtername, filterterm, filterop);
} else if (op == BooleanOperator.OR) {
model.addDisjunctiveFilter(filtername, filterterm, filterop);
}
}
// evaluate expression
if (!stack.isEmpty() && stack.peek() instanceof Operator) {
Operator op = (Operator) stack.pop();
if (!stack.isEmpty()) {
Node esnode = stack.pop();
// add default context if node is a literal without a context
if (esnode instanceof Token && TokenType.STRING.equals(esnode.getType())) {
esnode = new Expression(Operator.EQUALS, new Name("cql.allIndexes"), esnode);
}
if (stack.isEmpty()) {
// unary expression
throw new IllegalArgumentException("unary expression not allowed, op=" + op + " node=" + esnode);
} else {
// binary expression
Node esnode2 = stack.pop();
// add default context if node is a literal without context
if (esnode2 instanceof Token && TokenType.STRING.equals(esnode2.getType())) {
esnode2 = new Expression(Operator.EQUALS, new Name("cql.allIndexes"), esnode2);
}
esnode = new Expression(op, esnode2, esnode);
}
stack.push(esnode);
}
}
}
@Override
public void visit(SearchClause node) {
if (node.getQuery() != null) {
// CQL query in parenthesis
node.getQuery().accept(this);
}
if (node.getTerm() != null) {
node.getTerm().accept(this);
}
if (node.getIndex() != null) {
node.getIndex().accept(this);
String context = node.getIndex().getContext();
// format facets
if (model.isFacetContext(context)) {
model.addFacet(node.getIndex().getName(), node.getTerm().getValue());
}
}
if (node.getRelation() != null) {
node.getRelation().accept(this);
if (node.getRelation().getModifierList() != null && node.getIndex() != null) {
// stack layout: op, list of modifiers, modifiable index
Node op = stack.pop();
StringBuilder sb = new StringBuilder();
Node modifier = stack.pop();
while (modifier instanceof Modifier) {
if (sb.length() > 0) {
sb.append('.');
}
sb.append(modifier.toString());
modifier = stack.pop();
}
String modifiable = sb.toString();
stack.push(new Name(modifiable));
stack.push(op);
}
}
// evaluate expression
if (!stack.isEmpty() && stack.peek() instanceof Operator) {
Operator op = (Operator) stack.pop();
Node arg1 = stack.pop();
Node arg2 = stack.pop();
// fold two expressions if they have the same operator
boolean fold = arg1.isVisible() && arg2.isVisible()
&& arg2 instanceof Expression
&& ((Expression) arg2).getOperator().equals(op);
Expression expression = fold ? new Expression((Expression) arg2, arg1) : new Expression(op, arg1, arg2);
stack.push(expression);
}
}
@Override
public void visit(BooleanGroup node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
stack.push(booleanToES(node.getOperator()));
}
@Override
public void visit(Relation node) {
if (node.getModifierList() != null) {
node.getModifierList().accept(this);
}
stack.push(comparitorToES(node.getComparitor()));
}
@Override
public void visit(ModifierList node) {
for (org.xbib.cql.Modifier modifier : node.getModifierList()) {
modifier.accept(this);
}
}
@Override
public void visit(org.xbib.cql.Modifier node) {
Node term = null;
if (node.getTerm() != null) {
node.getTerm().accept(this);
term = stack.pop();
}
node.getName().accept(this);
Node name = stack.pop();
stack.push(new Modifier(name, term));
}
@Override
public void visit(Term node) {
stack.push(termToES(node));
}
@Override
public void visit(Identifier node) {
stack.push(new Name(node.getValue()));
}
@Override
public void visit(Index node) {
String context = node.getContext();
String name = context != null ? context + "." + node.getName() : node.getName();
Name esname = new Name(name, model.getVisibility(context));
esname.setType(model.getElasticsearchType(name));
stack.push(esname);
}
@Override
public void visit(SimpleName node) {
stack.push(new Name(node.getName()));
}
private Node termToES(Term node) {
if (node.isLong()) {
return new Token(Long.parseLong(node.getValue()));
} else if (node.isFloat()) {
return new Token(Double.parseDouble(node.getValue()));
} else if (node.isIdentifier()) {
return new Token(node.getValue());
} else if (node.isDate()) {
return new Token(DateUtil.parseDateISO(node.getValue()));
} else if (node.isString()) {
return new Token(node.getValue());
}
return null;
}
private Node termToESwithoutWildCard(Term node) {
return node.isString() || node.isIdentifier()
? new Token(node.getValue().replaceAll("\\*", ""))
: termToES(node);
}
private Operator booleanToES(BooleanOperator bop) {
Operator op;
switch (bop) {
case AND:
op = Operator.AND;
break;
case OR:
op = Operator.OR;
break;
case NOT:
op = Operator.ANDNOT;
break;
case PROX:
op = Operator.PROX;
break;
default:
throw new IllegalArgumentException("unknown CQL operator: " + bop);
}
return op;
}
private Operator comparitorToES(Comparitor op) {
Operator esop;
switch (op) {
case EQUALS:
esop = Operator.EQUALS;
break;
case GREATER:
esop = Operator.RANGE_GREATER_THAN;
break;
case GREATER_EQUALS:
esop = Operator.RANGE_GREATER_OR_EQUAL;
break;
case LESS:
esop = Operator.RANGE_LESS_THAN;
break;
case LESS_EQUALS:
esop = Operator.RANGE_LESS_OR_EQUALS;
break;
case NOT_EQUALS:
esop = Operator.NOT_EQUALS;
break;
case WITHIN:
esop = Operator.RANGE_WITHIN;
break;
case ADJ:
esop = Operator.PHRASE;
break;
case ALL:
esop = Operator.ALL;
break;
case ANY:
esop = Operator.ANY;
break;
default:
throw new IllegalArgumentException("unknown CQL comparitor: " + op);
}
return esop;
}
}
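For orientation, the EQUALS branch of the query generator (see the `simple_query_string` handling in `QueryGenerator` below) means that a CQL clause such as `dc.title = linux` is rendered roughly as the following fragment, later wrapped with `from`/`size` by `SourceGenerator`. The field name `dc.title` is only an illustrative example, not something fixed by the code:

```json
{
  "query": {
    "simple_query_string": {
      "query": "linux",
      "fields": ["dc.title"],
      "analyze_wildcard": true,
      "default_operator": "and"
    }
  }
}
```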


@@ -0,0 +1,177 @@
package org.xbib.cql.elasticsearch;
import static org.xbib.content.json.JsonXContent.contentBuilder;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
/**
 * Builds facets from the abstract syntax tree.
*/
public class FacetsGenerator implements Visitor {
private int facetlength = 10;
private final XContentBuilder builder;
public FacetsGenerator() throws IOException {
this.builder = contentBuilder();
}
public void start() throws IOException {
builder.startObject();
}
public void end() throws IOException {
builder.endObject();
}
public void startFacets() throws IOException {
builder.startObject("aggregations");
}
public void endFacets() throws IOException {
builder.endObject();
}
public XContentBuilder getResult() throws IOException {
return builder;
}
@Override
public void visit(Token node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Name node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Modifier node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Operator node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Expression node) {
try {
Operator op = node.getOperator();
switch (op) {
case TERMS_FACET: {
builder.startObject().field("myfacet", "myvalue")
.endObject();
break;
}
default:
throw new IllegalArgumentException(
"unable to translate operator while building elasticsearch facet: " + op);
}
} catch (IOException e) {
throw new SyntaxException("internal error while building elasticsearch query", e);
}
}
public FacetsGenerator facet(String facetLimit, String facetSort) throws IOException {
if (facetLimit == null) {
return this;
}
Map<String, Integer> facetMap = parseFacet(facetLimit);
String[] sortSpec = facetSort != null ? facetSort.split(",") : new String[]{"recordCount", "descending"};
String order = "_count";
String dir = "desc";
for (String s : sortSpec) {
switch (s) {
case "recordCount":
order = "_count";
break;
case "alphanumeric":
order = "_term";
break;
case "ascending":
dir = "asc";
break;
}
}
builder.startObject();
for (String index : facetMap.keySet()) {
if ("*".equals(index)) {
continue;
}
// TODO range aggregations etc.
String facetType = "terms";
Integer size = facetMap.get(index);
builder.field(index)
.startObject()
.field(facetType).startObject()
.field("field", index)
.field("size", size > 0 ? size : 10)
.startObject("order")
.field(order, dir)
.endObject()
.endObject();
builder.endObject();
}
builder.endObject();
return this;
}
private Map<String, Integer> parseFacet(String spec) {
Map<String, Integer> m = new HashMap<>();
m.put("*", facetlength);
if (spec == null || spec.length() == 0) {
return m;
}
String[] params = spec.split(",");
for (String param : params) {
int pos = param.indexOf(':');
if (pos > 0) {
int n = parseInt(param.substring(0, pos), facetlength);
m.put(param.substring(pos + 1), n);
} else if (param.length() > 0) {
int n = parseInt(param, facetlength);
m.put("*", n);
}
}
return m;
}
private int parseInt(String s, int defaultValue) {
try {
return Integer.parseInt(s);
} catch (NumberFormatException e) {
return defaultValue;
}
}
}
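The facet-limit spec consumed by `parseFacet` above is a comma-separated list of `count:field` entries, with a bare count setting the `*` default. A minimal standalone sketch of that parsing logic (class name `FacetSpecSketch` is hypothetical, introduced only for illustration):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Illustrative sketch of the "count:field" facet-limit spec parsing
 * used by FacetsGenerator#parseFacet. Not part of the library API.
 */
public class FacetSpecSketch {

    /** Parses e.g. "10:dc.format,5:dc.type" into {dc.format=10, dc.type=5, *=defaultLength}. */
    static Map<String, Integer> parse(String spec, int defaultLength) {
        Map<String, Integer> m = new HashMap<>();
        m.put("*", defaultLength);
        if (spec == null || spec.isEmpty()) {
            return m;
        }
        for (String param : spec.split(",")) {
            int pos = param.indexOf(':');
            if (pos > 0) {
                // "count:field" -> field mapped to count
                m.put(param.substring(pos + 1), parseInt(param.substring(0, pos), defaultLength));
            } else if (!param.isEmpty()) {
                // bare count overrides the "*" default
                m.put("*", parseInt(param, defaultLength));
            }
        }
        return m;
    }

    /** Falls back to a default when the count is not a valid integer. */
    static int parseInt(String s, int defaultValue) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            return defaultValue;
        }
    }
}
```

Note that a `*` entry is skipped when emitting aggregations in `facet(...)` above; it only carries the default size for fields without an explicit count.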


@@ -0,0 +1,338 @@
package org.xbib.cql.elasticsearch;
import static org.xbib.content.json.JsonXContent.contentBuilder;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import org.xbib.cql.util.QuotedStringTokenizer;
import java.io.IOException;
/**
 * Builds a query filter in Elasticsearch JSON syntax from the abstract syntax tree.
*/
public class FilterGenerator implements Visitor {
private XContentBuilder builder;
public FilterGenerator() throws IOException {
this.builder = contentBuilder();
}
public FilterGenerator(QueryGenerator queryGenerator) throws IOException {
this.builder = queryGenerator.getResult();
}
public FilterGenerator start() throws IOException {
builder.startObject();
return this;
}
public FilterGenerator end() throws IOException {
builder.endObject();
return this;
}
public FilterGenerator startFilter() throws IOException {
builder.startObject("filter");
return this;
}
public FilterGenerator endFilter() throws IOException {
builder.endObject();
return this;
}
public XContentBuilder getResult() throws IOException {
return builder;
}
@Override
public void visit(Token node) {
try {
builder.value(node.getString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Name node) {
try {
builder.field(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Modifier node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Operator node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Expression node) {
if (!node.isVisible()) {
return;
}
try {
Operator op = node.getOperator();
switch (op.getArity()) {
case 2: {
Node arg1 = node.getArg1();
Node arg2 = node.getArgs().length > 1 ? node.getArg2() : null;
boolean visible = false;
for (Node arg : node.getArgs()) {
visible = visible || arg.isVisible();
}
if (!visible) {
return;
}
Token tok2 = arg2 instanceof Token ? (Token) arg2 : null;
switch (op) {
case EQUALS: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
builder.startObject(tok2 != null && tok2.isBoundary() ? "prefix" : "term");
builder.field(field, value).endObject();
break;
}
case NOT_EQUALS: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
builder.startObject("not")
.startObject(tok2 != null && tok2.isBoundary() ? "prefix" : "term")
.field(field, value)
.endObject().endObject();
break;
}
case ALL: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
boolean phrase = arg2 instanceof Token && ((Token) arg2).isProtected();
if (phrase) {
builder.startArray("and");
QuotedStringTokenizer qst = new QuotedStringTokenizer(value);
while (qst.hasMoreTokens()) {
builder.startObject().startObject("term").field(field, qst.nextToken()).endObject().endObject();
}
builder.endArray();
} else {
builder.startObject(tok2 != null && tok2.isBoundary() ? "prefix" : "term")
.field(field, value)
.endObject();
}
break;
}
case ANY: {
boolean phrase = arg2 instanceof Token && ((Token) arg2).isProtected();
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
if (phrase) {
builder.startArray("or");
QuotedStringTokenizer qst = new QuotedStringTokenizer(value);
while (qst.hasMoreTokens()) {
builder.startObject().startObject("term")
.field(field, qst.nextToken()).endObject().endObject();
}
builder.endArray();
} else {
builder.startObject(tok2 != null && tok2.isBoundary() ? "prefix" : "term")
.field(field, value)
.endObject();
}
break;
}
case RANGE_GREATER_THAN: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
builder.startObject("range").startObject(field)
.field("from", value)
.field("include_lower", false)
.endObject().endObject();
break;
}
case RANGE_GREATER_OR_EQUAL: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("from", value)
.field("include_lower", true)
.endObject().endObject();
break;
}
case RANGE_LESS_THAN: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("to", value)
.field("include_upper", false)
.endObject().endObject();
break;
}
case RANGE_LESS_OR_EQUALS: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("to", value)
.field("include_upper", true)
.endObject().endObject();
break;
}
case RANGE_WITHIN: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
String[] s = value.split(" ");
builder.startObject("range").startObject(field)
.field("from", s[0])
.field("to", s[1])
.field("include_lower", true)
.field("include_upper", true)
.endObject().endObject();
break;
}
case AND: {
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
builder.startArray("must");
Node[] args = node.getArgs();
for (int i = 0; i < node.getArgs().length; i++) {
if (args[i].isVisible()) {
builder.startObject();
args[i].accept(this);
builder.endObject();
}
}
builder.endArray();
builder.endObject();
}
break;
}
case OR: {
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
builder.startArray("should");
Node[] args = node.getArgs();
for (int i = 0; i < node.getArgs().length; i++) {
if (args[i].isVisible()) {
builder.startObject();
args[i].accept(this);
builder.endObject();
}
}
builder.endArray();
builder.endObject();
}
break;
}
case OR_FILTER: {
builder.startObject("bool");
builder.startArray("should");
Node[] args = node.getArgs();
for (int i = 0; i < args.length; i += 2) {
if (args[i].isVisible()) {
builder.startObject().startObject("term");
args[i].accept(this);
args[i + 1].accept(this);
builder.endObject().endObject();
}
}
builder.endArray();
builder.endObject();
break;
}
case AND_FILTER: {
builder.startObject("bool");
builder.startArray("must");
Node[] args = node.getArgs();
for (int i = 0; i < args.length; i += 2) {
if (args[i].isVisible()) {
builder.startObject().startObject("term");
args[i].accept(this);
args[i + 1].accept(this);
builder.endObject().endObject();
}
}
builder.endArray();
builder.endObject();
break;
}
case ANDNOT: {
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
builder.startArray("must_not");
Node[] args = node.getArgs();
for (int i = 0; i < node.getArgs().length; i++) {
if (args[i].isVisible()) {
builder.startObject();
args[i].accept(this);
builder.endObject();
}
}
builder.endArray();
builder.endObject();
}
break;
}
case PROX: {
String field = arg1.toString();
// we assume a default of 10 words is enough for proximity
String value = arg2 != null ? arg2.toString() + "~10" : "";
builder.startObject("field").field(field, value).endObject();
break;
}
case TERM_FILTER: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("term").field(field, value).endObject();
break;
}
case QUERY_FILTER: {
builder.startObject("query");
arg1.accept(this);
builder.endObject();
break;
}
default:
throw new IllegalArgumentException("unable to translate operator while building elasticsearch query filter: " + op);
}
break;
}
}
} catch (IOException e) {
throw new SyntaxException("internal error while building elasticsearch query filter", e);
}
}
}
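As a concrete illustration of the RANGE_WITHIN branch above: a CQL filter clause like `dc.date within "2000 2005"` (the field name is an assumed example) splits the quoted term on the space and would emit an inclusive range filter of roughly this shape:

```json
{
  "range": {
    "dc.date": {
      "from": "2000",
      "to": "2005",
      "include_lower": true,
      "include_upper": true
    }
  }
}
```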


@@ -0,0 +1,381 @@
package org.xbib.cql.elasticsearch;
import static org.xbib.content.json.JsonXContent.contentBuilder;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import java.io.IOException;
/**
 * Builds an Elasticsearch query from the abstract syntax tree.
*/
public class QueryGenerator implements Visitor {
private final XContentBuilder builder;
public QueryGenerator() throws IOException {
this.builder = contentBuilder();
}
public void start() throws IOException {
builder.startObject();
}
public void end() throws IOException {
builder.endObject();
}
public void startFiltered() throws IOException {
builder.startObject("filtered").startObject("query");
}
public void endFiltered() throws IOException {
builder.endObject();
}
public void startBoost(String boostField, String modifier, Float factor, String boostMode) throws IOException {
builder.startObject("function_score")
.startObject("field_value_factor")
.field("field", boostField)
.field("modifier", modifier != null ? modifier : "log1p")
.field("factor", factor != null ? factor : 1.0f)
.endObject()
.field("boost_mode", boostMode != null ? boostMode : "multiply")
.startObject("query");
}
public void endBoost() throws IOException {
builder.endObject().endObject();
}
public XContentBuilder getResult() {
return builder;
}
@Override
public void visit(Token token) {
try {
switch (token.getType()) {
case BOOL:
builder.value(token.getBoolean());
break;
case INT:
builder.value(token.getInteger());
break;
case FLOAT:
builder.value(token.getFloat());
break;
case DATETIME:
builder.value(token.getDate());
break;
case STRING:
builder.value(token.getString());
break;
default:
throw new IOException("unknown token type: " + token);
}
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Name node) {
try {
builder.field(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Modifier node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Operator node) {
try {
builder.value(node.toString());
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Expression node) {
if (!node.isVisible()) {
return;
}
try {
Operator op = node.getOperator();
switch (op.getArity()) {
case 0: {
switch (op) {
case MATCH_ALL: {
builder.startObject("match_all").endObject();
break;
}
}
break;
}
case 1: {
// unary operators, anyone?
break;
}
case 2: {
// binary operators
Node arg1 = node.getArg1();
Node arg2 = node.getArgs().length > 1 ? node.getArg2() : null;
Token tok2 = arg2 instanceof Token ? (Token) arg2 : null;
boolean visible = false;
for (Node arg : node.getArgs()) {
visible = visible || arg.isVisible();
}
if (!visible) {
return;
}
switch (op) {
case EQUALS: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("simple_query_string")
.field("query", value)
.field("fields", new String[]{field})
.field("analyze_wildcard", true)
.field("default_operator", "and")
.endObject();
break;
}
case NOT_EQUALS: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("bool").startObject("must_not");
builder.startObject("simple_query_string")
.field("query", value)
.field("fields", new String[]{field})
.field("analyze_wildcard", true)
.field("default_operator", "and")
.endObject();
builder.endObject().endObject();
break;
}
case ALL: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
builder.startObject("simple_query_string")
.field("query", value)
.field("fields", new String[]{field})
.field("analyze_wildcard", true)
.field("default_operator", "and")
.endObject();
break;
}
case ANY: {
String field = arg1.toString();
String value = tok2 != null ? tok2.getString() : "";
builder.startObject("simple_query_string")
.field("query", value)
.field("fields", new String[]{field})
.field("analyze_wildcard", true)
.field("default_operator", "or")
.endObject();
break;
}
case PHRASE: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
if (tok2 != null) {
if (tok2.isProtected()) {
builder.startObject("match_phrase")
.startObject(field)
.field("query", tok2.getString())
.field("slop", 0)
.endObject()
.endObject();
} else if (tok2.isAll()) {
builder.startObject("match_all").endObject();
} else if (tok2.isWildcard()) {
builder.startObject("wildcard").field(field, value).endObject();
} else if (tok2.isBoundary()) {
builder.startObject("prefix").field(field, value).endObject();
} else {
builder.startObject("match_phrase")
.startObject(field)
.field("query", value)
.field("slop", 0)
.endObject()
.endObject();
}
}
break;
}
case RANGE_GREATER_THAN: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("from", value)
.field("include_lower", false)
.endObject().endObject();
break;
}
case RANGE_GREATER_OR_EQUAL: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("from", value)
.field("include_lower", true)
.endObject().endObject();
break;
}
case RANGE_LESS_THAN: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("to", value)
.field("include_upper", false)
.endObject().endObject();
break;
}
case RANGE_LESS_OR_EQUALS: {
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
builder.startObject("range").startObject(field)
.field("to", value)
.field("include_upper", true)
.endObject().endObject();
break;
}
case RANGE_WITHIN: {
// borders are inclusive
String field = arg1.toString();
String value = arg2 != null ? arg2.toString() : "";
String from = null;
String to = null;
if (tok2 != null) {
if (!tok2.isProtected()) {
throw new IllegalArgumentException("range within: unable to derive range from a non-phrase: " + value);
}
if (tok2.getStringList().size() != 2) {
throw new IllegalArgumentException("range within: unable to derive range from a phrase of length not equals to 2: " + tok2.getStringList());
}
from = tok2.getStringList().get(0);
to = tok2.getStringList().get(1);
}
builder.startObject("range").startObject(field)
.field("from", from)
.field("to", to)
.field("include_lower", true)
.field("include_upper", true)
.endObject().endObject();
break;
}
case AND: {
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
if (arg1.isVisible() && arg2.isVisible()) {
builder.startArray("must").startObject();
arg1.accept(this);
builder.endObject().startObject();
arg2.accept(this);
builder.endObject().endArray();
} else if (arg1.isVisible()) {
builder.startObject("must");
arg1.accept(this);
builder.endObject();
} else if (arg2.isVisible()) {
builder.startObject("must");
arg2.accept(this);
builder.endObject();
}
builder.endObject();
}
break;
}
case OR: {
// short expression
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
if (arg1.isVisible() && arg2.isVisible()) {
builder.startArray("should").startObject();
arg1.accept(this);
builder.endObject().startObject();
arg2.accept(this);
builder.endObject().endArray();
} else if (arg1.isVisible()) {
builder.startObject("should");
arg1.accept(this);
builder.endObject();
} else if (arg2.isVisible()) {
builder.startObject("should");
arg2.accept(this);
builder.endObject();
}
builder.endObject();
}
break;
}
case ANDNOT: {
if (arg2 == null) {
if (arg1.isVisible()) {
arg1.accept(this);
}
} else {
builder.startObject("bool");
if (arg1.isVisible() && arg2.isVisible()) {
builder.startArray("must_not").startObject();
arg1.accept(this);
builder.endObject().startObject();
arg2.accept(this);
builder.endObject().endArray();
} else if (arg1.isVisible()) {
builder.startObject("must_not");
arg1.accept(this);
builder.endObject();
} else if (arg2.isVisible()) {
builder.startObject("must_not");
arg2.accept(this);
builder.endObject();
}
builder.endObject();
}
break;
}
case PROX: {
String field = arg1.toString();
// we assume a default of 10 words is enough for proximity
String value = arg2 != null ? arg2.toString() + "~10" : "";
builder.startObject("field").field(field, value).endObject();
break;
}
default:
throw new IllegalArgumentException("unable to translate operator while building elasticsearch query: " + op);
}
break;
}
}
} catch (IOException e) {
throw new SyntaxException("internal error while building elasticsearch query", e);
}
}
}
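The range branches above all emit the same clause shape. As a minimal, self-contained sketch (the helper class and method below are hypothetical, not part of this library), this is the JSON a lower-bound clause such as `RANGE_GREATER_THAN` produces:

```java
public class RangeShapeSketch {

    // Hypothetical helper mirroring the range branches above:
    // "from" pairs with "include_lower", "to" pairs with "include_upper".
    static String range(String field, String bound, String value, boolean inclusive) {
        String key = "from".equals(bound) ? "include_lower" : "include_upper";
        return "{\"range\":{\"" + field + "\":{\"" + bound + "\":\"" + value
                + "\",\"" + key + "\":" + inclusive + "}}}";
    }

    public static void main(String[] args) {
        // e.g. RANGE_GREATER_THAN on field "date" with value "2000"
        System.out.println(range("date", "from", "2000", false));
    }
}
```

The only differences between the four bound operators are which bound key is written and whether the matching `include_*` flag is true.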

@ -0,0 +1,109 @@
package org.xbib.cql.elasticsearch;
import static org.xbib.content.json.JsonXContent.contentBuilder;
import org.xbib.content.XContentBuilder;
import org.xbib.cql.SyntaxException;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import java.io.IOException;
import java.util.Stack;
/**
* Builds a sort structure in Elasticsearch JSON syntax from the abstract syntax tree.
*/
public class SortGenerator implements Visitor {
private final XContentBuilder builder;
private final Stack<Modifier> modifiers;
public SortGenerator() throws IOException {
this.builder = contentBuilder();
this.modifiers = new Stack<>();
}
public void start() throws IOException {
builder.startArray();
}
public void end() throws IOException {
builder.endArray();
}
public XContentBuilder getResult() {
return builder;
}
@Override
public void visit(Token node) {
}
@Override
public void visit(Name node) {
try {
if (modifiers.isEmpty()) {
builder.startObject()
.field(node.getName())
.startObject()
.field("unmapped_type", "string")
.field("missing", "_last")
.endObject()
.endObject();
} else {
builder.startObject().field(node.getName()).startObject();
while (!modifiers.isEmpty()) {
Modifier mod = modifiers.pop();
String s = mod.getName().toString();
switch (s) {
case "ascending":
case "sort.ascending": {
builder.field("order", "asc");
break;
}
case "descending":
case "sort.descending": {
builder.field("order", "desc");
break;
}
default: {
builder.field(s, mod.getTerm());
break;
}
}
}
builder.field("unmapped_type", "string");
builder.field("missing", "_last");
builder.endObject();
builder.endObject();
}
} catch (IOException e) {
throw new SyntaxException(e.getMessage(), e);
}
}
@Override
public void visit(Modifier node) {
modifiers.push(node);
}
@Override
public void visit(Operator node) {
}
@Override
public void visit(Expression node) {
Operator op = node.getOperator();
if (op == Operator.SORT) {
for (Node arg : node.getArgs()) {
arg.accept(this);
}
}
}
}
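For a field with a `sort.descending` modifier, the visitor above emits one entry of the sort array. A sketch of that shape via plain string assembly (illustrative only, not the XContentBuilder API):

```java
public class SortShapeSketch {

    // Hypothetical illustration of a single sort entry as emitted above:
    // order from the modifier, plus the fixed unmapped_type/missing fields.
    static String sortEntry(String field, String order) {
        return "[{\"" + field + "\":{\"order\":\"" + order
                + "\",\"unmapped_type\":\"string\",\"missing\":\"_last\"}}]";
    }

    public static void main(String[] args) {
        System.out.println(sortEntry("date", "desc"));
    }
}
```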

@ -0,0 +1,43 @@
package org.xbib.cql.elasticsearch;
import static org.xbib.content.json.JsonXContent.contentBuilder;
import org.xbib.content.XContentBuilder;
import java.io.IOException;
/**
* Generates the source body of an Elasticsearch search request
* from a query, an optional sort, and optional aggregations.
*/
public class SourceGenerator {
private final XContentBuilder builder;
public SourceGenerator() throws IOException {
this.builder = contentBuilder();
}
public void build(QueryGenerator query,
int from, int size) throws IOException {
build(query, from, size, null, null);
}
public void build(QueryGenerator query, int from, int size, XContentBuilder sort, XContentBuilder facets) throws IOException {
builder.startObject();
builder.field("from", from);
builder.field("size", size);
builder.rawField("query", query.getResult().bytes().toBytes());
if (sort != null && sort.bytes().length() > 0) {
builder.rawField("sort", sort.bytes().toBytes());
}
if (facets != null && facets.bytes().length() > 0) {
builder.rawField("aggregations", facets.bytes().toBytes());
}
builder.endObject();
builder.close();
}
public XContentBuilder getResult() {
return builder;
}
}
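The envelope assembled by `build` above can be sketched as plain string concatenation (hypothetical helper, shown without the optional sort and aggregations fields):

```java
public class SourceShapeSketch {

    // Hypothetical sketch of the request body SourceGenerator assembles:
    // pagination first, then the raw query object.
    static String source(int from, int size, String query) {
        return "{\"from\":" + from + ",\"size\":" + size + ",\"query\":" + query + "}";
    }

    public static void main(String[] args) {
        System.out.println(source(0, 10, "{\"match_all\":{}}"));
    }
}
```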

@ -0,0 +1,24 @@
package org.xbib.cql.elasticsearch;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Modifier;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
/**
* Visitor for nodes of the Elasticsearch abstract syntax tree.
*/
public interface Visitor {
void visit(Token node);
void visit(Name node);
void visit(Modifier node);
void visit(Operator node);
void visit(Expression node);
}

@ -0,0 +1,110 @@
package org.xbib.cql.elasticsearch.ast;
import org.xbib.cql.elasticsearch.Visitor;
/**
* An Elasticsearch expression node.
*/
public class Expression implements Node {
private Operator op;
private Node[] args;
private TokenType type;
private boolean visible;
/**
* Constructor for folding nodes.
*
* @param expr the expression
* @param arg the new argument
*/
public Expression(Expression expr, Node arg) {
this.type = TokenType.EXPRESSION;
this.op = expr.getOperator();
if (arg instanceof Expression) {
Expression expr2 = (Expression) arg;
this.args = new Node[expr.getArgs().length + expr2.getArgs().length];
System.arraycopy(expr.getArgs(), 0, this.args, 0, expr.getArgs().length);
System.arraycopy(expr2.getArgs(), 0, this.args, expr.getArgs().length, expr2.getArgs().length);
} else {
Node[] exprargs = expr.getArgs();
this.args = new Node[exprargs.length + 1];
// to avoid copy, organization of the argument list is reverse, the most recent arg is at position 0
this.args[0] = arg;
System.arraycopy(exprargs, 0, this.args, 1, exprargs.length);
}
this.visible = false;
for (Node node : args) {
if (node instanceof Name || node instanceof Expression) {
this.visible = visible || node.isVisible();
}
}
}
public Expression(Operator op, Node... args) {
this.op = op;
this.type = TokenType.EXPRESSION;
this.args = args;
if (args != null && args.length > 0) {
this.visible = false;
for (Node arg : args) {
if (arg instanceof Name || arg instanceof Expression) {
this.visible = visible || arg.isVisible();
}
}
} else {
this.visible = true;
}
}
public Operator getOperator() {
return op;
}
public Node[] getArgs() {
return args;
}
public Node getArg1() {
return args[0];
}
public Node getArg2() {
return args[1];
}
@Override
public boolean isVisible() {
return visible;
}
@Override
public TokenType getType() {
return type;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
if (!visible) {
return "";
}
StringBuilder sb = new StringBuilder(op.toString());
sb.append('(');
for (int i = 0; i < args.length; i++) {
sb.append(args[i]);
if (i < args.length - 1) {
sb.append(',');
}
}
sb.append(')');
return sb.toString();
}
}
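The folding constructor above stores the newest argument at index 0 and shifts the existing arguments right, avoiding a copy of the old array into a temporary. A self-contained sketch of that behavior (hypothetical class, using strings in place of `Node`):

```java
public class FoldSketch {

    // Mirrors the folding constructor: the most recent argument lands at
    // position 0, the previous arguments follow in their old order.
    static String[] fold(String[] existing, String newest) {
        String[] out = new String[existing.length + 1];
        out[0] = newest;
        System.arraycopy(existing, 0, out, 1, existing.length);
        return out;
    }

    public static void main(String[] args) {
        System.out.println(String.join(",", fold(new String[]{"a", "b"}, "c"))); // c,a,b
    }
}
```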

@ -0,0 +1,49 @@
package org.xbib.cql.elasticsearch.ast;
import org.xbib.cql.elasticsearch.Visitor;
/**
* A modifier node for the Elasticsearch query language.
*/
public class Modifier implements Node {
private Node name;
private Node term;
public Modifier(Node name, Node term) {
this.name = name;
this.term = term;
}
public Modifier(Node name) {
this.name = name;
}
public Node getName() {
return name;
}
public Node getTerm() {
return term;
}
@Override
public boolean isVisible() {
return true;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public TokenType getType() {
return TokenType.OPERATOR;
}
@Override
public String toString() {
return name + "=" + term;
}
}

@ -0,0 +1,52 @@
package org.xbib.cql.elasticsearch.ast;
import org.xbib.cql.elasticsearch.Visitor;
/**
* A name node for Elasticsearch fields.
*/
public class Name implements Node {
private String name;
private TokenType type;
private boolean visible;
public Name(String name) {
this(name, true);
}
public Name(String name, boolean visible) {
this.name = name;
this.visible = visible;
}
public String getName() {
return name;
}
public void setType(TokenType type) {
this.type = type;
}
@Override
public TokenType getType() {
return type;
}
@Override
public boolean isVisible() {
return visible;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return name;
}
}

@ -0,0 +1,16 @@
package org.xbib.cql.elasticsearch.ast;
import org.xbib.cql.elasticsearch.Visitor;
/**
* Base interface for nodes of the Elasticsearch query language abstract syntax tree.
*/
public interface Node {
void accept(Visitor visitor);
boolean isVisible();
TokenType getType();
}

@ -0,0 +1,62 @@
package org.xbib.cql.elasticsearch.ast;
import org.xbib.cql.elasticsearch.Visitor;
/**
* Elasticsearch operators
*/
public enum Operator implements Node {
EQUALS(2),
NOT_EQUALS(2),
RANGE_LESS_THAN(2),
RANGE_LESS_OR_EQUALS(2),
RANGE_GREATER_THAN(2),
RANGE_GREATER_OR_EQUAL(2),
RANGE_WITHIN(2),
AND(2),
ANDNOT(2),
OR(2),
PROX(2),
ALL(2),
ANY(2),
PHRASE(2),
TERM_FILTER(2),
QUERY_FILTER(2),
SORT(0),
TERMS_FACET(0),
OR_FILTER(2),
AND_FILTER(2),
MATCH_ALL(0);
private final int arity;
Operator(int arity) {
this.arity = arity;
}
@Override
public boolean isVisible() {
return true;
}
@Override
public TokenType getType() {
return TokenType.OPERATOR;
}
public int getArity() {
return arity;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
return this.name();
}
}

@ -0,0 +1,213 @@
package org.xbib.cql.elasticsearch.ast;
import static java.util.stream.Collectors.toList;
import org.xbib.cql.elasticsearch.Visitor;
import org.xbib.cql.util.DateUtil;
import org.xbib.cql.util.QuotedStringTokenizer;
import org.xbib.cql.util.UnterminatedQuotedStringException;
import java.util.Collections;
import java.util.Date;
import java.util.EnumSet;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;
/**
* Elasticsearch query tokens.
*/
public class Token implements Node {
public enum TokenClass {
NORMAL, ALL, WILDCARD, BOUNDARY, PROTECTED
}
private TokenType type;
private String value;
private String stringvalue;
private Boolean booleanvalue;
private Long longvalue;
private Double doublevalue;
private Date datevalue;
private List<String> values;
private final EnumSet<TokenClass> tokenClass;
public Token(String value) {
this.value = value;
this.tokenClass = EnumSet.of(TokenClass.NORMAL);
this.type = TokenType.STRING;
// if this string is equal to true/false or on/off or yes/no, convert silently to bool
if (value.equals("true") || value.equals("yes") || value.equals("on")) {
this.booleanvalue = true;
this.value = null;
this.type = TokenType.BOOL;
} else if (value.equals("false") || value.equals("no") || value.equals("off")) {
this.booleanvalue = false;
this.value = null;
this.type = TokenType.BOOL;
}
if (this.value != null) {
// protected?
if (value.startsWith("\"") && value.endsWith("\"")) {
this.stringvalue = value;
this.value = value.substring(1, value.length() - 1).replaceAll("\\\\\"", "\"");
this.values = parseQuot(this.value);
tokenClass.add(TokenClass.PROTECTED);
}
// wildcard?
if (this.value.indexOf('*') >= 0 || this.value.indexOf('?') >= 0) {
tokenClass.add(TokenClass.WILDCARD);
// all?
if (this.value.length() == 1) {
tokenClass.add(TokenClass.ALL);
}
}
// prefix?
if (this.value.length() > 0 && this.value.charAt(0) == '^') {
tokenClass.add(TokenClass.BOUNDARY);
this.value = this.value.substring(1);
}
}
}
public Token(Boolean value) {
this.booleanvalue = value;
this.type = TokenType.BOOL;
this.tokenClass = EnumSet.of(TokenClass.NORMAL);
}
public Token(Long value) {
this.longvalue = value;
this.type = TokenType.INT;
this.tokenClass = EnumSet.of(TokenClass.NORMAL);
}
public Token(Double value) {
this.doublevalue = value;
this.type = TokenType.FLOAT;
this.tokenClass = EnumSet.of(TokenClass.NORMAL);
}
public Token(Date value) {
this.datevalue = value;
// this will enforce dates to get formatted as long values (years)
this.longvalue = Long.parseLong(DateUtil.formatDate(datevalue, "yyyy"));
this.type = TokenType.DATETIME;
this.tokenClass = EnumSet.of(TokenClass.NORMAL);
}
/**
* Same as toString(), but ignore stringvalue.
*/
public String getString() {
StringBuilder sb = new StringBuilder();
if (booleanvalue != null) {
sb.append(booleanvalue);
} else if (longvalue != null) {
sb.append(longvalue);
} else if (doublevalue != null) {
sb.append(doublevalue);
} else if (datevalue != null) {
sb.append(DateUtil.formatDateISO(datevalue));
} else if (value != null) {
sb.append(value);
}
return sb.toString();
}
public Boolean getBoolean() {
return booleanvalue;
}
public Long getInteger() {
return longvalue;
}
public Double getFloat() {
return doublevalue;
}
public Date getDate() {
return datevalue;
}
public List<String> getStringList() {
return values;
}
@Override
public TokenType getType() {
return type;
}
@Override
public boolean isVisible() {
return true;
}
@Override
public void accept(Visitor visitor) {
visitor.visit(this);
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
if (booleanvalue != null) {
sb.append(booleanvalue);
} else if (longvalue != null) {
sb.append(longvalue);
} else if (doublevalue != null) {
sb.append(doublevalue);
} else if (datevalue != null) {
sb.append(DateUtil.formatDateISO(datevalue));
} else if (stringvalue != null) {
sb.append(stringvalue);
} else if (value != null) {
sb.append(value);
}
return sb.toString();
}
public boolean isProtected() {
return tokenClass.contains(TokenClass.PROTECTED);
}
public boolean isBoundary() {
return tokenClass.contains(TokenClass.BOUNDARY);
}
public boolean isWildcard() {
return tokenClass.contains(TokenClass.WILDCARD);
}
public boolean isAll() {
return tokenClass.contains(TokenClass.ALL);
}
private static final Pattern WORD = Pattern.compile("[\\P{IsWord}]");
private List<String> parseQuot(String s) {
try {
QuotedStringTokenizer qst = new QuotedStringTokenizer(s, " \t\n\r\f", "\"", '\\', false);
Iterable<String> iterable = () -> qst;
Stream<String> stream = StreamSupport.stream(iterable.spliterator(), false);
return stream.filter(str -> !WORD.matcher(str).matches()).collect(toList());
} catch (UnterminatedQuotedStringException e) {
return Collections.singletonList(s);
}
}
}
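The constructor above classifies a string token by inspecting it: surrounding quotes mark it protected, `*` or `?` mark it a wildcard (and a lone wildcard matches all), and a leading `^` marks a boundary. A self-contained sketch of that classification (hypothetical class, same order of checks):

```java
import java.util.EnumSet;

public class TokenClassSketch {

    enum C { NORMAL, ALL, WILDCARD, BOUNDARY, PROTECTED }

    // Mirrors the classification in Token's string constructor.
    static EnumSet<C> classify(String v) {
        EnumSet<C> set = EnumSet.of(C.NORMAL);
        if (v.startsWith("\"") && v.endsWith("\"")) {
            set.add(C.PROTECTED);
            v = v.substring(1, v.length() - 1);      // strip the quotes
        }
        if (v.indexOf('*') >= 0 || v.indexOf('?') >= 0) {
            set.add(C.WILDCARD);
            if (v.length() == 1) {
                set.add(C.ALL);                      // a lone wildcard matches all
            }
        }
        if (!v.isEmpty() && v.charAt(0) == '^') {
            set.add(C.BOUNDARY);
        }
        return set;
    }

    public static void main(String[] args) {
        System.out.println(classify("*"));            // [NORMAL, ALL, WILDCARD]
        System.out.println(classify("\"two words\"")); // [NORMAL, PROTECTED]
    }
}
```

These classes are what the query generator's `isAll()`, `isWildcard()`, `isBoundary()`, and `isProtected()` checks dispatch on when choosing between `match_all`, `wildcard`, `prefix`, and `match_phrase` clauses.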

@ -0,0 +1,9 @@
package org.xbib.cql.elasticsearch.ast;
/**
* Elasticsearch query language token types.
*/
public enum TokenType {
STRING, BOOL, INT, FLOAT, DATETIME, NAME, OPERATOR, EXPRESSION
}

@ -0,0 +1,4 @@
/**
* Classes for abstract syntax tree construction for Elasticsearch query generation.
*/
package org.xbib.cql.elasticsearch.ast;

@ -0,0 +1,93 @@
package org.xbib.cql.elasticsearch.model;
import org.xbib.cql.QueryFacet;
/**
* Elasticsearch facet.
*
* @param <V> parameter type
*/
public final class ElasticsearchFacet<V> implements QueryFacet<V>, Comparable<ElasticsearchFacet<V>> {
public enum Type {
TERMS,
RANGE,
HISTOGRAM,
DATEHISTOGRAM,
FILTER,
QUERY,
STATISTICAL,
TERMS_STATS,
GEO_DISTANCE
}
public static final int DEFAULT_FACET_SIZE = 10;
private Type type;
private String name;
private V value;
private int size;
public ElasticsearchFacet(Type type, String name, V value) {
this(type, name, value, DEFAULT_FACET_SIZE);
}
public ElasticsearchFacet(Type type, String name, V value, int size) {
this.type = type;
this.name = name;
this.value = value;
this.size = size;
}
@Override
public void setName(String name) {
this.name = name;
}
@Override
public String getName() {
return name;
}
public void setType(Type type) {
this.type = type;
}
public Type getType() {
return type;
}
@Override
public void setValue(V value) {
this.value = value;
}
@Override
public V getValue() {
return value;
}
@Override
public int getSize() {
return size;
}
@Override
public String getFilterName() {
return name;
}
@Override
public int compareTo(ElasticsearchFacet<V> o) {
return name.compareTo(o.getName());
}
@Override
public String toString() {
return "facet [name=" + name + ",value=" + value + ",size=" + size + "]";
}
}

@ -0,0 +1,53 @@
package org.xbib.cql.elasticsearch.model;
import org.xbib.cql.QueryFilter;
import org.xbib.cql.elasticsearch.ast.Operator;
/**
* Elasticsearch filter.
* @param <V> parameter type
*/
public class ElasticsearchFilter<V> implements QueryFilter<V>, Comparable<ElasticsearchFilter<V>> {
private String name;
private V value;
private Operator op;
public ElasticsearchFilter(String name, V value, Operator op) {
this.name = name;
this.op = op;
this.value = value;
}
public void setName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public void setValue(V value) {
this.value = value;
}
public V getValue() {
return value;
}
public Operator getFilterOperation() {
return op;
}
@Override
public int compareTo(ElasticsearchFilter<V> o) {
return toString().compareTo(o.toString());
}
@Override
public String toString() {
return name + " " + op + " " + value;
}
}

@ -0,0 +1,200 @@
package org.xbib.cql.elasticsearch.model;
import org.xbib.cql.elasticsearch.ast.Expression;
import org.xbib.cql.elasticsearch.ast.Name;
import org.xbib.cql.elasticsearch.ast.Node;
import org.xbib.cql.elasticsearch.ast.Operator;
import org.xbib.cql.elasticsearch.ast.Token;
import org.xbib.cql.elasticsearch.ast.TokenType;
import org.xbib.cql.model.CQLQueryModel;
import java.util.HashMap;
import java.util.Map;
import java.util.Stack;
/**
* Elasticsearch query model.
*/
public final class ElasticsearchQueryModel {
private final Map<String, Expression> conjunctivefilters;
private final Map<String, Expression> disjunctivefilters;
private final Map<String, Expression> facets;
private Expression sortexpr;
public ElasticsearchQueryModel() {
this.conjunctivefilters = new HashMap<>();
this.disjunctivefilters = new HashMap<>();
this.facets = new HashMap<>();
}
/**
* Determine if the key has a type. Default type is string.
*
* @param key the key to check
* @return the type of the key
*/
public TokenType getElasticsearchType(String key) {
if ("datetime".equals(key)) {
return TokenType.DATETIME;
}
if ("int".equals(key)) {
return TokenType.INT;
}
if ("long".equals(key)) {
return TokenType.INT;
}
if ("float".equals(key)) {
return TokenType.FLOAT;
}
return TokenType.STRING;
}
/**
* Get expression visibility of a given context.
*
* @param context the context
* @return true if visible
*/
public boolean getVisibility(String context) {
return !CQLQueryModel.isFacetContext(context)
&& !CQLQueryModel.isFilterContext(context)
&& !CQLQueryModel.isOptionContext(context);
}
/**
* Check if this context is the facet context.
*
* @param context the context
* @return true if facet context
*/
public boolean isFacetContext(String context) {
return CQLQueryModel.isFacetContext(context);
}
/**
* Check if this context is the filter context.
*
* @param context the context
* @return true if filter context
*/
public boolean isFilterContext(String context) {
return CQLQueryModel.isFilterContext(context);
}
public boolean hasFacets() {
return !facets.isEmpty();
}
public void addFacet(String key, String value) {
ElasticsearchFacet<Node> facet = new ElasticsearchFacet<>(ElasticsearchFacet.Type.TERMS, key, new Name(value));
facets.put(facet.getName(), new Expression(Operator.TERMS_FACET, facet.getValue()));
}
public Expression getFacetExpression() {
return new Expression(Operator.TERMS_FACET, facets.values().toArray(new Node[facets.size()]));
}
public void addConjunctiveFilter(String name, Node value, Operator op) {
addFilter(conjunctivefilters, new ElasticsearchFilter<>(name, value, op));
}
public void addDisjunctiveFilter(String name, Node value, Operator op) {
addFilter(disjunctivefilters, new ElasticsearchFilter<>(name, value, op));
}
public boolean hasFilter() {
return !conjunctivefilters.isEmpty() || !disjunctivefilters.isEmpty();
}
/**
* Get filter expression.
* Only one filter expression is allowed per query.
* First, build conjunctive and disjunctive filter terms.
* If both are null, there is no filter at all.
* Otherwise, combine conjunctive and disjunctive filter terms with a
* disjunction, and apply filter function, and return this expression.
*
* @return a single filter expression or null if there are no filter terms
*/
public Expression getFilterExpression() {
if (!hasFilter()) {
return null;
}
Expression conjunctiveclause = null;
if (!conjunctivefilters.isEmpty()) {
conjunctiveclause = new Expression(Operator.AND,
conjunctivefilters.values().toArray(new Node[conjunctivefilters.size()]));
}
Expression disjunctiveclause = null;
if (!disjunctivefilters.isEmpty()) {
disjunctiveclause = new Expression(Operator.OR,
disjunctivefilters.values().toArray(new Node[disjunctivefilters.size()]));
}
if (conjunctiveclause != null && disjunctiveclause == null) {
return conjunctiveclause;
} else if (conjunctiveclause == null && disjunctiveclause != null) {
return disjunctiveclause;
} else {
return new Expression(Operator.OR, conjunctiveclause, disjunctiveclause);
}
}
/**
* Add sort expression.
*
* @param indexAndModifier the index with modifiers
*/
public void setSort(Stack<Node> indexAndModifier) {
Stack<Node> reversed = reverse(indexAndModifier);
this.sortexpr = new Expression(Operator.SORT, reversed.toArray(new Node[reversed.size()]));
}
/**
* Get sort expression.
*
* @return the sort expression
*/
public Expression getSort() {
return sortexpr;
}
/**
* Helper method to add a filter.
*
* @param filters the filter list
* @param filter the filter to add
*/
private void addFilter(Map<String, Expression> filters, ElasticsearchFilter<Node> filter) {
Name name = new Name(filter.getName());
name.setType(getElasticsearchType(filter.getName()));
Node value = filter.getValue();
if (value instanceof Token) {
value = new Expression(filter.getFilterOperation(), name, value);
}
if (filters.containsKey(filter.getName())) {
Expression expression = filters.get(filter.getName());
expression = new Expression(expression, value);
filters.put(filter.getName(), expression);
} else {
filters.put(filter.getName(), (Expression) value);
}
}
/**
* Helper method to reverse an expression stack.
*
* @param in the stack to reverse
* @return the reversed stack
*/
private Stack<Node> reverse(Stack<Node> in) {
Stack<Node> out = new Stack<>();
while (!in.empty()) {
out.push(in.pop());
}
return out;
}
}
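The combination logic of `getFilterExpression()` above can be sketched with a hypothetical helper: the conjunctive and disjunctive trails are each built (or left null when empty), and when both exist they are joined under a single disjunction.

```java
public class FilterCombineSketch {

    // Mirrors getFilterExpression(): null stands for "no terms on that trail".
    static String combine(String conjunctive, String disjunctive) {
        if (conjunctive == null && disjunctive == null) {
            return null;                                   // no filter at all
        }
        if (disjunctive == null) {
            return conjunctive;
        }
        if (conjunctive == null) {
            return disjunctive;
        }
        return "OR(" + conjunctive + "," + disjunctive + ")"; // join both trails
    }

    public static void main(String[] args) {
        System.out.println(combine("AND(f1,f2)", "OR(f3,f4)"));
    }
}
```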

@ -0,0 +1,4 @@
/**
* Classes for Elasticsearch query model.
*/
package org.xbib.cql.elasticsearch.model;

@ -0,0 +1,4 @@
/**
* Classes for compiling CQL to Elasticsearch queries.
*/
package org.xbib.cql.elasticsearch;

@ -0,0 +1,230 @@
package org.xbib.cql.model;
import org.xbib.cql.AbstractNode;
import org.xbib.cql.BooleanOperator;
import org.xbib.cql.Term;
import org.xbib.cql.model.breadcrumb.FacetBreadcrumbTrail;
import org.xbib.cql.model.breadcrumb.FilterBreadcrumbTrail;
import org.xbib.cql.model.breadcrumb.OptionBreadcrumbTrail;
/**
* A CQL query model.
* Special contexts are <code>facet</code>, <code>filter</code>,
* and <code>option</code>.
* These contexts form breadcrumb trails.
* Breadcrumbs provide a means for a server to track a chronologically
* ordered set of client actions. Breadcrumbs are typically rendered as a
* user-driven constructed list of links, and are useful when
* users select them to drill down and up in a structure,
* so that they can find their way and have a notion of where they
* currently are.
* Breadcrumbs in the original sense just represent where users are
* situated in a site hierarchy. For example, when browsing a
* library catalog, bread crumbs could look like this:
* <pre>
* Home &gt; Scientific literature &gt; Arts &amp; Human &gt; Philosophy
* </pre>
* or
* <pre>
* Main library &gt; Branch library &gt; First floor &gt; Rare book room
* </pre>
* These items would be rendered as links to the corresponding location.
* This class is responsible for managing such a breadcrumb structure.
* A typical implementation regards breadcrumbs as a set of elements.
* When a breadcrumb that is not yet in the set is activated, it is added
* to the set; when a breadcrumb that is already in the set is activated,
* the trail rolls back to the corresponding depth.
* In this model, multiple breadcrumb trails may exist side by side. They are
* separate and do not depend on each other. There is a list of breadcrumb
* trails, and the notion of a currently active breadcrumb within a trail.
* This model does not make any presumptions on how it should interact with
* breadcrumbs except that a breadcrumb model should be serializable into
* a writer.
*/
public final class CQLQueryModel {
/**
* Contexts 'facet', 'filter', and 'option'.
*/
public static final String FACET_INDEX_NAME = "facet";
public static final String FILTER_INDEX_NAME = "filter";
public static final String OPTION_INDEX_NAME = "option";
private static final String AND_OP = " and ";
private static final String OR_OP = " or ";
/**
* the CQL query string.
*/
private String query;
/**
* breadcrumb trail for facets.
*/
private FacetBreadcrumbTrail facetTrail;
/**
* breadcrumb trail for conjunctive filters.
*/
private FilterBreadcrumbTrail conjunctivefilterTrail;
/**
* breadcrumb trail for disjunctive filters.
*/
private FilterBreadcrumbTrail disjunctivefilterTrail;
/**
* breadcrumb trail for options.
*/
private OptionBreadcrumbTrail optionTrail;
public CQLQueryModel() {
this.facetTrail = new FacetBreadcrumbTrail();
this.conjunctivefilterTrail = new FilterBreadcrumbTrail(BooleanOperator.AND);
this.disjunctivefilterTrail = new FilterBreadcrumbTrail(BooleanOperator.OR);
this.optionTrail = new OptionBreadcrumbTrail();
}
public void setQuery(String query) {
this.query = query;
}
public String getQuery() {
return query;
}
public void addFacet(Facet<Term> facet) {
facetTrail.add(facet);
}
public void removeFacet(Facet<Term> facet) {
facetTrail.remove(facet);
}
/**
* Add CQL filter.
*
* @param op boolean operator, AND for conjunctive filter, OR for disjunctive filter
* @param filter the filter to add
*/
public void addFilter(BooleanOperator op, Filter<AbstractNode> filter) {
if (op == BooleanOperator.AND && !disjunctivefilterTrail.contains(filter)) {
conjunctivefilterTrail.add(filter);
}
if (op == BooleanOperator.OR && !conjunctivefilterTrail.contains(filter)) {
disjunctivefilterTrail.add(filter);
}
}
/**
* Remove CQL filter.
*
* @param filter the filter to remove
*/
public void removeFilter(Filter<AbstractNode> filter) {
conjunctivefilterTrail.remove(filter);
disjunctivefilterTrail.remove(filter);
}
public void addOption(Option<?> option) {
optionTrail.add(option);
}
public void removeOption(Option<?> option) {
optionTrail.remove(option);
}
public FacetBreadcrumbTrail getFacetTrail() {
return facetTrail;
}
public String getFilterTrail() {
StringBuilder sb = new StringBuilder();
if (!conjunctivefilterTrail.isEmpty()) {
sb.append(AND_OP).append(conjunctivefilterTrail.toString());
}
if (disjunctivefilterTrail.size() == 1) {
sb.append(OR_OP).append(disjunctivefilterTrail.toString());
} else if (disjunctivefilterTrail.size() > 1) {
sb.append(AND_OP).append(disjunctivefilterTrail.toString());
}
return sb.toString();
}
/**
* Get the option breadcrumb trail.
*
* @return the option breadcrumb trail
*/
public OptionBreadcrumbTrail getOptionTrail() {
return optionTrail;
}
/**
* Determine whether a given context contributes to the visible query.
*
* @param context the context
* @return true if visible, false if not
*/
public static boolean isVisible(String context) {
return !isFacetContext(context)
&& !isFilterContext(context)
&& !isOptionContext(context);
}
/**
* Check if this context is the facet context.
*
* @param context the context
* @return true if facet context
*/
public static boolean isFacetContext(String context) {
return FACET_INDEX_NAME.equals(context);
}
/**
* Check if this context is the filter context.
*
* @param context the context
* @return true if filter context
*/
public static boolean isFilterContext(String context) {
return FILTER_INDEX_NAME.equals(context);
}
/**
* Check if this context is the option context.
*
* @param context the context
* @return true if option context
*/
public static boolean isOptionContext(String context) {
return OPTION_INDEX_NAME.equals(context);
}
/**
* Write the CQL query model as CQL string.
*
* @return the query model as CQL
*/
public String toCQL() {
StringBuilder sb = new StringBuilder(query);
String facets = getFacetTrail().toCQL();
if (facets.length() > 0) {
sb.append(AND_OP).append(facets);
}
String filters = getFilterTrail();
if (filters.length() > 0) {
sb.append(filters);
}
String options = getOptionTrail().toCQL();
if (options.length() > 0) {
sb.append(AND_OP).append(options);
}
return sb.toString();
}
}
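The assembly performed by `toCQL()` above can be sketched as follows (hypothetical helper; note that the facet and option trails are appended with `" and "`, while the filter trail already carries its own leading operator):

```java
public class ToCqlSketch {

    // Mirrors CQLQueryModel.toCQL() using plain strings for the trails.
    static String toCQL(String query, String facets, String filters, String options) {
        StringBuilder sb = new StringBuilder(query);
        if (!facets.isEmpty()) {
            sb.append(" and ").append(facets);
        }
        if (!filters.isEmpty()) {
            sb.append(filters);   // filter trail starts with " and " / " or " itself
        }
        if (!options.isEmpty()) {
            sb.append(" and ").append(options);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toCQL("title = fish", "facet.subject = biology", "", ""));
    }
}
```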

@ -0,0 +1,70 @@
package org.xbib.cql.model;
import org.xbib.cql.QueryFacet;
/**
* Facet.
*
* @param <V> parameter type
*/
public final class Facet<V> implements QueryFacet<V>, Comparable<Facet<V>> {
private int size;
private String filterName;
private String name;
private V value;
public Facet(String name) {
this.name = name;
}
public Facet(String name, String filterName, int size) {
this.name = name;
this.filterName = filterName;
this.size = size;
}
@Override
public void setName(String name) {
this.name = name;
}
@Override
public String getName() {
return name;
}
@Override
public void setValue(V value) {
this.value = value;
}
@Override
public V getValue() {
return value;
}
@Override
public int getSize() {
return size;
}
@Override
public String getFilterName() {
return filterName;
}
public String toCQL() {
return CQLQueryModel.FACET_INDEX_NAME + "." + name + " = " + value;
}
@Override
public int compareTo(Facet<V> o) {
return name.compareTo(o.getName());
}
@Override
public String toString() {
return toCQL();
}
}

@ -0,0 +1,68 @@
package org.xbib.cql.model;
import org.xbib.cql.QueryFilter;
import org.xbib.cql.Comparitor;
/**
* Filter.
* @param <V> filter parameter type
*/
public class Filter<V> implements QueryFilter<V>, Comparable<Filter<V>> {
private String name;
private V value;
private Comparitor op;
private String label;
public Filter(String name, V value, Comparitor op) {
this.name = name;
this.op = op;
this.value = value;
}
public Filter(String name, V value, Comparitor op, String label) {
this.name = name;
this.op = op;
this.value = value;
this.label = label;
}
public void setName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public void setValue(V value) {
this.value = value;
}
public V getValue() {
return value;
}
public Comparitor getFilterOperation() {
return op;
}
public String getLabel() {
return label;
}
public String toCQL() {
return CQLQueryModel.FILTER_INDEX_NAME + "." + name + " " + op.getToken() + " " + value;
}
@Override
public int compareTo(Filter<V> o) {
return toString().compareTo(o.toString());
}
@Override
public String toString() {
return name + " " + op + " " + value;
}
}

@@ -0,0 +1,48 @@
package org.xbib.cql.model;
import org.xbib.cql.QueryOption;
/**
* Option.
* @param <V> parameter type
*/
public class Option<V> implements QueryOption<V>, Comparable<Option<V>> {
private String name;
private V value;
@Override
public void setName(String name) {
this.name = name;
}
@Override
public String getName() {
return name;
}
@Override
public void setValue(V value) {
this.value = value;
}
@Override
public V getValue() {
return value;
}
public String toCQL() {
return CQLQueryModel.OPTION_INDEX_NAME + "." + name + " = " + value;
}
@Override
public int compareTo(Option<V> o) {
return name.compareTo(o.getName());
}
@Override
public String toString() {
return toCQL();
}
}

@@ -0,0 +1,32 @@
package org.xbib.cql.model.breadcrumb;
import org.xbib.cql.model.Facet;
import java.util.Iterator;
import java.util.TreeSet;
/**
* Facet breadcrumb trail.
*/
public class FacetBreadcrumbTrail extends TreeSet<Facet> {
@Override
public String toString() {
return toCQL();
}
public String toCQL() {
StringBuilder sb = new StringBuilder();
if (isEmpty()) {
return sb.toString();
}
Iterator<Facet> it = iterator();
if (it.hasNext()) {
sb.append(it.next().toCQL());
}
while (it.hasNext()) {
sb.append(" and ").append(it.next().toCQL());
}
return sb.toString();
}
}

@@ -0,0 +1,44 @@
package org.xbib.cql.model.breadcrumb;
import org.xbib.cql.BooleanOperator;
import org.xbib.cql.model.Filter;
import java.util.Iterator;
import java.util.TreeSet;
/**
* Filter breadcrumbs.
*/
public class FilterBreadcrumbTrail extends TreeSet<Filter> {
private BooleanOperator op;
public FilterBreadcrumbTrail(BooleanOperator op) {
super();
this.op = op;
}
@Override
public String toString() {
return toCQL();
}
public String toCQL() {
StringBuilder sb = new StringBuilder();
if (isEmpty()) {
return sb.toString();
}
if (op == BooleanOperator.OR && size() > 1) {
sb.append('(');
}
Iterator<Filter> it = this.iterator();
sb.append(it.next().toCQL());
while (it.hasNext()) {
sb.append(' ').append(op).append(' ').append(it.next().toCQL());
}
if (op == BooleanOperator.OR && size() > 1) {
sb.append(')');
}
return sb.toString();
}
}
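The `toCQL()` method above joins the filter terms with the trail's boolean operator and wraps an OR group of more than one term in parentheses. A minimal standalone sketch of that joining rule (hypothetical `TrailJoinDemo`, not part of this library; the real class operates on `Filter` objects and a `BooleanOperator`, not strings):

```java
import java.util.List;

// Hypothetical sketch of the joining rule in FilterBreadcrumbTrail.toCQL():
// terms are joined with the boolean operator, and an "or" group with more
// than one term is wrapped in parentheses so precedence is preserved.
public class TrailJoinDemo {
    public static String join(List<String> terms, String op) {
        if (terms.isEmpty()) {
            return "";
        }
        String joined = String.join(" " + op + " ", terms);
        return "or".equals(op) && terms.size() > 1 ? "(" + joined + ")" : joined;
    }
}
```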

@@ -0,0 +1,39 @@
package org.xbib.cql.model.breadcrumb;
import org.xbib.cql.model.Option;
import java.util.Iterator;
import java.util.TreeSet;
/**
* An Option breadcrumb trail is a trail of attributes (key/value pairs).
* Attributes are independent of each other: all values are allowed, and
* the trail does not resolve conflicts between them.
*/
public class OptionBreadcrumbTrail extends TreeSet<Option> {
@Override
public String toString() {
return toCQL();
}
/**
* Conjunct all CQL options to form a valid CQL string.
*
* @return the CQL string
*/
public String toCQL() {
StringBuilder sb = new StringBuilder();
if (isEmpty()) {
return sb.toString();
}
Iterator<Option> it = iterator();
if (it.hasNext()) {
sb.append(it.next().toCQL());
}
while (it.hasNext()) {
sb.append(" and ").append(it.next().toCQL());
}
return sb.toString();
}
}

@@ -0,0 +1,4 @@
/**
* Classes for breadcrumbs in the CQL model.
*/
package org.xbib.cql.model.breadcrumb;

@@ -0,0 +1,4 @@
/**
* Classes for CQL query modeling.
*/
package org.xbib.cql.model;

@@ -0,0 +1,4 @@
/**
* Classes for CQL queries.
*/
package org.xbib.cql;

@@ -0,0 +1,316 @@
package org.xbib.cql.util;
import java.text.ParseException;
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;
/**
 * Date formatting and parsing utilities. All methods are static and share
 * a single {@link SimpleDateFormat} instance guarded by synchronization.
 */
public class DateUtil {
public static final String ISO_FORMAT_SECONDS = "yyyy-MM-dd'T'HH:mm:ss'Z'";
public static final String ISO_FORMAT_DAYS = "yyyy-MM-dd";
public static final String RFC_FORMAT = "EEE, dd MMM yyyy HH:mm:ss 'GMT'";
public static final TimeZone GMT = TimeZone.getTimeZone("GMT");
private static final Calendar cal = Calendar.getInstance();
private static final SimpleDateFormat sdf = new SimpleDateFormat();
/**
* Number of milliseconds in a standard second.
*/
public static final long MILLIS_PER_SECOND = 1000;
/**
* Number of milliseconds in a standard minute.
*/
public static final long MILLIS_PER_MINUTE = 60 * MILLIS_PER_SECOND;
/**
* Number of milliseconds in a standard hour.
*/
public static final long MILLIS_PER_HOUR = 60 * MILLIS_PER_MINUTE;
/**
* Number of milliseconds in a standard day.
*/
public static final long MILLIS_PER_DAY = 24 * MILLIS_PER_HOUR;
/**
* The date masks tried, in order, when parsing.
*/
private static final String[] DATE_MASKS = {"yyyy-MM-dd'T'HH:mm:ssz", "yyyy-MM-dd'T'HH:mm:ss'Z'", "yyyy-MM-dd",
"yyyy"};
public static String formatNow() {
return formatDateISO(new Date());
}
public static String formatDate(Date date, String format) {
if (date == null) {
return null;
}
synchronized (sdf) {
sdf.applyPattern(format);
sdf.setTimeZone(GMT);
return sdf.format(date);
}
}
public static String formatDateISO(Date date) {
if (date == null) {
return null;
}
synchronized (sdf) {
sdf.applyPattern(ISO_FORMAT_SECONDS);
sdf.setTimeZone(GMT);
return sdf.format(date);
}
}
public static Date parseDateISO(String value) {
if (value == null) {
return null;
}
synchronized (sdf) {
sdf.applyPattern(ISO_FORMAT_SECONDS);
sdf.setTimeZone(GMT);
sdf.setLenient(true);
try {
return sdf.parse(value);
} catch (ParseException pe) {
// skip
}
sdf.applyPattern(ISO_FORMAT_DAYS);
try {
return sdf.parse(value);
} catch (ParseException pe) {
return null;
}
}
}
public static Date parseDateISO(String value, Date defaultDate) {
if (value == null) {
return defaultDate;
}
synchronized (sdf) {
sdf.applyPattern(ISO_FORMAT_SECONDS);
sdf.setTimeZone(GMT);
sdf.setLenient(true);
try {
return sdf.parse(value);
} catch (ParseException pe) {
// skip
}
sdf.applyPattern(ISO_FORMAT_DAYS);
try {
return sdf.parse(value);
} catch (ParseException pe) {
return defaultDate;
}
}
}
public static String formatDateRFC(Date date) {
if (date == null) {
return null;
}
synchronized (sdf) {
sdf.applyPattern(RFC_FORMAT);
sdf.setTimeZone(GMT);
return sdf.format(date);
}
}
public static Date parseDateRFC(String value) {
if (value == null) {
return null;
}
try {
synchronized (sdf) {
sdf.applyPattern(RFC_FORMAT);
sdf.setTimeZone(GMT);
return sdf.parse(value);
}
} catch (ParseException pe) {
return null;
}
}
public static int getYear() {
synchronized (cal) {
cal.setTime(new Date());
return cal.get(Calendar.YEAR);
}
}
public static String today() {
synchronized (cal) {
cal.setTime(new Date());
return String.format("%04d%02d%02d",
cal.get(Calendar.YEAR),
cal.get(Calendar.MONTH) + 1,
cal.get(Calendar.DAY_OF_MONTH));
}
}
public static int getYear(Date date) {
synchronized (cal) {
cal.setTime(date);
return cal.get(Calendar.YEAR);
}
}
public static Date midnight() {
return DateUtil.midnight(new Date());
}
public static Date midnight(Date date) {
synchronized (cal) {
cal.setTime(date);
cal.set(Calendar.HOUR_OF_DAY, 0);
cal.set(Calendar.MINUTE, 0);
cal.set(Calendar.SECOND, 0);
cal.set(Calendar.MILLISECOND, 0);
return cal.getTime();
}
}
public static Date min() {
return new Date(0L);
}
public static Date now() {
return new Date();
}
public static Date yesterday() {
return yesterday(new Date());
}
public static Date yesterday(Date date) {
return days(date, -1);
}
public static Date tomorrow() {
return tomorrow(new Date());
}
public static Date tomorrow(Date date) {
return days(date, 1);
}
public static Date years(int years) {
return years(new Date(), years);
}
public static Date years(Date date, int years) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.YEAR, years);
return cal.getTime();
}
}
public static Date months(int months) {
return months(new Date(), months);
}
public static Date months(Date date, int months) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.MONTH, months);
return cal.getTime();
}
}
public static Date weeks(int weeks) {
return weeks(new Date(), weeks);
}
public static Date weeks(Date date, int weeks) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.WEEK_OF_YEAR, weeks);
return cal.getTime();
}
}
public static Date days(int days) {
return days(new Date(), days);
}
public static Date days(Date date, int days) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.DAY_OF_YEAR, days);
return cal.getTime();
}
}
public static Date hours(int hours) {
return hours(new Date(), hours);
}
public static Date hours(Date date, int hours) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.HOUR_OF_DAY, hours);
return cal.getTime();
}
}
public static Date minutes(int minutes) {
return minutes(new Date(), minutes);
}
public static Date minutes(Date date, int minutes) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.MINUTE, minutes);
return cal.getTime();
}
}
public static Date seconds(int seconds) {
return seconds(new Date(), seconds);
}
public static Date seconds(Date date, int seconds) {
synchronized (cal) {
cal.setTime(date);
cal.add(Calendar.SECOND, seconds);
return cal.getTime();
}
}
public static Date parseDate(Object o) {
synchronized (sdf) {
sdf.setTimeZone(GMT);
sdf.setLenient(true);
if (o instanceof Date) {
return (Date) o;
} else if (o instanceof Long) {
Long longvalue = (Long) o;
String s = Long.toString(longvalue);
sdf.applyPattern(DATE_MASKS[3]);
Date d = sdf.parse(s, new ParsePosition(0));
if (d != null) {
return d;
}
} else if (o instanceof String) {
String value = (String) o;
for (String DATE_MASK : DATE_MASKS) {
sdf.applyPattern(DATE_MASK);
Date d = sdf.parse(value, new ParsePosition(0));
if (d != null) {
return d;
}
}
}
return null;
}
}
}
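`parseDate` and the `parseDateISO` variants above try each date mask in order and fall back to the next pattern when a parse fails. A minimal standalone sketch of that fallback strategy (hypothetical `DateMaskDemo`, not part of this library; it creates a fresh `SimpleDateFormat` per call instead of synchronizing on a shared one):

```java
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

// Hypothetical sketch of the fallback parsing in DateUtil.parseDate:
// try each mask in order and return the first successful parse, or null.
public class DateMaskDemo {
    private static final String[] MASKS = {
            "yyyy-MM-dd'T'HH:mm:ss'Z'", "yyyy-MM-dd", "yyyy"
    };

    public static Date parse(String value) {
        if (value == null) {
            return null;
        }
        // A new instance per call: SimpleDateFormat is not thread-safe,
        // which is why DateUtil synchronizes on its shared instance.
        SimpleDateFormat sdf = new SimpleDateFormat();
        sdf.setTimeZone(TimeZone.getTimeZone("GMT"));
        sdf.setLenient(true);
        for (String mask : MASKS) {
            sdf.applyPattern(mask);
            // parse with ParsePosition returns null on failure
            // instead of throwing ParseException
            Date d = sdf.parse(value, new ParsePosition(0));
            if (d != null) {
                return d;
            }
        }
        return null;
    }
}
```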

@@ -0,0 +1,363 @@
package org.xbib.cql.util;
import java.net.URI;
import java.net.URLDecoder;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
/**
* Splits an HTTP query string into a path string and key-value parameter pairs.
* This decoder is for one-time use only. Create a new instance for each URI:
* <pre>
* {@link QueryStringDecoder} decoder = new {@link QueryStringDecoder}("/hello?recipient=world&amp;x=1;y=2");
* assert decoder.getPath().equals("/hello");
* assert decoder.getParameters().get("recipient").get(0).equals("world");
* assert decoder.getParameters().get("x").get(0).equals("1");
* assert decoder.getParameters().get("y").get(0).equals("2");
* </pre>
* This decoder can also decode the content of an HTTP POST request whose
* content type is <tt>application/x-www-form-urlencoded</tt>:
* <pre>
* {@link QueryStringDecoder} decoder = new {@link QueryStringDecoder}("recipient=world&amp;x=1;y=2", false);
* </pre>
* <h3>HashDOS vulnerability fix</h3>
* As a workaround to the <a href="http://netty.io/s/hashdos">HashDOS</a> vulnerability, the decoder
* limits the maximum number of decoded key-value parameter pairs, up to {@literal 1024} by
* default, and you can configure it when you construct the decoder by passing an additional
* integer parameter.
*/
public class QueryStringDecoder {
private static final int DEFAULT_MAX_PARAMS = 1024;
private final Charset charset;
private final String uri;
private final boolean hasPath;
private final int maxParams;
private String path;
private Map<String, List<String>> params;
private int nParams;
/**
* Creates a new decoder that decodes the specified URI. The decoder will
* assume that the query string is encoded in UTF-8.
*/
public QueryStringDecoder(String uri) {
this(uri, StandardCharsets.UTF_8);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(String uri, boolean hasPath) {
this(uri, StandardCharsets.UTF_8, hasPath);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(String uri, Charset charset) {
this(uri, charset, true);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(String uri, Charset charset, boolean hasPath) {
this(uri, charset, hasPath, DEFAULT_MAX_PARAMS);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(String uri, Charset charset, boolean hasPath, int maxParams) {
if (uri == null) {
throw new NullPointerException("uri");
}
if (charset == null) {
throw new NullPointerException("charset");
}
if (maxParams <= 0) {
throw new IllegalArgumentException(
"maxParams: " + maxParams + " (expected: a positive integer)");
}
this.uri = uri;
this.charset = charset;
this.maxParams = maxParams;
this.hasPath = hasPath;
}
/**
* Creates a new decoder that decodes the specified URI. The decoder will
* assume that the query string is encoded in UTF-8.
*/
public QueryStringDecoder(URI uri) {
this(uri, StandardCharsets.UTF_8);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(URI uri, Charset charset) {
this(uri, charset, DEFAULT_MAX_PARAMS);
}
/**
* Creates a new decoder that decodes the specified URI encoded in the
* specified charset.
*/
public QueryStringDecoder(URI uri, Charset charset, int maxParams) {
if (uri == null) {
throw new NullPointerException("uri");
}
if (charset == null) {
throw new NullPointerException("charset");
}
if (maxParams <= 0) {
throw new IllegalArgumentException(
"maxParams: " + maxParams + " (expected: a positive integer)");
}
String rawPath = uri.getRawPath();
if (rawPath != null) {
hasPath = true;
} else {
rawPath = "";
hasPath = false;
}
// Also take care of truncated URIs like "http://localhost" that have no path
this.uri = rawPath + '?' + uri.getRawQuery();
this.charset = charset;
this.maxParams = maxParams;
}
/**
* Returns the uri used to initialize this {@link QueryStringDecoder}.
*/
public String uri() {
return uri;
}
/**
* Returns the decoded path string of the URI.
*/
public String path() {
if (path == null) {
if (!hasPath) {
return path = "";
}
int pathEndPos = uri.indexOf('?');
if (pathEndPos < 0) {
path = uri;
} else {
return path = uri.substring(0, pathEndPos);
}
}
return path;
}
/**
* Returns the decoded key-value parameter pairs of the URI.
*/
public Map<String, List<String>> parameters() {
if (params == null) {
if (hasPath) {
int pathLength = path().length();
if (uri.length() == pathLength) {
return Collections.emptyMap();
}
decodeParams(uri.substring(pathLength + 1));
} else {
if (uri.isEmpty()) {
return Collections.emptyMap();
}
decodeParams(uri);
}
}
return params;
}
private void decodeParams(String s) {
Map<String, List<String>> params = this.params = new LinkedHashMap<String, List<String>>();
nParams = 0;
String name = null;
int pos = 0; // Beginning of the unprocessed region
int i; // End of the unprocessed region
char c; // Current character
for (i = 0; i < s.length(); i++) {
c = s.charAt(i);
if (c == '=' && name == null) {
if (pos != i) {
name = decodeComponent(s.substring(pos, i), charset);
}
pos = i + 1;
// http://www.w3.org/TR/html401/appendix/notes.html#h-B.2.2
} else if (c == '&' || c == ';') {
if (name == null && pos != i) {
// We haven't seen an `=' so far but moved forward.
// Must be a param of the form '&a&' so add it with
// an empty value.
if (!addParam(params, decodeComponent(s.substring(pos, i), charset), "")) {
return;
}
} else if (name != null) {
if (!addParam(params, name, decodeComponent(s.substring(pos, i), charset))) {
return;
}
name = null;
}
pos = i + 1;
}
}
if (pos != i) { // Are there characters we haven't dealt with?
if (name == null) { // Yes and we haven't seen any `='.
addParam(params, decodeComponent(s.substring(pos, i), charset), "");
} else { // Yes and this must be the last value.
addParam(params, name, decodeComponent(s.substring(pos, i), charset));
}
} else if (name != null) { // Have we seen a name without value?
addParam(params, name, "");
}
}
private boolean addParam(Map<String, List<String>> params, String name, String value) {
if (nParams >= maxParams) {
return false;
}
List<String> values = params.get(name);
if (values == null) {
values = new ArrayList<String>(1); // Often there's only 1 value.
params.put(name, values);
}
values.add(value);
nParams++;
return true;
}
/**
* Decodes a bit of a URL encoded by a browser.
* This is equivalent to calling {@link #decodeComponent(String, Charset)}
* with the UTF-8 charset (recommended to comply with RFC 3986, Section 2).
*
* @param s The string to decode (can be empty).
* @return The decoded string, or {@code s} if there's nothing to decode.
* If the string to decode is {@code null}, returns an empty string.
* @throws IllegalArgumentException if the string contains a malformed
* escape sequence.
*/
public static String decodeComponent(final String s) {
return decodeComponent(s, StandardCharsets.UTF_8);
}
/**
* Decodes a bit of a URL encoded by a browser.
* The string is expected to be encoded as per RFC 3986, Section 2.
* This is the encoding used by JavaScript functions {@code encodeURI}
* and {@code encodeURIComponent}, but not {@code escape}. For example
* in this encoding, &eacute; (in Unicode {@code U+00E9} or in UTF-8
* {@code 0xC3 0xA9}) is encoded as {@code %C3%A9} or {@code %c3%a9}.
* This is essentially equivalent to calling
* {@link URLDecoder#decode(String, String) URLDecoder.decode(s, charset.name())}
* except that it's over 2x faster and generates less garbage for the GC.
* Actually this function doesn't allocate any memory if there's nothing
* to decode, the argument itself is returned.
*
* @param s The string to decode (can be empty).
* @param charset The charset to use to decode the string (should really
* be UTF-8).
* @return The decoded string, or {@code s} if there's nothing to decode.
* If the string to decode is {@code null}, returns an empty string.
* @throws IllegalArgumentException if the string contains a malformed
* escape sequence.
*/
public static String decodeComponent(final String s, final Charset charset) {
if (s == null) {
return "";
}
final int size = s.length();
boolean modified = false;
for (int i = 0; i < size; i++) {
final char c = s.charAt(i);
if (c == '%' || c == '+') {
modified = true;
break;
}
}
if (!modified) {
return s;
}
final byte[] buf = new byte[size];
int pos = 0; // position in `buf'.
for (int i = 0; i < size; i++) {
char c = s.charAt(i);
switch (c) {
case '+':
buf[pos++] = ' '; // "+" -> " "
break;
case '%':
if (i == size - 1) {
throw new IllegalArgumentException("unterminated escape"
+ " sequence at end of string: " + s);
}
c = s.charAt(++i);
if (c == '%') {
buf[pos++] = '%'; // "%%" -> "%"
break;
}
if (i == size - 1) {
throw new IllegalArgumentException("partial escape"
+ " sequence at end of string: " + s);
}
c = decodeHexNibble(c);
final char c2 = decodeHexNibble(s.charAt(++i));
if (c == Character.MAX_VALUE || c2 == Character.MAX_VALUE) {
throw new IllegalArgumentException(
"invalid escape sequence `%" + s.charAt(i - 1)
+ s.charAt(i) + "' at index " + (i - 2)
+ " of: " + s);
}
c = (char) (c * 16 + c2);
// Fall through.
default:
buf[pos++] = (byte) c;
break;
}
}
return new String(buf, 0, pos, charset);
}
/**
* Helper to decode half of a hexadecimal number from a string.
*
* @param c The ASCII character of the hexadecimal number to decode.
* Must be in the range {@code [0-9a-fA-F]}.
* @return The hexadecimal value represented in the ASCII character
* given, or {@link Character#MAX_VALUE} if the character is invalid.
*/
private static char decodeHexNibble(final char c) {
if ('0' <= c && c <= '9') {
return (char) (c - '0');
} else if ('a' <= c && c <= 'f') {
return (char) (c - 'a' + 10);
} else if ('A' <= c && c <= 'F') {
return (char) (c - 'A' + 10);
} else {
return Character.MAX_VALUE;
}
}
}
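`decodeComponent` above applies three rules: `'+'` becomes a space, `"%%"` is a literal `'%'`, and `"%XY"` is a percent-encoded byte decoded via two hex nibbles. A minimal standalone sketch of those rules (hypothetical `PercentDecodeDemo`, not part of this library; it handles ASCII only, whereas the real method decodes the collected bytes through a charset):

```java
// Hypothetical sketch of the decoding rules in
// QueryStringDecoder.decodeComponent, for ASCII input only.
public class PercentDecodeDemo {
    public static String decode(String s) {
        StringBuilder sb = new StringBuilder(s.length());
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c == '+') {
                sb.append(' '); // "+" -> " "
            } else if (c == '%') {
                if (i + 1 < s.length() && s.charAt(i + 1) == '%') {
                    sb.append('%'); // "%%" -> "%"
                    i++;
                } else if (i + 2 < s.length()) {
                    int hi = Character.digit(s.charAt(i + 1), 16);
                    int lo = Character.digit(s.charAt(i + 2), 16);
                    if (hi < 0 || lo < 0) {
                        throw new IllegalArgumentException("invalid escape in: " + s);
                    }
                    sb.append((char) (hi * 16 + lo)); // ASCII-only in this sketch
                    i += 2;
                } else {
                    throw new IllegalArgumentException("unterminated escape in: " + s);
                }
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }
}
```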

@@ -0,0 +1,233 @@
package org.xbib.cql.util;
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.StringTokenizer;
/**
* A string tokenizer that understands quotes and escape characters.
*/
public class QuotedStringTokenizer extends StringTokenizer implements Iterator<String> {
private String str;
private String delim;
private String quotes;
private char escape;
private boolean returnDelims;
private int pos;
private int len;
private StringBuilder token;
/**
* Constructs a string tokenizer for the specified string.
* The default delimiters for StringTokenizer are used.
* "\"\'" are used as quotes, and '\\' is used as the escape character.
*/
public QuotedStringTokenizer(String str) {
this(str, " \t\n\r\f", "\"\'", '\\', false);
}
/**
* Constructs a string tokenizer for the specified string.
* "\"\'" are used as quotes, and '\\' is used as the escape character.
*/
public QuotedStringTokenizer(String str, String delim) {
this(str, delim, "\"\'", '\\', false);
}
/**
* Constructs a string tokenizer for the specified string.
* Quotes cannot be delimiters, and the escape character can be neither a
* quote nor a delimiter.
*/
public QuotedStringTokenizer(String str, String delim, String quotes, char escape, boolean returnDelims) {
super(str, delim, returnDelims);
this.str = str;
this.len = str.length();
this.delim = delim;
this.quotes = quotes;
this.pos = 0;
for (int i = 0; i < quotes.length(); i++) {
if (delim.indexOf(quotes.charAt(i)) >= 0) {
throw new IllegalArgumentException("Invalid quote character '" + quotes.charAt(i) + "'");
}
}
this.escape = escape;
if (delim.indexOf(escape) >= 0) {
throw new IllegalArgumentException("Invalid escape character '" + escape + "'");
}
if (quotes.indexOf(escape) >= 0) {
throw new IllegalArgumentException("Invalid escape character '" + escape + "'");
}
this.returnDelims = returnDelims;
}
/**
* Returns the position of the next non-delimiter character.
* Pre-condition: not inside a quoted string (token).
*/
private int skipDelim(int pos) {
int p = pos;
while (p < len && delim.indexOf(str.charAt(p)) >= 0) {
p++;
}
return p;
}
/**
* Returns the position of the next delimiter character after the token.
* If collect is true, collects the token into the StringBuffer.
* Pre-condition: not on a delimiter.
*/
private int skipToken(int pos, boolean collect) {
int p = pos;
if (collect) {
token = new StringBuilder();
}
boolean quoted = false;
char quote = '\000';
boolean escaped = false;
for (; p < len; p++) {
char curr = str.charAt(p);
if (escaped) {
escaped = false;
if (collect) {
token.append(curr);
}
continue;
}
if (curr == escape) { // escape character
escaped = true;
continue;
}
if (quoted) {
if (curr == quote) { // closing quote
quoted = false;
quote = '\000';
} else if (collect) {
token.append(curr);
}
continue;
}
if (quotes.indexOf(curr) >= 0) {
// opening quote
quoted = true;
quote = curr;
continue;
}
if (delim.indexOf(str.charAt(p)) >= 0) {
// unquoted delimiter
break;
}
if (collect) {
token.append(curr);
}
}
if (escaped || quoted) {
throw new UnterminatedQuotedStringException(str);
}
return p;
}
/**
* Tests if there are more tokens available from this tokenizer's string.
* Pre-condition: not inside a quoted string (token).
*/
@Override
public boolean hasMoreTokens() {
if (!returnDelims) {
pos = skipDelim(pos);
}
return (pos < len);
}
/**
* Returns the next token from this string tokenizer.
*/
@Override
public String nextToken() {
if (!returnDelims) {
pos = skipDelim(pos);
}
if (pos >= len) {
throw new NoSuchElementException();
}
if (returnDelims && delim.indexOf(str.charAt(pos)) >= 0) {
return String.valueOf(str.charAt(pos++));
}
pos = skipToken(pos, true);
return token.toString();
}
/**
* Returns the next token in this string tokenizer's string.
*/
@Override
public String nextToken(String delim) {
this.delim = delim;
return nextToken();
}
/**
* Calculates the number of times that this tokenizer's nextToken method
* can be called before it generates an exception.
*/
@Override
public int countTokens() {
int count = 0;
int dcount = 0;
int curr = pos;
while (curr < len) {
if (delim.indexOf(str.charAt(curr)) >= 0) {
curr++;
dcount++;
} else {
curr = skipToken(curr, false);
count++;
}
}
if (returnDelims) {
return count + dcount;
}
return count;
}
/**
* Returns the same value as the hasMoreTokens method.
*/
@Override
public boolean hasMoreElements() {
return hasMoreTokens();
}
/**
* Returns the same value as the nextToken method, except that its declared
* return value is Object rather than String.
*/
@Override
public Object nextElement() {
return nextToken();
}
@Override
public boolean hasNext() {
return hasMoreTokens();
}
@Override
public String next() {
return nextToken();
}
@Override
public void remove() {
throw new UnsupportedOperationException("remove");
}
}
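The heart of the tokenizer above is the state machine in `skipToken`: whitespace splits tokens, a quoted run keeps its spaces, and the escape character makes the next character literal. A minimal standalone sketch of that state machine (hypothetical `QuoteSplitDemo`, not part of this library; it fixes the default delimiters, quotes, and escape instead of taking them as parameters):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the quote/escape state machine in
// QuotedStringTokenizer.skipToken, with the default settings:
// whitespace delimiters, " and ' as quotes, backslash as escape.
public class QuoteSplitDemo {
    public static List<String> split(String s) {
        List<String> tokens = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inToken = false;
        boolean escaped = false;
        char quote = 0; // 0 means "not inside a quoted run"
        for (char c : s.toCharArray()) {
            if (escaped) {
                cur.append(c); // escaped character is taken literally
                escaped = false;
                inToken = true;
            } else if (c == '\\') {
                escaped = true;
                inToken = true;
            } else if (quote != 0) {
                if (c == quote) {
                    quote = 0; // closing quote
                } else {
                    cur.append(c); // inside quotes, keep everything
                }
            } else if (c == '"' || c == '\'') {
                quote = c; // opening quote
                inToken = true;
            } else if (Character.isWhitespace(c)) {
                if (inToken) {
                    tokens.add(cur.toString());
                    cur.setLength(0);
                    inToken = false;
                }
            } else {
                cur.append(c);
                inToken = true;
            }
        }
        if (inToken) {
            tokens.add(cur.toString());
        }
        return tokens;
    }
}
```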

@@ -0,0 +1,11 @@
package org.xbib.cql.util;
/**
* Exception for string tokenizing.
*/
public class UnterminatedQuotedStringException extends RuntimeException {
public UnterminatedQuotedStringException(String msg) {
super(msg);
}
}

@@ -0,0 +1,4 @@
/**
* Classes for CQL utilities.
*/
package org.xbib.cql.util;

@@ -0,0 +1,194 @@
package org.xbib.cql;
import java.io.IOException;
%%
%class CQLLexer
%implements CQLTokens
%unicode
%integer
%eofval{
return 0;
%eofval}
%line
%column
%{
private Object yylval;
private int token;
private StringBuilder sb = new StringBuilder();
public int getToken() {
return token;
}
public int nextToken() {
try {
token = yylex();
return token;
}
catch (IOException e) {
throw new RuntimeException(e);
}
}
public Object getSemantic() {
return yylval;
}
public int getLine() {
return yyline;
}
public int getColumn() {
return yycolumn;
}
%}
NL = \n | \r | \r\n
LPAR = "("
RPAR = ")"
AND = [aA][nN][dD]
OR = [oO][rR]
NOT = [nN][oO][tT]
PROX = [pP][rR][oO][xX]
SORTBY = [sS][oO][rR][tT][bB][yY]
SIMPLESTRING = [^ \t\"()=<>\/]+
QUOTEDSTRING = [^\"]
LT = "<"
GT = ">"
EQ = "="
GE = ">="
LE = "<="
NE = "<>"
EXACT = "=="
NAMEDCOMPARITORS = [cC][qQ][lL] "." [eE][xX][aA][cC][tT] | [eE][xX][aA][cC][tT] | [cC][qQ][lL] "." [wW][iI][tT][hH][iI][nN] | [wW][iI][tT][hH][iI][nN] | [cC][qQ][lL] "." [aA][dD][jJ] | [aA][dD][jJ] | [cC][qQ][lL] "." [aA][lL][lL] | [aA][lL][lL] | [cC][qQ][lL] "." [aA][nN][yY] | [aA][nN][yY] | [cC][qQ][lL] "." [eE][nN][cC][lL][oO][sS][eE][sS] | [eE][nN][cC][lL][oO][sS][eE][sS]
INTEGER = 0 | [1-9][0-9]*
FLOAT = [0-9]+ "." [0-9]+
SLASH = "/"
%state STRING2
%%
<YYINITIAL>\" {
yybegin(STRING2);
sb.setLength(0);
}
<STRING2> {
\\\" {
sb.append("\"");
}
{QUOTEDSTRING} {
sb.append(yytext());
}
\" {
yybegin(YYINITIAL);
yylval = sb.toString();
return QUOTEDSTRING;
}
}
<YYINITIAL>{NL} {
return NL;
}
<YYINITIAL>" "|\t {
}
<YYINITIAL>{FLOAT} {
yylval = Double.parseDouble(yytext());
return FLOAT;
}
<YYINITIAL>{INTEGER} {
yylval = Long.parseLong(yytext());
return INTEGER;
}
<YYINITIAL>{NAMEDCOMPARITORS} {
yylval = yytext();
return NAMEDCOMPARITORS;
}
<YYINITIAL>{GE} {
yylval = yytext();
return GE;
}
<YYINITIAL>{LE} {
yylval = yytext();
return LE;
}
<YYINITIAL>{NE} {
yylval = yytext();
return NE;
}
<YYINITIAL>{EXACT} {
yylval = yytext();
return EXACT;
}
<YYINITIAL>{GT} {
yylval = yytext();
return GT;
}
<YYINITIAL>{LT} {
yylval = yytext();
return LT;
}
<YYINITIAL>{EQ} {
yylval = yytext();
return EQ;
}
<YYINITIAL>{AND} {
yylval = yytext();
return AND;
}
<YYINITIAL>{OR} {
yylval = yytext();
return OR;
}
<YYINITIAL>{NOT} {
yylval = yytext();
return NOT;
}
<YYINITIAL>{PROX} {
yylval = yytext();
return PROX;
}
<YYINITIAL>{SORTBY} {
yylval = yytext();
return SORTBY;
}
<YYINITIAL>{SIMPLESTRING} {
yylval = yytext();
return SIMPLESTRING;
}
<YYINITIAL>{LPAR} {
yylval = yytext();
return LPAR;
}
<YYINITIAL>{RPAR} {
yylval = yytext();
return RPAR;
}
<YYINITIAL>{SLASH} {
yylval = yytext();
return SLASH;
}

@@ -0,0 +1,60 @@
package org.xbib.cql;
import org.junit.Assert;
import org.junit.Test;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.LineNumberReader;
import java.nio.charset.StandardCharsets;
/**
 * Tests for valid CQL queries.
 */
public class QueryTest extends Assert {
@Test
public void testValidQueries() throws IOException {
test("queries.txt");
}
private void test(String path) throws IOException {
int count = 0;
int ok = 0;
int errors = 0;
LineNumberReader lr = new LineNumberReader(new InputStreamReader(getClass().getResourceAsStream(path),
StandardCharsets.UTF_8));
String line;
while ((line = lr.readLine()) != null) {
if (line.trim().length() > 0 && !line.startsWith("#")) {
try {
int pos = line.indexOf('|');
if (pos > 0) {
validate(line.substring(0, pos), line.substring(pos + 1));
} else {
validate(line);
}
ok++;
} catch (Exception e) {
errors++;
}
count++;
}
}
lr.close();
assertEquals(0, errors);
assertEquals(count, ok);
}
private void validate(String line) throws Exception {
CQLParser parser = new CQLParser(line);
parser.parse();
assertEquals(line, parser.getCQLQuery().toString());
}
private void validate(String line, String expected) throws Exception {
CQLParser parser = new CQLParser(line);
parser.parse();
assertEquals(expected, parser.getCQLQuery().toString());
}
}

@@ -0,0 +1,128 @@
package org.xbib.cql.elasticsearch;
import org.junit.Assert;
import org.junit.Test;
import org.xbib.cql.CQLParser;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.LineNumberReader;
import java.nio.charset.StandardCharsets;
/**
 * Tests translating CQL queries into Elasticsearch queries and filters.
 */
public class ElasticsearchQueryTest extends Assert {
@Test
public void testValidQueries() throws IOException {
test("queries.txt");
}
@Test
public void testSimpleTermFilter() throws Exception {
String cql = "Jörg";
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchFilterGenerator generator = new ElasticsearchFilterGenerator();
parser.getCQLQuery().accept(generator);
String json = generator.getResult().string();
assertEquals("{\"term\":{\"cql.allIndexes\":\"Jörg\"}}", json);
}
@Test
public void testFieldTermFilter() throws Exception {
String cql = "dc.type = electronic";
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchFilterGenerator generator = new ElasticsearchFilterGenerator();
parser.getCQLQuery().accept(generator);
String json = generator.getResult().string();
assertEquals("{\"query\":{\"term\":{\"dc.type\":\"electronic\"}}}", json);
}
@Test
public void testDoubleFieldTermFilter() throws Exception {
String cql = "dc.type = electronic and dc.date = 2013";
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchFilterGenerator generator = new ElasticsearchFilterGenerator();
parser.getCQLQuery().accept(generator);
String json = generator.getResult().string();
assertEquals(
"{\"query\":{\"bool\":{\"must\":[{\"term\":{\"dc.type\":\"electronic\"}},{\"term\":{\"dc.date\":\"2013\"}}]}}}",
json
);
}
@Test
public void testTripleFieldTermFilter() throws Exception {
String cql = "dc.format = online and dc.type = electronic and dc.date = 2013";
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchFilterGenerator generator = new ElasticsearchFilterGenerator();
parser.getCQLQuery().accept(generator);
String json = generator.getResult().string();
assertEquals(
"{\"query\":{\"bool\":{\"must\":[{\"bool\":{\"must\":[{\"term\":{\"dc.format\":\"online\"}}," +
"{\"term\":{\"dc.type\":\"electronic\"}}]}},{\"term\":{\"dc.date\":\"2013\"}}]}}}",
json);
}
@Test
public void testBoost() throws Exception {
String cql = "Jörg";
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchQueryGenerator generator = new ElasticsearchQueryGenerator();
generator.setBoostParams("boost", "log2p", 2.0f, "sum");
parser.getCQLQuery().accept(generator);
String json = generator.getSourceResult();
assertEquals(
"{\"from\":0,\"size\":10,\"query\":{\"function_score\":{\"field_value_factor\":{\"field\":\"boost\"," +
"\"modifier\":\"log2p\",\"factor\":2.0},\"boost_mode\":\"sum\"," +
"\"query\":{\"simple_query_string\":{\"query\":\"Jörg\",\"fields\":[\"cql.allIndexes\"]," +
"\"analyze_wildcard\":true,\"default_operator\":\"and\"}}}}}",
json);
}
    private void test(String path) throws IOException {
        int count = 0;
        int ok = 0;
        int errors = 0;
        try (LineNumberReader lr = new LineNumberReader(new InputStreamReader(getClass().getResourceAsStream(path),
                StandardCharsets.UTF_8))) {
            String line;
            while ((line = lr.readLine()) != null) {
                if (line.trim().length() > 0 && !line.startsWith("#")) {
                    try {
                        int pos = line.indexOf('|');
                        if (pos > 0) {
                            validate(line.substring(0, pos), line.substring(pos + 1));
                            ok++;
                        }
                    } catch (Exception e) {
                        errors++;
                    }
                    count++;
                }
            }
        }
        assertEquals(0, errors);
        assertEquals(count, ok);
    }
private void validate(String cql, String expected) throws Exception {
CQLParser parser = new CQLParser(cql);
parser.parse();
ElasticsearchQueryGenerator generator = new ElasticsearchQueryGenerator();
parser.getCQLQuery().accept(generator);
String elasticsearchQuery = generator.getSourceResult();
assertEquals(expected, elasticsearchQuery);
}
}


@@ -0,0 +1,4 @@
/**
* Classes for Elasticsearch CQL testing.
*/
package org.xbib.cql.elasticsearch;


@@ -0,0 +1,4 @@
/**
* Classes for CQL testing.
*/
package org.xbib.cql;


@@ -0,0 +1,22 @@
package org.xbib.cql.util;
import static org.junit.Assert.assertEquals;
import org.junit.Test;
/**
 * Tests for {@link QuotedStringTokenizer}.
 */
public class QuotedStringTokenizerTest {
@Test
public void testTokenizer() throws Exception {
String s = "Linux is \"pinguin's best friend\", not Windows";
QuotedStringTokenizer tokenizer = new QuotedStringTokenizer(s);
assertEquals("Linux", tokenizer.nextToken());
assertEquals("is", tokenizer.nextToken());
assertEquals("pinguin's best friend,", tokenizer.nextToken());
assertEquals("not", tokenizer.nextToken());
assertEquals("Windows", tokenizer.nextToken());
}
}


@@ -0,0 +1,4 @@
/**
* Classes for CQL utilities.
*/
package org.xbib.cql.util;


@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="OFF">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="[%d{ABSOLUTE}][%-5p][%-25c][%t] %m%n"/>
        </Console>
    </Appenders>
<Loggers>
<Root level="info">
<AppenderRef ref="Console" />
</Root>
</Loggers>
</configuration>


@@ -0,0 +1,123 @@
id = 8a666b7e-6597-3cfb-b478-313cc3c25011|{"from":0,"size":10,"query":{"simple_query_string":{"query":"8a666b7e-6597-3cfb-b478-313cc3c25011","fields":["id"],"analyze_wildcard":true,"default_operator":"and"}}}
unix|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
financing|{"from":0,"size":10,"query":{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"Christine Wolfinger"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"Christine Wolfinger\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"der die das"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"der die das\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
1234|{"from":0,"size":10,"query":{"simple_query_string":{"query":"1234","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"1234"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"1234\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"unix AND wolfinger"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"unix AND wolfinger\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"to be or not to be"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"to be or not to be\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"not macht erfinderisch"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"not macht erfinderisch\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"to be or not to be"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"to be or not to be\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
unix$|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix$","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
"^linux"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"^linux\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
finan*|{"from":0,"size":10,"query":{"simple_query_string":{"query":"finan*","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
finan?|{"from":0,"size":10,"query":{"simple_query_string":{"query":"finan?","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
finan*ng|{"from":0,"size":10,"query":{"simple_query_string":{"query":"finan*ng","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
finan?ier?ng|{"from":0,"size":10,"query":{"simple_query_string":{"query":"finan?ier?ng","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
title = "duck"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"duck\"","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}
title = "Dinosaur Systematics"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"Dinosaur Systematics\"","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}
title <> linux|{"from":0,"size":10,"query":{"bool":{"must_not":{"simple_query_string":{"query":"linux","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}}}
cql.resultSetId = HT000011990|{"from":0,"size":10,"query":{"simple_query_string":{"query":"HT000011990","fields":["cql.resultSetId"],"analyze_wildcard":true,"default_operator":"and"}}}
cql.allRecords = 2|{"from":0,"size":10,"query":{"simple_query_string":{"query":"2","fields":["cql.allRecords"],"analyze_wildcard":true,"default_operator":"and"}}}
cql.allRecords = 1 NOT title = fish|{"from":0,"size":10,"query":{"bool":{"must_not":[{"simple_query_string":{"query":"1","fields":["cql.allRecords"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"fish","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
title any "unix linux"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix linux","fields":["title"],"analyze_wildcard":true,"default_operator":"or"}}}
title all "unix linux"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix linux","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}
title all "unix 'linux' test"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix 'linux' test","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}
title all "linux \"pinguin's best friend\" unix"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"linux \"pinguin's best friend\" unix","fields":["title"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.title adj "lord of the rings"|{"from":0,"size":10,"query":{"match_phrase":{"dc.title":{"query":"lord of the rings","slop":0}}}}
anywhere = "linux unix \"grundkurs für einsteiger\""|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"linux unix \\\"grundkurs für einsteiger\\\"\"","fields":["anywhere"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.date=2003|{"from":0,"size":10,"query":{"simple_query_string":{"query":"2003","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.date="2003"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"2003\"","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.creator=smith|{"from":0,"size":10,"query":{"simple_query_string":{"query":"smith","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.title=financing|{"from":0,"size":10,"query":{"simple_query_string":{"query":"financing","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.subject=financing|{"from":0,"size":10,"query":{"simple_query_string":{"query":"financing","fields":["dc.subject"],"analyze_wildcard":true,"default_operator":"and"}}}
"feathered dinosaur" and (yixian or jehol)|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"feathered dinosaur\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"bool":{"should":[{"simple_query_string":{"query":"yixian","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"jehol","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
(a or b) and (c or d)|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"a","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"b","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"should":[{"simple_query_string":{"query":"c","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"d","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
unix AND wolfinger|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"wolfinger","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
"keine angst" AND unix|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"keine angst\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title=unix or wolfinger|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"unix","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"wolfinger","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
(dc.title = unix or dc.date = 2003) and ( dc.creator = wolfinger and dc.creator = christine or dc.creator = maier )|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"unix","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"2003","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"should":[{"bool":{"must":[{"simple_query_string":{"query":"wolfinger","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"christine","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"simple_query_string":{"query":"maier","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
financing AND success|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
financing OR monetary|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"monetary","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
financing NOT success|{"from":0,"size":10,"query":{"bool":{"must_not":[{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
(financing AND monetary) OR success|{"from":0,"size":10,"query":{"bool":{"should":[{"bool":{"must":[{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"monetary","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
financing AND (monetary OR success)|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"financing","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"bool":{"should":[{"simple_query_string":{"query":"monetary","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
"financing constraints" OR success|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"\"financing constraints\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
"financing constraints" NOT model|{"from":0,"size":10,"query":{"bool":{"must_not":[{"simple_query_string":{"query":"\"financing constraints\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"model","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
("financing constraints" AND model) OR success|{"from":0,"size":10,"query":{"bool":{"should":[{"bool":{"must":[{"simple_query_string":{"query":"\"financing constraints\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"model","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
"financing constraints" AND (model OR success)|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"financing constraints\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"bool":{"should":[{"simple_query_string":{"query":"model","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"success","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
dinosaur or bird|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"dinosaur","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"bird","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dino and "eiszeit"|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"dino","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"eiszeit\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dinosaur not reptile|{"from":0,"size":10,"query":{"bool":{"must_not":[{"simple_query_string":{"query":"dinosaur","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"reptile","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "christine" )|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"linux\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"must":[{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"christine\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
date = 2007-09-30 or date = "2007-09-30T12:34:56"|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"2007-09-30","fields":["date"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"2007-09-30T12:34:56\"","fields":["date"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dinosaur and bird or dinobird|{"from":0,"size":10,"query":{"bool":{"should":[{"bool":{"must":[{"simple_query_string":{"query":"dinosaur","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"bird","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"simple_query_string":{"query":"dinobird","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
(bird or dinosaur) and (feathers or scales)|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"bird","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"dinosaur","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"should":[{"simple_query_string":{"query":"feathers","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"scales","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
linux and creator = wolfinger|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"linux","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"wolfinger","fields":["creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title=linux and dc.title = unix|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"linux","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"unix","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title = unix and dc.date = 2000|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"unix","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"2000","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title = "unix" and dc.creator = "wolfinger"|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title = "unix" or dc.creator = "wolfinger"|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.title = "unix" and ( dc.creator = "wolfinger" or dc.creator = "meyer" )|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"bool":{"should":[{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"meyer\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
dc.title = "unix" and dc.creator = "wolfinger" and dc.creator = "christine"|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"must":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"simple_query_string":{"query":"\"christine\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "meyer" )|{"from":0,"size":10,"query":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"linux\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"must":[{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"meyer\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
dc.title = "foo" and (dc.creator = "smith" or dc.creator = "jones")|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"foo\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"bool":{"should":[{"simple_query_string":{"query":"\"smith\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"jones\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}
dc.creator = "smith" and dc.creator = "jones"|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"smith\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"jones\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
dc.date = 2007-09-30 or dc.date = "2007-09-30T12:34:56"|{"from":0,"size":10,"query":{"bool":{"should":[{"simple_query_string":{"query":"2007-09-30","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"2007-09-30T12:34:56\"","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}]}}}
identifier = 0783923126590|{"from":0,"size":10,"query":{"simple_query_string":{"query":"0783923126590","fields":["identifier"],"analyze_wildcard":true,"default_operator":"and"}}}
identifier = "9783923126590"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"9783923126590\"","fields":["identifier"],"analyze_wildcard":true,"default_operator":"and"}}}
identifier = "9783923126590*"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"9783923126590*\"","fields":["identifier"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.identifier =/bib.identifierAuthority=isbn "0201563177"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"0201563177\"","fields":["bib.identifierAuthority=isbn"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.identifier =/bib.identifierAuthority=isbn "0201563177" and dc.title=unix sortby dc.date|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"\"0201563177\"","fields":["bib.identifierAuthority=isbn"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"unix","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}},"sort":[{"dc.date":{"unmapped_type":"string","missing":"_last"}}]}
dc.date > 2007-09-30 and dc.date < "2007-10-30T12:34:56"|{"from":0,"size":10,"query":{"bool":{"must":[{"range":{"dc.date":{"from":"2007-09-30","include_lower":false}}},{"range":{"dc.date":{"to":"\"2007-10-30T12:34:56\"","include_upper":false}}}]}}}
date > 2007-01-01|{"from":0,"size":10,"query":{"range":{"date":{"from":"2007-01-01","include_lower":false}}}}
dc.date <= 2006-07-01|{"from":0,"size":10,"query":{"range":{"dc.date":{"to":"2006-07-01","include_upper":true}}}}
dc.date >= 2005-02-31|{"from":0,"size":10,"query":{"range":{"dc.date":{"from":"2005-02-31","include_lower":true}}}}
dc.date > 2011|{"from":0,"size":10,"query":{"range":{"dc.date":{"from":"2011","include_lower":false}}}}
dc.date = "> 2003"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"> 2003\"","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}}
dc.date = "20012010"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"20012010\"","fields":["dc.date"],"analyze_wildcard":true,"default_operator":"and"}}}
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "meyer" ) and filter.subject = "computer"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"linux\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"must":[{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"meyer\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}},"filter":{"term":{"subject":"computer"}}}}}
unix and filter.date > 2006-01-01|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"range":{"date":{"from":"2006-01-01","include_lower":false}}}}}}
unix and (filter.date > 2006-01-01 and filter.date > 2007-01-01)|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"range":{"date":{"from":"2006-01-01","include_lower":false}}}}}}
unix and dc.date within "2006 2007"|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"range":{"dc.date":{"from":"2006","to":"2007","include_lower":true,"include_upper":true}}}]}}}
unix and dc.date within "2006-01-01 2007-01-01"|{"from":0,"size":10,"query":{"bool":{"must":[{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},{"range":{"dc.date":{"from":"2006-01-01","to":"2007-01-01","include_lower":true,"include_upper":true}}}]}}}
unix and filter.date within "2006-01-01 2007-01-01"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"range":{"date":{"from":"2006-01-01","to":"2007-01-01","include_lower":true,"include_upper":true}}}}}}
dc.title = "unix" and filter.creator = "wolfinger"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"creator":"wolfinger"}}}}}
dc.title = "unix" and filter.creator = "wolfinger" or filter.creator = "meyer"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"should":{"bool":{"must":{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}}}}},"filter":{"bool":{"should":[{"term":{"creator":"wolfinger"}},{"term":{"creator":"meyer"}}]}}}}}
dc.title = "unix" and (filter.creator = "wolfinger" and filter.subject= Computer)|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"bool":{"must":[{"term":{"creator":"wolfinger"}},{"term":{"subject":"Computer"}}]}}}}}
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "meyer" ) and filter.subject = "computer"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"bool":{"must":[{"bool":{"should":[{"simple_query_string":{"query":"\"unix\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"linux\"","fields":["dc.title"],"analyze_wildcard":true,"default_operator":"and"}}]}},{"bool":{"must":[{"simple_query_string":{"query":"\"wolfinger\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}},{"simple_query_string":{"query":"\"meyer\"","fields":["dc.creator"],"analyze_wildcard":true,"default_operator":"and"}}]}}]}}}},"filter":{"term":{"subject":"computer"}}}}}
test and (filter.creator = "a" and filter.subject = "b")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"bool":{"must":[{"term":{"creator":"a"}},{"term":{"subject":"b"}}]}}}}}
test and filter.creator = "a" or filter.subject = "b"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"should":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}},"filter":{"bool":{"should":[{"term":{"creator":"a"}},{"term":{"subject":"b"}}]}}}}}
test and filter.creator = "smith"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"creator":"smith"}}}}}
test and filter.creator = "smith" or filter.creator = "jones"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"should":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}},"filter":{"bool":{"should":[{"term":{"creator":"smith"}},{"term":{"creator":"jones"}}]}}}}}
test and (filter.creator = "smith" and filter.creator = "jones")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"creator":"smith"}}}}}
test or filter.creator = "smith" and filter.creator = "jones"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"bool":{"should":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}},"filter":{"bool":{"should":[{"term":{"creator":"jones"}},{"term":{"creator":"smith"}}]}}}}}
test or (filter.creator = "smith" and filter.creator = "jones")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"should":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"creator":"smith"}}}}}
test and (filter.creator = "smith" or filter.creator = "jones")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"bool":{"should":[{"term":{"creator":"smith"}},{"term":{"creator":"jones"}}]}}}}}
test or (filter.creator = "smith" or filter.creator = "jones")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"should":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"bool":{"should":[{"term":{"creator":"smith"}},{"term":{"creator":"jones"}}]}}}}}
test and (filter.creator = "smith" or filter.creator = "jones" and filter.subject = "unix")|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"bool":{"should":[{"bool":{"must":[{"term":{"creator":"smith"}},{"term":{"subject":"unix"}}]}},{"term":{"creator":"jones"}}]}}}}}
structure AND filter.creator="smith"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"structure","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"creator":"smith"}}}}}
structure AND filter.subject="data"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"structure","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"subject":"data"}}}}}
structure AND filter.date="2003"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"structure","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"date":"2003"}}}}}
pädagogik AND filter.taxonomy="0/24/*"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"pädagogik","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"taxonomy":"0/24/"}}}}}
pädagogik AND filter.taxonomy="0/24/313/*"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"pädagogik","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"taxonomy":"0/24/313/"}}}}}
pädagogik AND filter.taxonomy="0/24/313/21/*"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"pädagogik","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"term":{"taxonomy":"0/24/313/21/"}}}}}
linux and filter.creator <> "Wolfinger"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"linux","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"not":{"term":{"creator":"Wolfinger"}}}}}}
unix and option.offset = 10 and option.length = 20|{"from":0,"size":10,"query":{"bool":{"must":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}}}
test and option.length = 1 and option.length = 2 and option.length = 3|{"from":0,"size":10,"query":{"bool":{"must":{"bool":{"must":{"bool":{"must":{"simple_query_string":{"query":"test","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}}}}}
unix sortby date|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},"sort":[{"date":{"unmapped_type":"string","missing":"_last"}}]}
unix sortby date/sort.descending|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},"sort":[{"date":{"order":"desc","unmapped_type":"string","missing":"_last"}}]}
unix sortby date/sort.descending geo/sort.ascending|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},"sort":[{"date":{"order":"desc","unmapped_type":"string","missing":"_last"}}]}
unix sortby geo/sort.ascending/sort.unit=km/sort.lat=50.9415016174/sort.lon=6.95853996277|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},"sort":[{"geo":{"order":"asc","sort.unit":"km","sort.lat":"50.9415016174","sort.lon":"6.95853996277","unmapped_type":"string","missing":"_last"}}]}
unix sortby geo/sort.ascending/sort.unit=km/sort.center="(50.9415016174,6.95853996277)"|{"from":0,"size":10,"query":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}},"sort":[{"geo":{"order":"asc","sort.unit":"km","sort.center":"\"(50.9415016174,6.95853996277)\"","unmapped_type":"string","missing":"_last"}}]}
bib.namePersonal = meier|{"from":0,"size":10,"query":{"simple_query_string":{"query":"meier","fields":["bib.namePersonal"],"analyze_wildcard":true,"default_operator":"and"}}}
unix and filter.location any "DE-929 DE-107 DE-Zw1"|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"or":[{"term":{"location":"DE-929 DE-107 DE-Zw1"}}]}}}}
unix and filter.location any "DE-929 DE-107 DE-Zw1" sortby date/sort.descending|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"filter":{"or":[{"term":{"location":"DE-929 DE-107 DE-Zw1"}}]}}},"sort":[{"date":{"order":"desc","unmapped_type":"string","missing":"_last"}}]}
unix and option.offset = 10 and option.length = 20 and filter.location any "DE-929 DE-107 DE-Zw1" sortby date/sort.descending|{"from":0,"size":10,"query":{"filtered":{"query":{"bool":{"must":{"bool":{"must":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}}}},"filter":{"or":[{"term":{"location":"DE-929 DE-107 DE-Zw1"}}]}}},"sort":[{"date":{"order":"desc","unmapped_type":"string","missing":"_last"}}]}
unix and facet.creator = "on"|{"from":0,"size":10,"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"aggregations":{"myfacet":"myvalue"}}
unix and facet.creator = "off"|{"from":0,"size":10,"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"aggregations":{"myfacet":"myvalue"}}
unix and facet.creator = "on" and facet.subject = "on" and facet.date = "off"|{"from":0,"size":10,"query":{"bool":{"must":{"bool":{"must":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}}}}}},"aggregations":{"myfacet":"myvalue"}}
unix and facet.date = on|{"from":0,"size":10,"query":{"bool":{"must":{"simple_query_string":{"query":"unix","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}},"aggregations":{"myfacet":"myvalue"}}
(cql.allIndexes = "")|{"from":0,"size":10,"query":{"simple_query_string":{"query":"\"\"","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}
cql.allIndexes all 3125294126|{"from":0,"size":10,"query":{"simple_query_string":{"query":"3125294126","fields":["cql.allIndexes"],"analyze_wildcard":true,"default_operator":"and"}}}


@@ -0,0 +1,141 @@
"to be or not to be"
publicationYear < 1980
lengthOfFemur > 2.4
bioMass >= 100
id = 12345678
id = 8a666b7e-6597-3cfb-b478-313cc3c25011
contentid = "0a1af248-7339-3b59-bc07-3a460275456f"
isbn = "0818631678"
title = "duck" and author = "sanderson"
unix
financing
"Christine Wolfinger"
"der die das"
1234
"1234"
1.234
"1.234"
"unix AND wolfinger"
"to be or not to be"
"not macht erfinderisch"
"to be or not to be"
unix$
"^linux"
finan*
finan?
finan*ng
finan?ier?ng
title = "duck"
title = "Dinosaur Systematics"
title <> linux
cql.resultSetId = HT000011990
cql.allRecords = 2
cql.allRecords = 1 NOT title = fish|cql.allRecords = 1 not title = fish
title any "unix linux"
title all "unix linux"
title all "unix 'linux' test"
title all "linux \"pinguin's best friend\" unix"
dc.title adj "lord of the rings"
anywhere = "linux unix \"grundkurs für einsteiger\""
dc.date=2003|dc.date = 2003
dc.date="2003"|dc.date = "2003"
dc.creator=smith|dc.creator = smith
dc.title=financing|dc.title = financing
dc.subject=financing|dc.subject = financing
"feathered dinosaur" and (yixian or jehol)
(a or b) and (c or d)
unix AND wolfinger|unix and wolfinger
"keine angst" AND unix|"keine angst" and unix
unix and 2012
dc.title=unix or wolfinger|dc.title = unix or wolfinger
(dc.title = unix or dc.date = 2003) and ( dc.creator = wolfinger and dc.creator = christine or dc.creator = maier )|(dc.title = unix or dc.date = 2003) and (dc.creator = wolfinger and dc.creator = christine or dc.creator = maier)
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "christine" )|(dc.title = "unix" or dc.title = "linux") and (dc.creator = "wolfinger" and dc.creator = "christine")
financing AND success|financing and success
financing OR monetary|financing or monetary
financing NOT success|financing not success
(financing AND monetary) OR success|(financing and monetary) or success
financing AND (monetary OR success)|financing and (monetary or success)
"financing constraints" OR success|"financing constraints" or success
"financing constraints" NOT model|"financing constraints" not model
("financing constraints" AND model) OR success|("financing constraints" and model) or success
"financing constraints" AND (model OR success)|"financing constraints" and (model or success)
dinosaur or bird
dino and "eiszeit"
dinosaur not reptile
date = 2007-09-30 or date = "2007-09-30T12:34:56"
dinosaur and bird or dinobird
(bird or dinosaur) and (feathers or scales)
linux and creator = wolfinger
dc.title=linux and dc.title = unix|dc.title = linux and dc.title = unix
dc.title = unix and dc.date = 2000
dc.title = "unix" and dc.creator = "wolfinger"
dc.title = "unix" or dc.creator = "wolfinger"
dc.title = "unix" and dc.creator = "wolfinger" and dc.creator = "christine"
dc.title = "unix" and ( dc.creator = "wolfinger" or dc.creator = "meyer" )|dc.title = "unix" and (dc.creator = "wolfinger" or dc.creator = "meyer")
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "meyer" )|(dc.title = "unix" or dc.title = "linux") and (dc.creator = "wolfinger" and dc.creator = "meyer")
dc.title = "foo" and (dc.creator = "smith" or dc.creator = "jones")
dc.creator = "smith" and dc.creator = "jones"
dc.date = 2007-09-30 or dc.date = "2007-09-30T12:34:56"
identifier = 0783923126590
identifier = "9783923126590"
identifier = "9783923126590*"
dc.identifier=/bib.identifierAuthority=isbn "0201563177"|dc.identifier =/bib.identifierAuthority=isbn "0201563177"
dc.identifier =/bib.identifierAuthority=isbn "0201563177"|dc.identifier =/bib.identifierAuthority=isbn "0201563177"
dc.identifier =/bib.identifierAuthority=isbn "0201563177" and dc.title=unix sortby date|dc.identifier =/bib.identifierAuthority=isbn "0201563177" and dc.title = unix sortby date
dc.date > 2007-09-30 and dc.date < "2007-10-30T12:34:56"
date > 2007-01-01
dc.date <= 2006-07-01
dc.date >= 2005-02-31
dc.date within "2006-01-01 2007-01-01"
dc.date > 2011
dc.date = "> 2003"
dc.date = "20012010"
test and filter.collection = "test"|test
dc.title = test and filter.collection = "test"|dc.title = test
(dc.title = "unix" or dc.title = "linux") and ( dc.creator = "wolfinger" and dc.creator = "meyer" ) and filter.subject = "computer"|(dc.title = "unix" or dc.title = "linux") and (dc.creator = "wolfinger" and dc.creator = "meyer")
dc.title = "unix" and filter.creator = "wolfinger"|dc.title = "unix"
dc.title = "unix" and filter.creator = "wolfinger" or filter.creator = "meyer"|dc.title = "unix"
dc.title = "unix" and (filter.creator = "wolfinger" and filter.subject= Computer)|dc.title = "unix"
unix and filter.date > 2006-01-01|unix
unix and (filter.date > 2006-01-01 and filter.date > 2007-01-01)|unix
unix and filter.date within "2006-01-01 2007-01-01"|unix
unix and filter.collection = "info:sid/a.b.c.d:module"|unix
unix and filter.collection = "info:sid/a.b.c.d:module" or filter.collection = "info:sid/e.f.g.h:module"|unix
unix and (filter.collection = "info:sid/a.b.c.d:module" and filter.creator ="Wolfinger, Christine")|unix
test and filter.collection = "test"|test
test and (filter.creator = "a" and filter.subject = "b")|test
test and filter.creator = "a" or filter.subject = "b"|test
test and filter.creator = "smith"|test
test and (filter.creator = "jones" and filter.collection = "test")|test
test and filter.creator = "smith" or filter.creator = "jones"|test
test and (filter.creator = "smith" and filter.creator = "jones")|test
test or filter.creator = "smith" and filter.creator = "jones"|test
test or (filter.creator = "smith" and filter.creator = "jones")|test
test and (filter.creator = "smith" or filter.creator = "jones")|test
test or (filter.creator = "smith" or filter.creator = "jones")|test
test and (filter.creator = "smith" or filter.creator = "jones" and filter.subject = "unix")|test
structure AND filter.creator="smith"|structure
structure AND filter.subject="data"|structure
structure AND filter.date="2003"|structure
pädagogik AND filter.taxonomy="0/24/*"|pädagogik
pädagogik AND filter.taxonomy="0/24/313/*"|pädagogik
pädagogik AND filter.taxonomy="0/24/313/21/*"|pädagogik
linux and filter.creator <> "Wolfinger"|linux
unix and option.offset = 10 and option.length = 20|unix
test and option.length = 1 and option.length = 2 and option.length = 3|test
bib.namePersonal = meier
unix sortby date/sort.descending
unix sortby date/sort.descending geo/sort.ascending
unix sortby geo/sort.ascending/sort.unit=km/sort.lat=50.9415016174/sort.lon=6.95853996277
unix sortby geo/sort.ascending/sort.unit=km/sort.center="(50.9415016174,6.95853996277)"
unix and filter.location any "DE-929 DE-107 DE-Zw1"|unix
unix and filter.location any "DE-929 DE-107 DE-Zw1" sortby date/sort.descending|unix sortby date/sort.descending
unix and option.offset = 10 and option.length = 20 and filter.location any "DE-929 DE-107 DE-Zw1" sortby date/sort.descending|unix sortby date/sort.descending
unix and facet.dc.creator = "on"|unix
unix and facet.dc.creator = "off"|unix
unix and facet.dc.creator = "on" and facet.dc.subject = "on" and facet.dc.date = "off"|unix
unix and facet.dc.date = on|unix
unix and facet.dc.creator = "on" and facet.dc.subject = "on" and facet.dc.subject = "buckets=10"|unix
unix and facet.dc.date = "on" and facet.dc.subject = "on" and facet.dc.subject = "buckets=20"|unix
unix and facet.dc.creator = "on" and facet.dc.subject = "on" and facet.dc.subject = "buckets=20"|unix
cql.allIndexes all "linux;"