
SQL: Update branch to include the fields API master changes #68831


Merged
merged 26 commits on Feb 10, 2021
Changes from all commits (26 commits)
20cab22
Disable BWC for backporting ILM partial searchable snapshot support (…
dakrone Feb 9, 2021
c656159
[DOCS] Expand simple query string query's multi-position token sectio…
jrodewig Feb 9, 2021
f1d8ded
Adjust version serialization and re-enable BWC after backport (#68781)
dakrone Feb 9, 2021
b914994
Fix Javadoc issue for JDKs <15
DaveCTurner Feb 9, 2021
7cce070
Add StepListener#addListener (#68770)
DaveCTurner Feb 10, 2021
a7abc0a
Add more trace logging when installing monitor watches and (#68752)
martijnvg Feb 10, 2021
ab26062
SQL: Enhance error message on filtering check against aggs (#68763)
bpintea Feb 10, 2021
2bc171a
Remove Dead Test Code Branch in SparseFileTrackerTests (#68801)
original-brownbear Feb 10, 2021
bae65dd
build-tools check are fixed on aarch64 (#68630)
breskeby Feb 10, 2021
ee5cc54
QL: "fields" api implementation in QL (#68802)
astefan Feb 10, 2021
71d43b5
Refactor usage of compatible version (#68648)
pgomulka Feb 10, 2021
1e29fb3
Fix testListenersNotifiedOnCorrectThreads (#68805)
DaveCTurner Feb 10, 2021
0c1d799
Fix testDeleteActionDeletesSearchableSnapshot (#68751)
andreidan Feb 10, 2021
114c396
Make GET _cluster/stats cancellable (#68676)
DaveCTurner Feb 10, 2021
3b249fa
Use JNA to Speed up Snapshot Cache File Creation (#68687)
original-brownbear Feb 10, 2021
414fabb
Remove the version 8.0.0 "restriction" (#68822)
astefan Feb 10, 2021
2b95858
SQL: Fix the MINUTE_OF_DAY() function that throws exception when used…
Feb 10, 2021
b7c089a
Add 7.11.1 to BWC versions
pugnascotia Feb 10, 2021
b7d178d
Remove 7.10.3 after 7.11.0 release
pugnascotia Feb 10, 2021
c9a5f43
Merge remote-tracking branch 'upstream/master' into feat/fields_api_u…
bpintea Feb 10, 2021
0b48482
Replace forbidden apis
bpintea Feb 10, 2021
c35eebe
Scripting: capture structured javadoc from stdlib (#68782)
stu-elastic Feb 10, 2021
b61556c
Style fix
bpintea Feb 10, 2021
3c6437f
Add max_single_primary_size to ResizeRequest's toXContent (#68793)
joegallo Feb 10, 2021
6d5ab2d
Reject remounting snapshot of a searchable snapshot (#68816)
DaveCTurner Feb 10, 2021
3f2c6ed
Merge remote-tracking branch 'upstream/master' into feat/fields_api_u…
bpintea Feb 10, 2021
2 changes: 1 addition & 1 deletion .ci/bwcVersions
@@ -28,7 +28,7 @@ BWC_VERSION:
- "7.10.0"
- "7.10.1"
- "7.10.2"
- "7.10.3"
- "7.11.0"
- "7.11.1"
- "7.12.0"
- "8.0.0"
@@ -51,19 +51,19 @@ class DistributionDownloadPluginFuncTest extends AbstractGradleFuncTest {
when:
def guh = new File(testProjectDir.getRoot(), "gradle-user-home").absolutePath;
def runner = gradleRunner('clean', 'setupDistro', '-i', '-g', guh)
def unpackingMessage = "Unpacking elasticsearch-${version}-linux-${Architecture.current().classifier}.tar.gz " +
"using SymbolicLinkPreservingUntarTransform"
def result = withMockedDistributionDownload(version, platform, runner) {
// initial run
def firstRun = build()
assertOutputContains(firstRun.output, "Unpacking elasticsearch-${version}-linux-x86_64.tar.gz " +
"using SymbolicLinkPreservingUntarTransform")
assertOutputContains(firstRun.output, unpackingMessage)
// 2nd invocation
build()
}

then:
result.task(":setupDistro").outcome == TaskOutcome.SUCCESS
assertOutputMissing(result.output, "Unpacking elasticsearch-${version}-linux-x86_64.tar.gz " +
"using SymbolicLinkPreservingUntarTransform")
assertOutputMissing(result.output, unpackingMessage)
}

def "transforms are reused across projects"() {
@@ -100,7 +100,7 @@ class DistributionDownloadPluginFuncTest extends AbstractGradleFuncTest

then:
result.tasks.size() == 3
result.output.count("Unpacking elasticsearch-${version}-linux-x86_64.tar.gz " +
result.output.count("Unpacking elasticsearch-${version}-linux-${Architecture.current().classifier}.tar.gz " +
"using SymbolicLinkPreservingUntarTransform") == 1
}

@@ -29,14 +29,16 @@ class JdkDownloadPluginFuncTest extends AbstractGradleFuncTest {

private static final String OPENJDK_VERSION_OLD = "1+99"
private static final String ADOPT_JDK_VERSION = "12.0.2+10"
private static final String ADOPT_JDK_VERSION_11 = "11.0.10+9"
private static final String ADOPT_JDK_VERSION_15 = "15.0.2+7"
private static final String OPEN_JDK_VERSION = "12.0.1+99@123456789123456789123456789abcde"
private static final String AZUL_AARCH_VERSION = "15.0.1+99@123456789123456789123456789abcde"
private static final Pattern JDK_HOME_LOGLINE = Pattern.compile("JDK HOME: (.*)");

@Unroll
def "jdk #jdkVendor for #platform#suffix are downloaded and extracted"() {
given:
def mockRepoUrl = urlPath(jdkVendor, jdkVersion, platform);
def mockRepoUrl = urlPath(jdkVendor, jdkVersion, platform, arch);
def mockedContent = filebytes(jdkVendor, platform)
buildFile.text = """
plugins {
@@ -70,20 +72,22 @@ class JdkDownloadPluginFuncTest extends AbstractGradleFuncTest {
assertExtraction(result.output, expectedJavaBin);

where:
platform | arch | jdkVendor | jdkVersion | expectedJavaBin | suffix
"linux" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "bin/java" | ""
"linux" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "bin/java" | ""
"linux" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "bin/java" | "(old version)"
"windows" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "bin/java" | ""
"windows" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "bin/java" | ""
"windows" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "bin/java" | "(old version)"
"darwin" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "Contents/Home/bin/java" | ""
"darwin" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "Contents/Home/bin/java" | ""
"darwin" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "Contents/Home/bin/java" | "(old version)"
"mac" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "Contents/Home/bin/java" | ""
"mac" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "Contents/Home/bin/java" | "(old version)"
"darwin" | "aarch64" | VENDOR_AZUL | AZUL_AARCH_VERSION | "Contents/Home/bin/java" | ""
"linux" | "aarch64" | VENDOR_AZUL | AZUL_AARCH_VERSION | "bin/java" | ""
platform | arch | jdkVendor | jdkVersion | expectedJavaBin | suffix
"linux" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "bin/java" | ""
"linux" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "bin/java" | ""
"linux" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "bin/java" | "(old version)"
"windows" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "bin/java" | ""
"windows" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "bin/java" | ""
"windows" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "bin/java" | "(old version)"
"darwin" | "x64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION | "Contents/Home/bin/java" | ""
"darwin" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "Contents/Home/bin/java" | ""
"darwin" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "Contents/Home/bin/java" | "(old version)"
"mac" | "x64" | VENDOR_OPENJDK | OPEN_JDK_VERSION | "Contents/Home/bin/java" | ""
"mac" | "x64" | VENDOR_OPENJDK | OPENJDK_VERSION_OLD | "Contents/Home/bin/java" | "(old version)"
"darwin" | "aarch64" | VENDOR_AZUL | AZUL_AARCH_VERSION | "Contents/Home/bin/java" | ""
"linux" | "aarch64" | VENDOR_AZUL | AZUL_AARCH_VERSION | "bin/java" | ""
"linux" | "aarch64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION_11 | "bin/java" | "(jdk 11)"
"linux" | "aarch64" | VENDOR_ADOPTOPENJDK | ADOPT_JDK_VERSION_15 | "bin/java" | "(jdk 15)"
}

def "transforms are reused across projects"() {
@@ -195,10 +199,13 @@ class JdkDownloadPluginFuncTest extends AbstractGradleFuncTest {
true
}

private static String urlPath(final String vendor, final String version, final String platform) {
private static String urlPath(final String vendor,
final String version,
final String platform,
final String arch = 'x64') {
if (vendor.equals(VENDOR_ADOPTOPENJDK)) {
final String module = isMac(platform) ? "mac" : platform;
return "/jdk-12.0.2+10/" + module + "/x64/jdk/hotspot/normal/adoptopenjdk";
return "/jdk-" + version + "/" + module + "/${arch}/jdk/hotspot/normal/adoptopenjdk";
} else if (vendor.equals(VENDOR_OPENJDK)) {
final String effectivePlatform = isMac(platform) ? "osx" : platform;
final boolean isOld = version.equals(OPENJDK_VERSION_OLD);
@@ -208,7 +215,7 @@
} else if (vendor.equals(VENDOR_AZUL)) {
final String module = isMac(platform) ? "macosx" : platform;
// we only test zulu 15 darwin aarch64 for now
return "/zulu${module.equals('linux') ? '-embedded' : ''}/bin/zulu15.29.15-ca-jdk15.0.2-${module}_aarch64.tar.gz";
return "/zulu${module.equals('linux') ? '-embedded' : ''}/bin/zulu15.29.15-ca-jdk15.0.2-${module}_${arch}.tar.gz";
}
}

@@ -44,8 +44,7 @@ class DistributionDownloadFixture {
private static String urlPath(String version,ElasticsearchDistribution.Platform platform) {
String fileType = ((platform == ElasticsearchDistribution.Platform.LINUX ||
platform == ElasticsearchDistribution.Platform.DARWIN)) ? "tar.gz" : "zip"
String arch = Architecture.current() == Architecture.AARCH64 ? "aarch64" : "x86_64"
"/downloads/elasticsearch/elasticsearch-${version}-${platform}-${arch}.$fileType"
"/downloads/elasticsearch/elasticsearch-${version}-${platform}-${Architecture.current().classifier}.$fileType"
}

private static byte[] filebytes(String urlPath) throws IOException {
@@ -8,14 +8,11 @@

package org.elasticsearch.gradle.internal

import org.elasticsearch.gradle.Architecture
import org.elasticsearch.gradle.VersionProperties
import org.elasticsearch.gradle.fixtures.AbstractGradleFuncTest
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome
import org.junit.Rule
import org.junit.rules.TemporaryFolder

import java.lang.management.ManagementFactory

class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest {

@@ -61,7 +58,7 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest
def result = gradleRunner("setupDistro", '-g', testProjectDir.newFolder('GUH').path).build()

then:
result.task(":distribution:archives:linux-tar:buildExpanded").outcome == TaskOutcome.SUCCESS
result.task(":distribution:archives:${testArchiveProjectName}:buildExpanded").outcome == TaskOutcome.SUCCESS
result.task(":setupDistro").outcome == TaskOutcome.SUCCESS
assertExtractedDistroIsCreated("build/distro", 'current-marker.txt')
}
@@ -133,24 +130,24 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest
apply plugin:'base'

// packed distro
configurations.create("linux-tar")
configurations.create("${testArchiveProjectName}")
tasks.register("buildBwcTask", Tar) {
from('bwc-marker.txt')
archiveExtension = "tar.gz"
compression = Compression.GZIP
}
artifacts {
it.add("linux-tar", buildBwcTask)
it.add("${testArchiveProjectName}", buildBwcTask)
}

// expanded distro
configurations.create("expanded-linux-tar")
configurations.create("expanded-${testArchiveProjectName}")
def expandedTask = tasks.register("buildBwcExpandedTask", Copy) {
from('bwc-marker.txt')
into('build/install/elastic-distro')
}
artifacts {
it.add("expanded-linux-tar", file('build/install')) {
it.add("expanded-${testArchiveProjectName}", file('build/install')) {
builtBy expandedTask
type = 'directory'
}
@@ -160,9 +157,9 @@ class InternalDistributionDownloadPluginFuncTest

private void localDistroSetup() {
settingsFile << """
include ":distribution:archives:linux-tar"
include ":distribution:archives:${testArchiveProjectName}"
"""
def bwcSubProjectFolder = testProjectDir.newFolder("distribution", "archives", "linux-tar")
def bwcSubProjectFolder = testProjectDir.newFolder("distribution", "archives", testArchiveProjectName)
new File(bwcSubProjectFolder, 'current-marker.txt') << "current"
new File(bwcSubProjectFolder, 'build.gradle') << """
import org.gradle.api.internal.artifacts.ArtifactAttributes;
@@ -190,10 +187,12 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest
it.add("extracted", buildExpanded)
}
"""
buildFile << """
"""
}

String getTestArchiveProjectName() {
def archSuffix = Architecture.current() == Architecture.AARCH64 ? '-aarch64' : ''
return "linux${archSuffix}-tar"
}
boolean assertExtractedDistroIsCreated(String relativeDistroPath, String markerFileName) {
File extractedFolder = new File(testProjectDir.root, relativeDistroPath)
assert extractedFolder.exists()
@@ -5,6 +5,6 @@
# in compliance with, at your election, the Elastic License 2.0 or the Server
# Side Public License, v 1.
#
ES_BUILD_JAVA=openjdk12
ES_RUNTIME_JAVA=openjdk12
ES_BUILD_JAVA=openjdk11
ES_RUNTIME_JAVA=openjdk11
GRADLE_TASK=build
@@ -10,8 +10,14 @@

public enum Architecture {

X64,
AARCH64;
X64("x86_64"),
AARCH64("aarch64");

public final String classifier;

Architecture(String classifier) {
this.classifier = classifier;
}

public static Architecture current() {
final String architecture = System.getProperty("os.arch", "");
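
The hunk above gives `Architecture` a `classifier` field ("x86_64" or "aarch64") so callers no longer repeat the same ternary when naming distribution artifacts. A minimal sketch of the pattern; the enum mirrors the change above, while the helper class and method are invented for illustration and are not part of this PR:

[source,java]
----
public enum Architecture {

    X64("x86_64"),
    AARCH64("aarch64");

    public final String classifier;

    Architecture(String classifier) {
        this.classifier = classifier;
    }

    // Simplified for the sketch; the real current() inspects os.arch
    // and rejects unknown values instead of falling back to X64.
    public static Architecture current() {
        return "aarch64".equals(System.getProperty("os.arch", "")) ? AARCH64 : X64;
    }
}

class DistroNames {
    // Before this PR each call site spelled out
    //   Architecture.current() == Architecture.AARCH64 ? "aarch64" : "x86_64"
    // Now the classifier is read directly from the enum constant.
    static String tarballName(String version, String platform) {
        return "elasticsearch-" + version + "-" + platform + "-"
                + Architecture.current().classifier + ".tar.gz";
    }
}
----
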
@@ -166,14 +166,11 @@ private String dependencyNotation(ElasticsearchDistribution distribution) {

Version distroVersion = Version.fromString(distribution.getVersion());
String extension = distribution.getType().toString();
String classifier = ":" + (Architecture.current() == Architecture.AARCH64 ? "aarch64" : "x86_64");
String classifier = ":" + Architecture.current().classifier;
if (distribution.getType() == Type.ARCHIVE) {
extension = distribution.getPlatform() == Platform.WINDOWS ? "zip" : "tar.gz";
if (distroVersion.onOrAfter("7.0.0")) {
classifier = ":"
+ distribution.getPlatform()
+ "-"
+ (Architecture.current() == Architecture.AARCH64 ? "aarch64" : "x86_64");
classifier = ":" + distribution.getPlatform() + "-" + Architecture.current().classifier;
} else {
classifier = "";
}
@@ -1,6 +1,8 @@
package org.elasticsearch.gradle;

import org.elasticsearch.gradle.test.GradleUnitTestCase;
import org.junit.Assume;
import org.junit.BeforeClass;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
@@ -279,6 +281,11 @@ public class BwcVersionsTests extends GradleUnitTestCase {
sampleVersions.put("7.1.0", asList("7_1_0", "7_0_0", "6_7_0", "6_6_1", "6_6_0"));
}

@BeforeClass
public static void setupAll() {
Assume.assumeFalse(Architecture.current() == Architecture.AARCH64);
}

@Test(expected = IllegalArgumentException.class)
public void testExceptionOnEmpty() {
new BwcVersions(asList("foo", "bar"), Version.fromString("7.0.0"));
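
The `@BeforeClass` assumption added to `BwcVersionsTests` skips the class on aarch64 rather than failing it. A small, self-contained illustration of the JUnit 4 pattern; the test class and method names here are invented:

[source,java]
----
import org.junit.Assume;
import org.junit.BeforeClass;
import org.junit.Test;

public class ArchSpecificExampleTests {

    @BeforeClass
    public static void setupAll() {
        // When the assumption fails, JUnit reports the class's tests as
        // skipped instead of failed, so aarch64 CI runs stay green.
        Assume.assumeFalse("aarch64".equals(System.getProperty("os.arch")));
    }

    @Test
    public void runsOnlyOffAarch64() {
        // only exercised when the assumption above holds
    }
}
----
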
4 changes: 4 additions & 0 deletions docs/reference/analysis/token-graphs.asciidoc
@@ -39,6 +39,10 @@ record the `positionLength` for multi-position tokens. This filters include:
* <<analysis-synonym-graph-tokenfilter,`synonym_graph`>>
* <<analysis-word-delimiter-graph-tokenfilter,`word_delimiter_graph`>>

Some tokenizers, such as the
{plugin}/analysis-nori-tokenizer.html[`nori_tokenizer`], also accurately
decompose compound tokens into multi-position tokens.

In the following graph, `domain name system` and its synonym, `dns`, both have a
position of `0`. However, `dns` has a `positionLength` of `3`. Other tokens in
the graph have a default `positionLength` of `1`.
44 changes: 24 additions & 20 deletions docs/reference/query-dsl/simple-query-string-query.asciidoc
@@ -86,9 +86,10 @@ query string into tokens. Defaults to the
`default_field`. If no analyzer is mapped, the index's default analyzer is used.

`auto_generate_synonyms_phrase_query`::
(Optional, Boolean) If `true`, <<query-dsl-match-query-phrase,match phrase>>
queries are automatically created for multi-term synonyms. Defaults to `true`.
See <<simple-query-string-synonyms>> for an example.
(Optional, Boolean) If `true`, the parser creates a
<<query-dsl-match-query-phrase,`match_phrase`>> query for each
<<token-graphs-multi-position-tokens,multi-position token>>. Defaults to `true`.
For examples, see <<simple-query-string-synonyms>>.

`flags`::
(Optional, string) List of enabled operators for the
@@ -273,33 +274,36 @@ GET /_search
<1> The `subject` field is three times as important as the `message` field.

[[simple-query-string-synonyms]]
===== Synonyms
===== Multi-position tokens

The `simple_query_string` query supports multi-terms synonym expansion with the <<analysis-synonym-graph-tokenfilter,
synonym_graph>> token filter. When this filter is used, the parser creates a phrase query for each multi-terms synonyms.
For example, the following synonym: `"ny, new york"` would produce:
By default, the `simple_query_string` query parser creates a
<<query-dsl-match-query-phrase,`match_phrase`>> query for each
<<token-graphs-multi-position-tokens,multi-position token>> in the query string.
For example, the parser creates a `match_phrase` query for the multi-word
synonym `ny, new york`:

`(ny OR ("new york"))`

It is also possible to match multi terms synonyms with conjunctions instead:
To match multi-position tokens with an `AND` conjunction instead, set
`auto_generate_synonyms_phrase_query` to `false`:

[source,console]
--------------------------------------------------
----
GET /_search
{
"query": {
"simple_query_string" : {
"query" : "ny city",
"auto_generate_synonyms_phrase_query" : false
}
}
"query": {
"simple_query_string": {
"query": "ny city",
"auto_generate_synonyms_phrase_query": false
}
}
}
--------------------------------------------------
----

The example above creates a boolean query:
For the above example, the parser creates the following
<<query-dsl-bool-query,`bool`>> query:

`(ny OR (new AND york)) city)`

that matches documents with the term `ny` or the conjunction `new AND york`.
By default the parameter `auto_generate_synonyms_phrase_query` is set to `true`.

This `bool` query matches documents with the term `ny` or the conjunction
`new AND york`.
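
The same switch can be set when building the query programmatically from Java. A rough sketch, assuming the `QueryBuilders`/`SimpleQueryStringBuilder` API; this example is not part of the docs change itself:

[source,java]
----
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.SimpleQueryStringBuilder;

public class SimpleQueryStringExample {

    public static SimpleQueryStringBuilder nyCityQuery() {
        // Default (true): multi-position tokens such as the synonym
        // "ny, new york" expand to match_phrase clauses.
        // false: they expand to a conjunction instead, e.g. (new AND york).
        return QueryBuilders.simpleQueryStringQuery("ny city")
                .autoGenerateSynonymsPhraseQuery(false);
    }
}
----
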
20 changes: 11 additions & 9 deletions docs/reference/sql/endpoints/translate.asciidoc
@@ -22,20 +22,22 @@ Which returns:
--------------------------------------------------
{
"size": 10,
"docvalue_fields": [
"_source": false,
"fields": [
{
"field": "author"
},
{
"field": "name"
},
{
"field": "page_count"
},
{
"field": "release_date",
"format": "epoch_millis"
}
],
"_source": {
"includes": [
"author",
"name",
"page_count"
],
"excludes": []
},
"sort": [
{
"page_count": {
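
For comparison, a rough Java sketch of the request body shown above, assuming the 7.x `SearchSourceBuilder` fields API (`fetchSource`, `fetchField`, `FieldAndFormat`); it covers only the portion of the translated query visible in this excerpt:

[source,java]
----
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.fetch.subphase.FieldAndFormat;

public class TranslatedQuerySketch {

    public static SearchSourceBuilder source() {
        // Mirrors the translated SQL query: _source disabled, columns
        // retrieved through the fields API, dates formatted as epoch_millis.
        return new SearchSourceBuilder()
                .size(10)
                .fetchSource(false)
                .fetchField("author")
                .fetchField("name")
                .fetchField("page_count")
                .fetchField(new FieldAndFormat("release_date", "epoch_millis"));
    }
}
----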