Add deprecation info check for monitoring exporter password #73742

Closed
52 commits
aab1ace
Add deprecation info check for monitoring exporter password
danhermann Jun 3, 2021
3750341
unused imports
danhermann Jun 3, 2021
dae7f54
Merge branch '7.x' into 7x_deprecation_info_for_auth_password
elasticmachine Jun 3, 2021
8076e4b
make it actually work with affix settings
danhermann Jun 3, 2021
7a155b8
fix test
danhermann Jun 4, 2021
3cba6f4
Merge branch '7.x' into 7x_deprecation_info_for_auth_password
danhermann Jun 11, 2021
b64ede5
Merge branch '7.x' into 7x_deprecation_info_for_auth_password
elasticmachine Jun 11, 2021
17ae565
Speed up Slow FlushIT (#74013) (#74024)
original-brownbear Jun 13, 2021
67610e8
Remove S3 Eventual Consistency Related Tests (#74015) (#74043)
original-brownbear Jun 13, 2021
094eb11
Fix MultiVersionRepositoryAccessIT to Account for Plugins (#71809) (#…
original-brownbear Jun 13, 2021
b2c17c6
Simplify Blobstore Consistency Check in Tests (#73992) (#74045)
original-brownbear Jun 13, 2021
92abe73
Speed up Maps.copyMapWithAddedEntry to Speed up ITs (#73308) (#74046)
original-brownbear Jun 13, 2021
556dbee
Make Parsing SnapshotInfo more Efficient (#74005) (#74047)
original-brownbear Jun 13, 2021
a9b782c
Dry up HTTP Smoke Tests around Snapshots (#73962) (#74048)
original-brownbear Jun 13, 2021
72fac01
Refactor RestoreService Restore Path (#73258) (#74049)
original-brownbear Jun 14, 2021
58d7b8d
Deserialize BlobStore Metadata Files in a Streaming Manner (#73149) (…
original-brownbear Jun 14, 2021
0989f88
Log at DEBUG only on disconnect during cancellation (#74042)
DaveCTurner Jun 14, 2021
8418e0c
Move get-aliases handling onto management thread (#74053)
DaveCTurner Jun 14, 2021
5690fd3
Fix o.e.c.network split package in xpack core (#73973)
DaveCTurner Jun 14, 2021
5d14708
Adjust get alias api for write data streams (#74055)
martijnvg Jun 14, 2021
70c9b74
RestToXContentListener extends RestBuilderListener (#74054)
DaveCTurner Jun 14, 2021
5483bf2
Allow some Repository Settings to be Updated Dynamically (#72543) (#7…
original-brownbear Jun 14, 2021
d697f01
Fix up o.e.snapshots split package in xpack-core (#74056)
DaveCTurner Jun 14, 2021
468b15e
[ML][HLRC] adds new running_state field to datafeed stats (#73926) (#…
benwtrent Jun 14, 2021
0bf1d79
Fix up frozen indices split package (#74066)
DaveCTurner Jun 14, 2021
9de57b7
Fix highlighting for match_phrase_prefix query inside nested (#73775)…
Jun 14, 2021
91a9bb7
Fix Broken MultiVersionRepositoryAccessIT (#74074)
original-brownbear Jun 14, 2021
44b537e
PR 73058 contains a bug when merging deeply-nested mappers which
romseygeek Jun 14, 2021
7468045
[DOCS] Add missing comma (#73577) (#74077)
jrodewig Jun 14, 2021
10c5c06
[DOCS] Add Swift client to community clients (#74075) (#74079)
jrodewig Jun 14, 2021
b2feedf
[Docs] Update cross-document links to Kibana Alerting docs (#74034) (…
ymao1 Jun 14, 2021
9d90c7d
[ML] renamed DatafeedManager to DatafeedRunner (#74082) (#74090)
benwtrent Jun 14, 2021
f61cda2
Release notes for v7.13.2 (#74067) (#74097)
jrodewig Jun 14, 2021
f79102e
Bump version after 7.13.2 release
danhermann Jun 14, 2021
b488837
[DOCS] Remove 7.13.2 coming tag (#74030) (#74098)
jrodewig Jun 14, 2021
6c3e5a0
[DOCS] Fix typo in CCR connect example (#74100) (#74102)
jrodewig Jun 14, 2021
a84190c
Upgrade commons-math3 library and fix license and notice files (#7403…
imotov Jun 14, 2021
a45598a
ApiKeyAuthCache now expires after access instead of write (#73982) (#…
ywangd Jun 15, 2021
92f7c62
[7.x][ML] Reset anomaly detection job API (#73908) (#74093)
dimitris-athanasiou Jun 15, 2021
92634cb
Tidy up deprecation code. (#74095)
martijnvg Jun 15, 2021
4ed2ec9
Add a test for parsing doc with a flattened field (#74112)
javanna Jun 15, 2021
59a64a0
Deprecate realm names with a leading underscore (#73366) (#73319)
ywangd Jun 15, 2021
1f2d997
[DOCS] Clarify criteria for restore completion (#74094) (#74123)
jrodewig Jun 15, 2021
56aeb05
Recycle buffers used for file-based recovery (#74117)
DaveCTurner Jun 15, 2021
917154b
[DOCS] Note ESS must use custom bundles for custom GeoIP database fil…
jrodewig Jun 15, 2021
09fa6fa
[DOCS] Service account edits (#73732) (#74127)
Jun 15, 2021
4aa4c87
Mute MlDistributedFailureIT failing test (#74133) (#74134)
not-napoleon Jun 15, 2021
af4813b
[7.x][Transform] optmize histogam group_by change detection (#74031) …
Jun 15, 2021
16d2142
[DOCS] Change `multi field` to `multi-field`
jrodewig Jun 15, 2021
1de7a4a
[7.x] Fix mapping error to indicate values field (#74132) (#74137)
benwtrent Jun 15, 2021
2e585a8
Fix bug when formatting epoch dates (#73955) (#74139)
not-napoleon Jun 15, 2021
6ee9e57
Merge branch '7x_deprecation_info_for_auth_password' of https://githu…
danhermann Jun 15, 2021
1 change: 1 addition & 0 deletions .ci/bwcVersions
@@ -84,3 +84,4 @@ BWC_VERSION:
- "7.13.0"
- "7.13.1"
- "7.13.2"
- "7.13.3"
@@ -59,7 +59,7 @@ public List<DeprecationIssue> getMlSettingsIssues() {

private static List<DeprecationIssue> parseDeprecationIssues(XContentParser parser) throws IOException {
List<DeprecationIssue> issues = new ArrayList<>();
XContentParser.Token token = null;
XContentParser.Token token;
while ((token = parser.nextToken()) != XContentParser.Token.END_ARRAY) {
if (token == XContentParser.Token.START_OBJECT) {
issues.add(DeprecationIssue.PARSER.parse(parser, null));
@@ -116,8 +116,7 @@ public int hashCode() {

@Override
public String toString() {
return clusterSettingsIssues.toString() + ":" + nodeSettingsIssues.toString() + ":" + indexSettingsIssues.toString() +
":" + mlSettingsIssues.toString();
return clusterSettingsIssues + ":" + nodeSettingsIssues + ":" + indexSettingsIssues + ":" + mlSettingsIssues;
}

/**
@@ -156,10 +155,10 @@ public String toString() {
}
}

private Level level;
private String message;
private String url;
private String details;
private final Level level;
private final String message;
private final String url;
private final String details;

public DeprecationIssue(Level level, String message, String url, @Nullable String details) {
this.level = level;
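The `toString()` simplification in the hunk above is safe because Java's `+` operator performs string conversion via `String.valueOf`, which calls `toString()` on non-null references and yields `"null"` otherwise, so the explicit calls were redundant (and the implicit form is additionally null-safe). A minimal standalone sketch of this, with hypothetical names:

```java
public class ToStringConcat {
    // Joins two values with a colon, calling toString() explicitly (NPEs on null).
    static String joinExplicit(Object a, Object b) {
        return a.toString() + ":" + b.toString();
    }

    // Same result for non-null inputs: + applies String.valueOf, which is null-safe.
    static String joinImplicit(Object a, Object b) {
        return a + ":" + b;
    }

    public static void main(String[] args) {
        java.util.List<String> issues = java.util.List.of("a", "b");
        System.out.println(joinImplicit(issues, issues)); // [a, b]:[a, b]
        System.out.println(joinImplicit(null, "end"));    // null:end (no NPE)
    }
}
```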
@@ -32,10 +32,13 @@ public class DatafeedStats implements ToXContentObject {
private final String assignmentExplanation;
@Nullable
private final DatafeedTimingStats timingStats;
@Nullable
private final RunningState runningState;

public static final ParseField ASSIGNMENT_EXPLANATION = new ParseField("assignment_explanation");
public static final ParseField NODE = new ParseField("node");
public static final ParseField TIMING_STATS = new ParseField("timing_stats");
public static final ParseField RUNNING_STATE = new ParseField("running_state");

public static final ConstructingObjectParser<DatafeedStats, Void> PARSER = new ConstructingObjectParser<>("datafeed_stats",
true,
@@ -45,7 +48,8 @@ public class DatafeedStats implements ToXContentObject {
NodeAttributes nodeAttributes = (NodeAttributes)a[2];
String assignmentExplanation = (String)a[3];
DatafeedTimingStats timingStats = (DatafeedTimingStats)a[4];
return new DatafeedStats(datafeedId, datafeedState, nodeAttributes, assignmentExplanation, timingStats);
RunningState runningState = (RunningState) a[5];
return new DatafeedStats(datafeedId, datafeedState, nodeAttributes, assignmentExplanation, timingStats, runningState);
} );

static {
@@ -54,15 +58,21 @@ public class DatafeedStats implements ToXContentObject {
PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), NodeAttributes.PARSER, NODE);
PARSER.declareString(ConstructingObjectParser.optionalConstructorArg(), ASSIGNMENT_EXPLANATION);
PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), DatafeedTimingStats.PARSER, TIMING_STATS);
PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), RunningState.PARSER, RUNNING_STATE);
}

public DatafeedStats(String datafeedId, DatafeedState datafeedState, @Nullable NodeAttributes node,
@Nullable String assignmentExplanation, @Nullable DatafeedTimingStats timingStats) {
public DatafeedStats(String datafeedId,
DatafeedState datafeedState,
@Nullable NodeAttributes node,
@Nullable String assignmentExplanation,
@Nullable DatafeedTimingStats timingStats,
@Nullable RunningState runningState) {
this.datafeedId = Objects.requireNonNull(datafeedId);
this.datafeedState = Objects.requireNonNull(datafeedState);
this.node = node;
this.assignmentExplanation = assignmentExplanation;
this.timingStats = timingStats;
this.runningState = runningState;
}

public String getDatafeedId() {
@@ -85,6 +95,10 @@ public DatafeedTimingStats getDatafeedTimingStats() {
return timingStats;
}

public RunningState getRunningState() {
return runningState;
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException {
builder.startObject();
@@ -112,13 +126,16 @@ public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params par
if (timingStats != null) {
builder.field(TIMING_STATS.getPreferredName(), timingStats);
}
if (runningState != null) {
builder.field(RUNNING_STATE.getPreferredName(), runningState);
}
builder.endObject();
return builder;
}

@Override
public int hashCode() {
return Objects.hash(datafeedId, datafeedState.toString(), node, assignmentExplanation, timingStats);
return Objects.hash(datafeedId, datafeedState.toString(), node, assignmentExplanation, timingStats, runningState);
}

@Override
@@ -134,6 +151,7 @@ public boolean equals(Object obj) {
Objects.equals(this.datafeedState, other.datafeedState) &&
Objects.equals(this.node, other.node) &&
Objects.equals(this.assignmentExplanation, other.assignmentExplanation) &&
Objects.equals(this.runningState, other.runningState) &&
Objects.equals(this.timingStats, other.timingStats);
}
}
@@ -0,0 +1,85 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/

package org.elasticsearch.client.ml.datafeed;

import org.elasticsearch.common.xcontent.ConstructingObjectParser;
import org.elasticsearch.common.xcontent.ParseField;
import org.elasticsearch.common.xcontent.ToXContentObject;
import org.elasticsearch.common.xcontent.XContentBuilder;

import java.io.IOException;
import java.util.Objects;

public class RunningState implements ToXContentObject {

private static final ParseField REAL_TIME_CONFIGURED = new ParseField("real_time_configured");
private static final ParseField REAL_TIME_RUNNING = new ParseField("real_time_running");

public static final ConstructingObjectParser<RunningState, Void> PARSER = new ConstructingObjectParser<>(
"datafeed_running_state",
true,
a -> new RunningState((Boolean)a[0], (Boolean)a[1])
);

static {
PARSER.declareBoolean(ConstructingObjectParser.constructorArg(), REAL_TIME_CONFIGURED);
PARSER.declareBoolean(ConstructingObjectParser.constructorArg(), REAL_TIME_RUNNING);
}

// Is the datafeed a "realtime" datafeed, meaning it was started without an end_time
private final boolean realTimeConfigured;
    // Has reading historical data finished, and are we now running on "real-time" data?
private final boolean realTimeRunning;

public RunningState(boolean realTimeConfigured, boolean realTimeRunning) {
this.realTimeConfigured = realTimeConfigured;
this.realTimeRunning = realTimeRunning;
}

/**
* Indicates if the datafeed is configured to run in real time
*
* @return true if the datafeed is configured to run in real time.
*/
public boolean isRealTimeConfigured() {
return realTimeConfigured;
}

/**
* Indicates if the datafeed has processed all historical data available at the start time and is now processing "real-time" data.
* @return true if the datafeed is now running in real-time
*/
public boolean isRealTimeRunning() {
return realTimeRunning;
}

@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
RunningState that = (RunningState) o;
return realTimeConfigured == that.realTimeConfigured && realTimeRunning == that.realTimeRunning;
}

@Override
public int hashCode() {
return Objects.hash(realTimeConfigured, realTimeRunning);
}

@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.startObject();
builder.field(REAL_TIME_CONFIGURED.getPreferredName(), realTimeConfigured);
builder.field(REAL_TIME_RUNNING.getPreferredName(), realTimeRunning);
builder.endObject();
return builder;
}


}
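The new `RunningState` class above follows the standard immutable value-object pattern: final fields, `equals`/`hashCode` over both flags, and JSON output via `XContentBuilder`. A self-contained sketch of that pattern outside the Elasticsearch codebase (class name and the hand-rolled JSON rendering are hypothetical stand-ins for the real `ToXContentObject` machinery):

```java
public class RunningStateSketch {
    private final boolean realTimeConfigured; // started without an end_time
    private final boolean realTimeRunning;    // finished historical data, now on real-time data

    public RunningStateSketch(boolean realTimeConfigured, boolean realTimeRunning) {
        this.realTimeConfigured = realTimeConfigured;
        this.realTimeRunning = realTimeRunning;
    }

    // Stand-in for toXContent(): renders the same two fields the real class emits.
    public String toJson() {
        return "{\"real_time_configured\":" + realTimeConfigured
            + ",\"real_time_running\":" + realTimeRunning + "}";
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        RunningStateSketch that = (RunningStateSketch) o;
        return realTimeConfigured == that.realTimeConfigured
            && realTimeRunning == that.realTimeRunning;
    }

    @Override
    public int hashCode() {
        return java.util.Objects.hash(realTimeConfigured, realTimeRunning);
    }

    public static void main(String[] args) {
        RunningStateSketch s = new RunningStateSketch(true, false);
        System.out.println(s.toJson()); // {"real_time_configured":true,"real_time_running":false}
    }
}
```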
@@ -814,6 +814,8 @@ public void testGetDatafeedStats() throws Exception {
assertThat(response.datafeedStats(), hasSize(1));
assertThat(response.datafeedStats().get(0).getDatafeedId(), equalTo(datafeedId1));
assertThat(response.datafeedStats().get(0).getDatafeedState().toString(), equalTo(DatafeedState.STARTED.toString()));
assertThat(response.datafeedStats().get(0).getRunningState(), is(notNullValue()));
assertThat(response.datafeedStats().get(0).getRunningState().isRealTimeConfigured(), is(true));

// Test getting all explicitly
request = GetDatafeedStatsRequest.getAllDatafeedStatsRequest();
@@ -40,7 +40,14 @@ public static DatafeedStats createRandomInstance() {
}
String assignmentReason = randomBoolean() ? randomAlphaOfLength(10) : null;
DatafeedTimingStats timingStats = DatafeedTimingStatsTests.createRandomInstance();
return new DatafeedStats(datafeedId, datafeedState, nodeAttributes, assignmentReason, timingStats);
return new DatafeedStats(
datafeedId,
datafeedState,
nodeAttributes,
assignmentReason,
timingStats,
randomBoolean() ? null : RunningStateTests.createRandomInstance()
);
}

@Override
@@ -0,0 +1,36 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
package org.elasticsearch.client.ml.datafeed;

import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.test.AbstractXContentTestCase;

import java.io.IOException;


public class RunningStateTests extends AbstractXContentTestCase<RunningState> {

public static RunningState createRandomInstance() {
return new RunningState(randomBoolean(), randomBoolean());
}

@Override
protected RunningState createTestInstance() {
return createRandomInstance();
}

@Override
protected RunningState doParseInstance(XContentParser parser) throws IOException {
return RunningState.PARSER.apply(parser, null);
}

@Override
protected boolean supportsUnknownFields() {
return true;
}
}
6 changes: 6 additions & 0 deletions docs/community-clients/index.asciidoc
@@ -36,6 +36,7 @@ a number of clients that have been contributed by the community for various lang
* <<rust>>
* <<scala>>
* <<smalltalk>>
* <<swift>>
* <<vertx>>

[[b4j]]
@@ -236,6 +237,11 @@ client].

* https://github.com/newapplesho/elasticsearch-smalltalk[elasticsearch-smalltalk]:
Pharo Smalltalk client for Elasticsearch.

[[swift]]
== Swift
* https://github.com/brokenhandsio/elasticsearch-nio-client[Elasticsearch NIO Client]: a library for
working with Elasticsearch in Swift, built on top of SwiftNIO and Swift Package Manager.

[[vertx]]
== Vert.x
2 changes: 1 addition & 1 deletion docs/reference/ccr/getting-started.asciidoc
@@ -99,7 +99,7 @@ image::images/ccr-tutorial-clusters.png[ClusterA contains the leader index and C
To configure a remote cluster from Stack Management in {kib}:

. Select *Remote Clusters* from the side navigation.
. Specify the IP address or host name of the remote cluster (ClusterB),
. Specify the IP address or host name of the remote cluster (`ClusterA`),
followed by the transport port of the remote cluster (defaults to `9300`). For
example, `192.168.1.1:9300`.

8 changes: 6 additions & 2 deletions docs/reference/ingest/processors/geoip.asciidoc
@@ -12,8 +12,12 @@ The `ingest-geoip` module ships by default with the GeoLite2 City, GeoLite2 Coun
under the CCA-ShareAlike 4.0 license. For more details see, http://dev.maxmind.com/geoip/geoip2/geolite2/

The `geoip` processor can run with other city, country and ASN GeoIP2 databases
from Maxmind. The database files must be copied into the `ingest-geoip` config
directory located at `$ES_CONFIG/ingest-geoip`. Custom database files must be
from Maxmind. On {ess} deployments, custom database files must be uploaded using
a {cloud}/ec-custom-bundles.html[custom bundle]. On self-managed deployments,
custom database files must be copied into the `ingest-geoip` config
directory located at `$ES_CONFIG/ingest-geoip`.

Custom database files must be
stored uncompressed and the extension must be `-City.mmdb`, `-Country.mmdb`, or
`-ASN.mmdb` to indicate the type of the database. The
`database_file` processor option is used to specify the filename of the custom
2 changes: 1 addition & 1 deletion docs/reference/ingest/processors/set.asciidoc
@@ -24,7 +24,7 @@ include::common-options.asciidoc[]
[source,js]
--------------------------------------------------
{
"description" : "sets the value of count to 1"
"description" : "sets the value of count to 1",
"set": {
"field": "count",
"value": 1
2 changes: 1 addition & 1 deletion docs/reference/mapping/dynamic/templates.asciidoc
@@ -172,7 +172,7 @@ PUT my-index-000001/_doc/1
--------------------------------------------------

<1> The `my_integer` field is mapped as an `integer`.
<2> The `my_string` field is mapped as a `text`, with a `keyword` <<multi-fields,multi field>>.
<2> The `my_string` field is mapped as a `text`, with a `keyword` <<multi-fields,multi-field>>.

[[match-unmatch]]
==== `match` and `unmatch`
35 changes: 35 additions & 0 deletions docs/reference/migration/migrate_7_14.asciidoc
@@ -83,3 +83,38 @@ has no effect and you should discontinue its use.
====

// end::notable-breaking-changes[]

[discrete]
[[deprecated-7.14]]
=== Deprecations

The following functionality has been deprecated in {es} 7.14
and will be removed in 8.0.
While this won't have an immediate impact on your applications,
we strongly encourage you take the described steps to update your code
after upgrading to 7.14.

NOTE: Significant changes in behavior are deprecated in a minor release and
the old behavior is supported until the next major release.
To find out if you are using any deprecated functionality,
enable <<deprecation-logging, deprecation logging>>.

[discrete]
[[breaking_714_security_changes]]
==== Security deprecations

[[reserved-prefixed-realm-names]]
.Configuring a realm name with a leading underscore is deprecated.
[%collapsible]
====
*Details* +
Elasticsearch creates "synthetic" realm names on the fly for services like API keys.
These synthetic realm names are prefixed with an underscore.
Currently, user-configured realms can also be given a name with a leading underscore.
This creates confusion since realm names are meant to be unique for a node.

*Impact* +
Configuring a realm name with a leading underscore is deprecated. In a future release of {es}
it will result in an error on startup if any user configured realm has a name
with a leading underscore.
====
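As a hypothetical illustration of the deprecated pattern (realm names invented), a 7.x `elasticsearch.yml` fragment like the following would trigger the deprecation for the first realm, while the second is unaffected:

```yaml
xpack.security.authc.realms.native._internal_users:  # deprecated: leading underscore in realm name
  order: 0
xpack.security.authc.realms.native.internal_users:   # fine: no leading underscore
  order: 1
```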
18 changes: 18 additions & 0 deletions docs/reference/ml/anomaly-detection/apis/get-job.asciidoc
@@ -59,6 +59,24 @@ include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=exclude-generated]
The API returns an array of {anomaly-job} resources. For the full list of
properties, see <<ml-put-job-request-body,create {anomaly-jobs} API>>.

//Begin blocked
`blocked`::
(object) When present, indicates that a task is currently being executed on the
job and is blocking the job from opening.
+
.Properties of `blocked`
[%collapsible%open]
====
`reason`:::
(string) The reason the job is blocked. Values may be `delete`, `reset`, `revert`.
Each value means the corresponding action is being executed.

`task_id`:::
(string) The task id of the blocking action. You can use the <<tasks>> API to
monitor progress.
====
//End blocked

`create_time`::
(string) The time the job was created. For example, `1491007356077`. This
property is informational; you cannot change its value.