Description
- Version: master
- Operating System: Linux
Starting Logstash fails with the following errors:
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to /tmp/logstash-8.0.0-SNAPSHOT/logs which is now configured via log4j2.properties
2019-10-15 15:28:02,288 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby, nashorn, Nashorn, js, JS, JavaScript, javascript, ECMAScript, ecmascript
2019-10-15 15:28:02,306 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby, nashorn, Nashorn, js, JS, JavaScript, javascript, ECMAScript, ecmascript
2019-10-15 15:28:02,311 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby, nashorn, Nashorn, js, JS, JavaScript, javascript, ECMAScript, ecmascript
[2019-10-15T15:28:02,818][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-10-15T15:28:02,845][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.0.0"}
[2019-10-15T15:28:04,747][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2019-10-15T15:28:04,749][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2019-10-15T15:28:05,816][INFO ][org.reflections.Reflections] Reflections took 40 ms to scan 1 urls, producing 20 keys and 40 values
[2019-10-15T15:28:06,773][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@10.0.2.15:9200/]}}
[2019-10-15T15:28:06,819][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@10.0.2.15:9200/"}
[2019-10-15T15:28:06,831][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>8}
[2019-10-15T15:28:06,832][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2019-10-15T15:28:06,892][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.0.2.15:9200"]}
[2019-10-15T15:28:06,982][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-10-15T15:28:07,038][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-10-15T15:28:07,041][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/tmp/logstash-8.0.0-SNAPSHOT/config/logstash.conf"], :thread=>"#<Thread:0x7163843a run>"}
[2019-10-15T15:28:07,077][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>80001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-10-15T15:28:07,447][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/tmp/logstash-8.0.0-SNAPSHOT/data/plugins/inputs/file/.sincedb_f5fdf6ea0ea92860c6a6b2b354bfcbbc", :path=>["/var/log/syslog"]}
[2019-10-15T15:28:07,516][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-10-15T15:28:07,622][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-10-15T15:28:07,639][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-10-15T15:28:08,817][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_monitoring/bulk?system_id=logstash&system_api_version=7&interval=1s", ssl_certificate_verification=>false, password=>, hosts=>[https://10.0.2.15:9200], cacert=>"/tmp/logstash-8.0.0-SNAPSHOT/ca/ca.crt", sniffing=>false, manage_template=>false, id=>"d45100a788cbe2c3fe412cc8f68ac0e3505f79ea96fc58bfc20a69ca84dc9525", user=>"logstash_system", ssl=>true, document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_0bcdb207-4e88-4ca6-adf3-2f6d03df3df3", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-10-15T15:28:08,853][WARN ][logstash.outputs.elasticsearch][.monitoring-logstash] ** WARNING ** Detected UNSAFE options in elasticsearch output configuration!
** WARNING ** You have enabled encryption but DISABLED certificate verification.
** WARNING ** To make sure your data is secure change :ssl_certificate_verification to true
[2019-10-15T15:28:08,917][INFO ][logstash.outputs.elasticsearch][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_system:xxxxxx@10.0.2.15:9200/]}}
[2019-10-15T15:28:08,975][WARN ][logstash.outputs.elasticsearch][.monitoring-logstash] Restored connection to ES instance {:url=>"https://logstash_system:xxxxxx@10.0.2.15:9200/"}
[2019-10-15T15:28:08,989][INFO ][logstash.outputs.elasticsearch][.monitoring-logstash] ES Output version determined {:es_version=>8}
[2019-10-15T15:28:08,991][WARN ][logstash.outputs.elasticsearch][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2019-10-15T15:28:09,049][INFO ][logstash.outputs.elasticsearch][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.0.2.15:9200"]}
[2019-10-15T15:28:09,064][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x7e50c32e run>"}
[2019-10-15T15:28:09,155][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2019-10-15T15:28:09,180][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
[2019-10-15T15:28:09,692][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-10-15T15:28:09,835][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:09,840][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:11,881][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:11,890][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:15,915][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:15,919][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:23,938][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:23,940][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:39,953][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:28:39,957][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:29:11,967][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
[2019-10-15T15:29:11,971][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
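
For reference, a minimal sketch of what /tmp/logstash-8.0.0-SNAPSHOT/config/logstash.conf likely contains, reconstructed from the pipeline.sources, file input, and Elasticsearch output entries in the log above. The user, password, and cacert values are assumptions (credentials are redacted as xxxxxx in the log, and the CA path only appears in the monitoring output), so the actual config may differ:

input {
  file {
    path => "/var/log/syslog"
  }
}

output {
  elasticsearch {
    hosts => ["https://10.0.2.15:9200"]
    user => "elastic"
    password => "CHANGEME"                               # placeholder; real password is redacted in the log
    cacert => "/tmp/logstash-8.0.0-SNAPSHOT/ca/ca.crt"   # assumed; path taken from the monitoring output settings
  }
}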