Description
Hi everyone! I am facing a problem with Logstash and the Elasticsearch output plugin. Basically, I want to index CouchDB changes using this config:
```
input {
  couchdb_changes {
    db => "database"
    host => "localhost"
    port => 5984
    username => "username"
    password => "password"
    #initial_sequence => 0 # only required for an initial indexing
  }
}

filter {
  mutate {
    add_field => { "action" => "%{[@metadata][action]}" }
  }
  if [action] == 'delete' {
    elasticsearch {
      hosts => ["localhost:9200"]
      query => "_id:%{[@metadata][_id]}"
      fields => ["type", "$doctype"]
      sort => ""
    }
  } else {
    mutate {
      add_field => { "type" => "%{[doc][$doctype]}" } # yes, my docs have a $doctype field to store the type
    }
  }
}

output {
  elasticsearch {
    action => "%{[@metadata][action]}"
    doc_as_upsert => true
    document_id => "%{[@metadata][_id]}"
    #document_type => "%{[@metadata][$doctype]}" # it won't work, why?
    hosts => ["localhost:9200"]
    index => "my_index"
  }
  stdout { codec => rubydebug } # enable this option for debugging
}
```
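To see what the sprintf references (`%{[@metadata][action]}`, `%{[@metadata][_id]}`, `%{[doc][$doctype]}`) actually resolve to, the rubydebug codec can also be told to print `@metadata`. This is only a debugging tweak to the `stdout` output above, not a fix:

```
output {
  # Same stdout output as above, but with @metadata included in the printed
  # event, so the values used for action / document_id / document_type can be checked.
  stdout { codec => rubydebug { metadata => true } }
}
```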
After creating/updating a CouchDB document, the full response from Logstash is:
```
Failed action. {:status=>404,
  :action=>["update",
    {:_id=>"7772e161f5b0e2b1abced1679f00045c", :_index=>"kmfile", :_type=>"kmFile", :_routing=>nil},
    #<LogStash::Event:0xd847ac
      @metadata_accessors=#<LogStash::Util::Accessors:0x6a28cc
        @store={"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106},
        @lut={"[action]"=>[{"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, "action"],
              "[_id]"=>[{"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, "_id"]}>,
      @cancelled=false,
      @data={"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"},
      @metadata={"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106},
      @accessors=#<LogStash::Util::Accessors:0x291625
        @store={"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"},
        @lut={"action"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "action"],
              "[action]"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "action"],
              "[doc][$doctype]"=>[{"$doctype"=>"kmFile"}, "$doctype"],
              "type"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "type"]}>>],
  :response=>{"update"=>{"_index"=>"kmfile", "_type"=>"kmFile", "_id"=>"7772e161f5b0e2b1abced1679f00045c", "status"=>404,
    "error"=>{"type"=>"document_missing_exception", "reason"=>"[kmFile][7772e161f5b0e2b1abced1679f00045c]: document missing", "shard"=>"-1", "index"=>"kmfile"}}},
  :level=>:warn}
```
`$doctype` (with the `$`) is the document type field name I used when I first created the database, and I have not changed it since. I have tried renaming it to "type" and other names without the "$", but no luck.
As theuntergeek points out:
> Elasticsearch is complaining because no document with _id 7772e161f5b0e2b1abced1679f00045c exists. This implies that `doc_as_upsert` isn't working properly.
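One way to narrow this down is to hit Elasticsearch directly, using the index/type/id from the failed action above. This is just a sketch of the checks, not a fix:

```
# 1) Does the document exist at all?
curl -XGET 'http://localhost:9200/kmfile/kmFile/7772e161f5b0e2b1abced1679f00045c?pretty'

# 2) Try the same upsert by hand through the update API, which is what
#    doc_as_upsert is meant to produce; with "doc_as_upsert": true the
#    document should be created instead of returning a 404.
curl -XPOST 'http://localhost:9200/kmfile/kmFile/7772e161f5b0e2b1abced1679f00045c/_update?pretty' -d '{
  "doc": { "type": "kmFile" },
  "doc_as_upsert": true
}'
```

If the manual upsert succeeds, that would point at how the output plugin builds the bulk request rather than at Elasticsearch itself.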
A simple Logstash configuration with the Elasticsearch output works fine:
```
input {
  stdin { }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```
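As a quick sanity check that events from this simple config really reach Elasticsearch (assuming the default `logstash-*` index names, since no `index` is set here):

```
curl -XGET 'http://localhost:9200/logstash-*/_search?q=*&pretty'
```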