Question: replace whole record with single field from it #53

Open

nhlushak opened this issue May 22, 2020 · 2 comments

@nhlushak
I want to use this plugin to catch records that fail in the Elasticsearch output (e.g. "rejected by Elasticsearch") and route them to a "dead-letter" output. Those failed records are emitted as fluent.warn events, with the original message stored in the "record" key of the log message. What I want is to take this "record" key and pass it down the Fluentd pipeline as the whole message itself, under a new tag. I did not find any documentation describing this for either the record_transformer plugin or this one.
Example of what is wanted:
Original record:

2020-05-21 10:34:35.497925679 +0000 fluent.warn: 
{
    "error": "#<Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError: 400 - Rejected by Elasticsearch [error type]: mapper_parsing_exception [reason]: 'object mapping for [sample] tried to parse field [sample] as object, but found a concrete value'>",
    "location": null,
    "tag": "test.log",
    "time": 1589206011,
    "record": {
        "foo": "bar",
        "key": "value",
        "sample": [],
        "blah-blah": 133163771
    },
    "message": "dump an error event: error_class=Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError error="400 - Rejected by Elasticsearch [error type]: mapper_parsing_exception [reason]: 'object mapping for [sample] tried to parse field [sample] as object, but found a concrete value'" location=nil tag="test.log" time=1589206011 record={\"foo\"=>\"bar\", \"key\"=>\"value\", \"sample\"=>[], \"blah-blah\"=>133163771}"
}

Modified record:

2020-05-21 10:34:36.497925679 +0000 dead.log: 
{
    "foo": "bar",
    "key": "value",
    "sample": [],
    "blah-blah": 133163771
}
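
The retagging half looks doable with this plugin; something like the rough, untested sketch below should pick out the rejections (the rule key and pattern are just my guess based on the sample above). It is replacing the whole event with the "record" field that I cannot find a way to do:

# Rough sketch only (untested): retag the Elasticsearch rejections,
# keyed on the "error" field of the fluent.warn event. The nested
# "record" key is still left in place, which is the part I am asking about.
<match fluent.warn>
  @type rewrite_tag_filter
  <rule>
    key error
    pattern /Rejected by Elasticsearch/
    tag dead.log
  </rule>
</match>
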
@agung-kargo

I also have a similar question to yours, @NikitaGl:
https://stackoverflow.com/questions/64155725/what-is-the-behavior-of-fluentd-when-it-gets-parser-error

Have you found the answer?

@nhlushak

nhlushak commented Oct 1, 2020

@agung-kargo
Yes, sort of.
I figured this out with a <format> section in the output configuration:

# Retag Fluentd's own warning events whose embedded original tag matches
# the rule below, so they can be routed to the dead-letter output.
<match fluent.warn>
  @type rewrite_tag_filter
  @id rewrite_tag_dead
  <rule>
    # "tag" is the field inside the fluent.warn record that holds the
    # original event tag, not the tag of the warning event itself
    key tag
    pattern /^(logmessage.*)$/
    tag log_out.dead
  </rule>
</match>

<match log_out.dead>
  @type file
  @id out_file_dead
  path /data/output/log_rejected/log-%Y%m%d
  <buffer>
    @type file
    path /data/buffer/file/log_dead/
    flush_mode interval
    retry_type exponential_backoff
    flush_thread_count 1
    flush_interval 15s
    retry_forever true
    retry_max_interval 3m
    chunk_limit_size 10M
    total_limit_size 5G
    overflow_action block
    flush_at_shutdown true   
  </buffer>
  <format>
    # write only the value of the "record" key, one event per line
    @type single_value
    add_newline true
    message_key record
  </format>
</match>

The record key holds the original message that caused Fluentd to throw the error. The only drawback of this method is that the single_value format type is supported only for the file output.
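
If a structured record is preferable to a single line of text, a compromise that should also work with standard plugins (untested sketch) is to drop the wrapper keys with a record_transformer filter placed before the <match log_out.dead> above and switch the format to json; the payload stays nested under "record", but nothing from the original event is lost:

# Untested sketch: strip the wrapper keys added to the fluent.warn event
# so that only the nested "record" payload remains (still under one key).
<filter log_out.dead>
  @type record_transformer
  remove_keys error,location,tag,time,message
</filter>

# ...and in the existing <match log_out.dead>, replace the format section with:
<format>
  @type json
</format>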
