
json logging can log duplicate message fields #14335

@robbavey

Description of the problem including expected versus actual behavior:

When using json logging, certain events are logged with two message entries:

{
  "level" : "WARN",
  "loggerName" : "logstash.codecs.jsonlines",
  "timeMillis" : 1657218530687,
  "thread" : "[main]<stdin",
  "logEvent" : {
    "message" : "JSON parse error, original data now in message field",
    "message" : "Unrecognized token 'asd': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (String)\"asd\"; line: 1, column: 4]",
    "exception" : {
      "metaClass" : {
        "metaClass" : {
          "exception" : "LogStash::Json::ParserError",
          "data" : "asd"
        }
      }
    }
  }
}

While a JSON document with duplicate keys is technically well-formed, in practice it is likely to cause problems for consumers of the data: depending on the parser, one of the message values will be silently discarded, or the document will be rejected as invalid.
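As a quick illustration of the consumer-side behavior described above, Ruby's standard JSON parser (like most parsers) keeps only the last value for a duplicated key, so the first message entry is silently lost (a minimal sketch; the shortened strings are placeholders, not the full log output):

```ruby
require 'json'

# A trimmed-down version of the duplicated logEvent above
doc = '{"message":"JSON parse error, original data now in message field",' \
      '"message":"Unrecognized token"}'

parsed = JSON.parse(doc)
# Only the last "message" value survives parsing
parsed["message"]  # => "Unrecognized token"
```

Other consumers (strict validators, some ingest pipelines) may instead reject the document outright, which is the other failure mode noted above.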

Steps to reproduce:

This can be triggered by any logger invocation that includes a value for :message in its details hash/varargs, e.g. this warning entry in the json codec or this helper method in the elasticsearch output.

Each of these values is placed directly into the logEvent object at the top level, leading to duplicate message entries when :message is set.
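The mechanism can be sketched as follows: the log message itself is written as a top-level "message" field, and the details hash is then serialized field-by-field at the same level, so a :message detail produces a second "message" key. This is a hypothetical simulation of that serialization path (`serialize_log_event` is an illustrative helper, not Logstash's actual serializer):

```ruby
require 'json'

# Hypothetical sketch: the log message and every details-hash entry are
# emitted at the top level of logEvent, with no deduplication, so a
# :message detail collides with the log message itself.
def serialize_log_event(message, details)
  fields = [["message", message]] + details.map { |k, v| [k.to_s, v] }
  "{" + fields.map { |k, v| "#{k.to_json}:#{v.to_json}" }.join(",") + "}"
end

event = serialize_log_event(
  "JSON parse error, original data now in message field",
  :message => "Unrecognized token",
  :exception => "LogStash::Json::ParserError"
)
# event now contains two top-level "message" keys
```

Deduplicating or nesting the details hash under its own key would avoid the collision, at the cost of changing the logEvent shape consumers currently see.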
