Filebeat OOMs on very long lines #19500

Closed
@benbuzbee

Description

  • Version: 7.8.0
  • Operating System: Ubuntu 18.04 LTS

Repro

  1. Create this filebeat.yml in a new directory
filebeat.inputs:
  - type: log
    paths:
    - /test/readme.log
    max_bytes: 1024
output.console:
  pretty: true

Note: max_bytes caps the size of a single log line (not the whole file); here it is set to 1024 bytes (1KB)

Run this bash command:

dd if=/dev/zero of=$(pwd)/readme.log bs=1MB count=100 && docker run -d --memory=50MB --memory-swappiness=0 -v $(pwd):$(pwd) -v $(pwd)/readme.log:/test/readme.log docker.elastic.co/beats/filebeat:7.8.0 filebeat --path.config $(pwd) -e

Note: This command creates a 100MB file called "readme.log" containing only zero bytes, so it has no newline at all and filebeat sees it as a single 100MB line. It mounts the file into the container as /test/readme.log and caps the container at 50MB of memory.

Expected: the command runs; filebeat reads up to max_bytes (1KB) of data from readme.log before realizing the line is too long, then seeks to the next newline without buffering the rest of the line in memory.

Actual: filebeat buffers all 100MB of the file in memory while searching for a newline, and the container is OOM-killed:

$ docker inspect 0945c | grep "OOM"
            "OOMKilled": true,

This is pretty bad for us: if we make a mistake and log a huge line, filebeat gets OOM-killed. Filebeat should stop buffering a line once it exceeds max_bytes.
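
For illustration, here is a rough sketch in Go (filebeat's implementation language) of the bounded reading behavior we'd expect. This is hypothetical code, not filebeat's actual harvester; readLineBounded and its parameters are made up for the example:

package main

import (
	"bufio"
	"bytes"
	"fmt"
	"io"
	"strings"
)

// readLineBounded returns at most maxBytes of the next line. Once a line
// exceeds maxBytes, it keeps draining the reader until the next newline
// but discards the excess instead of buffering it, so memory use stays
// O(maxBytes) no matter how long the line is.
func readLineBounded(r *bufio.Reader, maxBytes int) ([]byte, error) {
	line := make([]byte, 0, maxBytes)
	for {
		chunk, err := r.ReadSlice('\n')
		if room := maxBytes - len(line); room > 0 {
			if len(chunk) < room {
				room = len(chunk)
			}
			line = append(line, chunk[:room]...)
		}
		if err == bufio.ErrBufferFull {
			continue // long line: keep skipping bytes, stop growing `line`
		}
		return bytes.TrimRight(line, "\n"), err
	}
}

func main() {
	// A 10MB line followed by a short one; peak memory stays near maxBytes.
	input := strings.Repeat("x", 10*1024*1024) + "\nshort line\n"
	r := bufio.NewReader(strings.NewReader(input))
	for {
		line, err := readLineBounded(r, 1024)
		if len(line) > 0 {
			fmt.Printf("got %d bytes\n", len(line))
		}
		if err != nil {
			if err != io.EOF {
				fmt.Println("read error:", err)
			}
			break
		}
	}
}

Running this prints "got 1024 bytes" for the 10MB line and "got 10 bytes" for the short one; the over-long line is truncated and skipped rather than accumulated, which is the behavior we'd want from filebeat here.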
