Memory usage for large datasets #276

@guilleva

Hi, I'm facing a memory issue when trying to generate an xlsx file with >600 columns and ~10k rows. Memory usage grows to approximately 3 GB, which causes Heroku to restart the dyno.

I'm wondering if there is a known way to generate it without such heavy memory use.

The way I'm doing it can be represented by this script:

require 'axlsx'

Axlsx::Package.new do |p|
  p.workbook.add_worksheet(:name => "Test") do |sheet|
    11_000.times do
      sheet.add_row ["test data"] * 600
    end
  end
  p.serialize('tmp/test.xlsx')
end
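
For reference, the growth can be watched directly by sampling the process RSS while rows are added. This is a minimal sketch, assuming the get_process_mem gem (not part of axlsx):

require 'axlsx'
require 'get_process_mem'

Axlsx::Package.new do |p|
  p.workbook.add_worksheet(:name => "Test") do |sheet|
    11_000.times do |i|
      sheet.add_row ["test data"] * 600
      # Report resident memory every 1,000 rows to watch the growth
      puts "row #{i}: #{GetProcessMem.new.mb.round} MB" if (i % 1_000).zero?
    end
  end
  p.serialize('tmp/test.xlsx')
end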
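
One possible workaround, since axlsx appears to build the entire workbook in memory before serializing: a streaming writer such as the xlsxtream gem emits each row to the file as it is added, so memory stays roughly constant regardless of row count. A minimal, untested sketch assuming xlsxtream's documented Workbook.open / write_worksheet API (note xlsxtream supports only a subset of axlsx's styling features):

require 'xlsxtream'

# Rows are streamed to disk as they are written,
# instead of being held in an in-memory workbook model.
Xlsxtream::Workbook.open('tmp/test.xlsx') do |xlsx|
  xlsx.write_worksheet 'Test' do |sheet|
    11_000.times do
      sheet << ['test data'] * 600
    end
  end
end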
