
Exponential growth in precision causing memory usage to spike as of 3.1.2 #222

Closed
@smidwap

Description


I upgraded to BigDecimal 3.1.2 to pull in the fix for #220, a bug that was causing crashes in our Rails app. However, I believe the fix applied in 127a1b5 created the potential for exponential growth in the number of digits a BigDecimal can hold. As a result, certain calculations can eat up all available memory (several gigabytes) and crash the process.

Take the following example, which outputs the number of digits produced by the calculation:

require "bigdecimal/util" # provides BigDecimal#to_digits

Array.new(10) { BigDecimal("1.03") }.inject(1.0) { |result, num| result / num }.to_digits.length

In 3.1.1, this calculation produces 119 digits.

In 3.1.2, this calculation produces 23,033 digits.

Moreover, if Array.new(10) is grown to Array.new(20), 3.1.1 produces 209 digits, whereas 3.1.2 produces 23,592,953 digits.
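Here's a small loop I used to watch the growth step by step (an illustrative sketch; exact digit counts will vary by version). Under 3.1.2 the digit count roughly doubles on every division, which is what makes the growth exponential:

require "bigdecimal/util" # provides BigDecimal#to_digits

result = BigDecimal("1.0")
10.times do |i|
  # Each quotient's precision is derived from the operands' precision,
  # so the digit count compounds on every step.
  result /= BigDecimal("1.03")
  puts "step #{i + 1}: #{result.to_digits.length} digits"
end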

The runaway growth in digits, and the resulting memory spike, can be avoided by rounding the result inside inject's block, like so:

Array.new(10) { BigDecimal("1.03") }.inject(1.0) { |result, num| (result / num).round(3) }.to_digits.length
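Another option (an untested sketch on my part) is to cap the precision of each intermediate quotient with BigDecimal#div, which accepts an explicit count of significant digits, rather than rounding after the fact:

# result.div(num, 20) caps each quotient at 20 significant digits,
# so the digit count stays bounded instead of compounding.
Array.new(10) { BigDecimal("1.03") }.inject(BigDecimal("1")) { |result, num| result.div(num, 20) }.to_digits.length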

But others may run into this same surprise. We've also seen PG::NumericValueOutOfRange exceptions from Postgres since upgrading to 3.1.2, and my hunch is that the same root issue is at play.

Let me know if I can provide any more details.
