
Possible bug: f128 constant only has 16 decimals of precision when printed #3612

Closed
@bdezonia

Description


Using Zig 0.5.0, the following code:

const warn = @import("std").debug.warn;

var z: f128 = 1.738954882048422195749048390299;
warn("{:.34}\n", z);

Shows this:

1.7389548820484222000000000000000000e+00

It looks as if the compile-time constant is 64-bit rather than 128-bit. The documentation implies that float literals are 128-bit.

Is this a bug? Or am I doing something wrong?
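For reference (not part of the original report), the printed digits are consistent with the literal being rounded to IEEE-754 binary64 first, which supports the 64-bit guess. A minimal check in Python, whose float type is binary64:

```python
# Parse the literal from the report as a 64-bit IEEE-754 double
# (Python's built-in float is binary64).
literal = "1.738954882048422195749048390299"
as_f64 = float(literal)

# Zig printed 1.7389548820484222 followed by a run of zeros; compare
# that against the shortest round-trip decimal for the rounded double.
print(repr(as_f64))
print(as_f64 == 1.7389548820484222)
```

If the digits agree, the value being formatted carries only binary64 precision (roughly 16-17 significant decimal digits), not the ~33-34 digits a true f128 would preserve.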

Metadata


    Labels

    bug: Observed behavior contradicts documented or intended behavior
    contributor friendly: This issue is limited in scope and/or knowledge of Zig internals.
    standard library: This issue involves writing Zig code for the standard library.
