Description
Bevy version
git/main
Operating system & version
Win10 latest
What you did
Place a mesh far from the origin. Move the camera to the mesh.
What you expected to happen
Instead of relative vertex jitter within the mesh, I would expect the mesh's position to be inaccurate, but all vertices to share the same amount of inaccuracy, due to precision loss in the entity's large `Transform`.
What actually happened
Currently, all of a mesh's vertices jitter relative to one another when the mesh is placed far from the origin, despite the vertex data being stored in a more precise local space.
Additional information
Working with meshes far from the origin, some precision loss is expected. However, the way it manifests in Bevy can be improved. This is what we currently do ("large" and "small" refer to the magnitude of the f32s):

`GlobalTransform` (large) -> `pbr.vert` -> apply view (large) and proj (small) matrix to each vertex (small)
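This per-vertex loss can be reproduced with plain f32 arithmetic (a dependency-free sketch, not Bevy code; the magnitudes are illustrative):

```rust
// Two vertices 1 cm apart in local space, on a mesh ~1,000,000 units
// from the origin, with the camera at the mesh.
fn main() {
    let mesh_x: f32 = 1_000_000.0; // large GlobalTransform translation
    let camera_x: f32 = 1_000_000.0; // camera is at the mesh

    let (v0, v1): (f32, f32) = (0.00, 0.01); // small local vertex positions

    // Current path: each vertex passes through large world-space values
    // before the camera is subtracted. At a magnitude of 1e6 the f32
    // spacing is 0.0625, so the 0.01 offset rounds away per vertex.
    let w0 = (mesh_x + v0) - camera_x;
    let w1 = (mesh_x + v1) - camera_x;
    println!("world-space first: {}", w1 - w0); // offset collapses to 0

    // Doing the one large-magnitude subtraction per MESH instead keeps
    // every per-vertex operation small and precise.
    let rel = mesh_x - camera_x;
    let (r0, r1) = (rel + v0, rel + v1);
    println!("camera-relative:   {}", r1 - r0); // offset preserved: 0.01
}
```

At a coordinate magnitude of 1e6 the representable f32 grid spacing is already 0.0625, so centimeter-scale detail cannot survive a round trip through world space.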
This results in a low-precision multiply (view * proj) per vertex, producing a different amount of error for each vertex. A potential solution is:
`GlobalTransform` (large) -> `ViewTransform` (small) -> `pbr.vert` -> apply proj (small) matrix to each vertex (small)
While this doesn't fix the inherent precision loss going from the large `GlobalTransform` to the (new) small `ViewTransform`, the error is generated once per mesh instead of once per vertex. This should produce the expected behavior: moving the camera to a mesh far from the origin will show the mesh in an imprecise location, but with all of its vertices precisely positioned relative to each other. In practice this increases the usable size of the world, as an imprecisely positioned mesh is far less egregious than the same amount of imprecision applied independently to each vertex.
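A minimal sketch of the proposed split, using hypothetical names (translation-only, with plain arrays instead of Bevy's math types, so this is an illustration rather than the real implementation):

```rust
// Row-major 4x4 matrix as a plain array (stand-in for a real math type).
type Mat4 = [[f32; 4]; 4];

// Translation-only transform (rotation/scale omitted for brevity).
fn translation(x: f32, y: f32, z: f32) -> Mat4 {
    [
        [1.0, 0.0, 0.0, x],
        [0.0, 1.0, 0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

// Matrix product a * b.
fn mul(a: &Mat4, b: &Mat4) -> Mat4 {
    let mut out = [[0.0f32; 4]; 4];
    for i in 0..4 {
        for j in 0..4 {
            for k in 0..4 {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}

// Matrix-vector product m * v.
fn transform(m: &Mat4, v: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0f32; 4];
    for i in 0..4 {
        for k in 0..4 {
            out[i] += m[i][k] * v[k];
        }
    }
    out
}

fn main() {
    let far = 1_000_000.0;
    let world_from_model = translation(far, 0.0, 0.0); // large GlobalTransform
    let view_from_world = translation(-far, 0.0, 0.0); // inverse camera transform

    // Proposed: do the one large-magnitude multiply per MESH on the CPU.
    // The large translations cancel here, once, instead of per vertex.
    let view_from_model = mul(&view_from_world, &world_from_model);

    // The per-vertex stage (the shader) now only ever sees small numbers;
    // it would apply the (small) projection matrix to this result.
    let vertex = [0.0, 0.01, 0.0, 1.0];
    println!("{:?}", transform(&view_from_model, vertex)); // [0.0, 0.01, 0.0, 1.0]
}
```

Here `view_from_model` plays the role of the proposed `ViewTransform`: any error in it displaces the whole mesh uniformly, while each vertex keeps its exact local offset.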
Adding a `ViewTransform` would also give us the ability to directly place things relative to the camera, which could be quite handy.