Describe the bug
Int96ArrayConverter takes the INT96 data and converts it to an i64 number of milliseconds since the epoch, then multiplies this by 1_000_000 to convert it to an i64 number of nanoseconds since the epoch. Because the intermediate millisecond value has already truncated anything below a millisecond, this loses the nanosecond precision contained within the INT96 data.
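A minimal sketch of the problem follows. The constants and function names are illustrative, not the actual parquet crate API; it assumes INT96 encodes a Julian day plus nanoseconds within that day:

```rust
// Julian day number of 1970-01-01 (the Unix epoch).
const JULIAN_DAY_OF_EPOCH: i64 = 2_440_588;
const SECONDS_PER_DAY: i64 = 86_400;

/// Hypothetical helper mirroring the lossy path: INT96 -> millis -> nanos.
fn int96_to_nanos_lossy(julian_day: i64, nanos_of_day: i64) -> i64 {
    let days_since_epoch = julian_day - JULIAN_DAY_OF_EPOCH;
    let millis = days_since_epoch * SECONDS_PER_DAY * 1_000
        + nanos_of_day / 1_000_000; // truncates to millisecond precision
    millis * 1_000_000 // multiplying back cannot restore the lost nanos
}

/// Precision-preserving path: convert directly to nanoseconds.
fn int96_to_nanos_exact(julian_day: i64, nanos_of_day: i64) -> i64 {
    let days_since_epoch = julian_day - JULIAN_DAY_OF_EPOCH;
    days_since_epoch * SECONDS_PER_DAY * 1_000_000_000 + nanos_of_day
}

fn main() {
    // 123_456_789 nanoseconds into the epoch day.
    let (day, nanos) = (JULIAN_DAY_OF_EPOCH, 123_456_789_i64);
    assert_eq!(int96_to_nanos_lossy(day, nanos), 123_000_000); // truncated
    assert_eq!(int96_to_nanos_exact(day, nanos), 123_456_789); // exact
}
```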
To Reproduce
Read a parquet file containing INT96 timestamps with sub-millisecond precision; the decoded nanosecond values are truncated to millisecond precision, as in the sketch above.
Expected behavior
The full nanosecond precision of the INT96 data should be preserved in the converted timestamps.
Additional context
Noticed whilst playing with #1661