@DruSchmitt
Hi, we have encountered the same problem.
If compression is done using the streaming approach and you try to decompress the result with the Unwrap method, you get this exception:
```
ZstdNet.ZstdException: Decompressed content size cannot be determined (e.g. invalid magic number, srcSize too small)
   at ZstdNet.Decompressor.GetDecompressedSize(ReadOnlySpan`1 src)
   at ZstdNet.Decompressor.Unwrap(ReadOnlySpan`1 src, Int32 maxDecompressedSize)
   at ZstdNet.Decompressor.Unwrap(ArraySegment`1 src, Int32 maxDecompressedSize)
   at ZstdNet.Decompressor.Unwrap(Byte[] src, Int32 maxDecompressedSize)
```
In our case this caused some pain, as we were using Snowflake's DECOMPRESS_STRING function: it supports a ZSTD mode, but it was failing with an error.
It took us some time to figure out that the stream and non-stream approaches are not compatible.
Is there a workaround for this? It looks like the output stream is missing some header information; do we need to provide the content size to the stream manually?
We tried prepending content-length bytes, similar to what is done for gzip streams, but it did not help.
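For what it's worth, a sketch of what worked for us as a workaround: since the streaming compressor does not record the decompressed content size in the frame header (which is what `Unwrap` relies on), stream-produced data can instead be decompressed with `DecompressionStream`, which does not need that field. This is only a sketch based on our understanding of the ZstdNet API, not a confirmed fix for the Snowflake side:

```csharp
using System.IO;
using ZstdNet;

static class StreamingZstd
{
    // Decompress zstd data produced by CompressionStream.
    // Decompressor.Unwrap fails here because the frame header written by the
    // streaming API carries no content size; DecompressionStream streams the
    // frame instead and never needs to know the final size up front.
    public static byte[] Decompress(byte[] compressed)
    {
        using var input = new MemoryStream(compressed);
        using var zstd = new DecompressionStream(input);
        using var output = new MemoryStream();
        zstd.CopyTo(output);
        return output.ToArray();
    }
}
```

This does not help with external consumers like Snowflake, of course, which is why an option to pledge the source size to the stream would still be useful.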