Description
When a user opens a stream in Create mode, passes it to a ZipArchive opened in Update mode, and then attempts to add a very large file to the archive, an exception is thrown on Dispose stating "Stream was too long". This is confusing, and frustrating because the failure happens only after the large file has already been processed.
The exception is thrown because we restrict the size of a file to Int32.MaxValue. The restriction is hit because Update mode requires a seekable stream, so we internally buffer the archive data in a MemoryStream in order to update it, and a MemoryStream cannot hold more than Int32.MaxValue bytes.
We do document this limitation here, but that is not an obvious place for users to look for answers.
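To make the failure mode concrete, here is a minimal sketch in Python (the actual types are .NET's ZipArchive and MemoryStream; `CappedBuffer` and its configurable `cap` are hypothetical stand-ins so the behavior can be shown with small sizes instead of 2 GiB). Writes are accepted eagerly into an in-memory buffer, and the size cap only surfaces when the buffer is flushed on close, mirroring how the exception appears on Dispose:

```python
import io

class CappedBuffer:
    """Illustrative stand-in for the internal MemoryStream used to buffer
    archive data in Update mode. The real cap is Int32.MaxValue; here it
    is a parameter so the failure can be demonstrated cheaply."""

    def __init__(self, cap):
        self.cap = cap
        self.buf = io.BytesIO()

    def write(self, data):
        # Writes succeed eagerly; nothing warns the caller yet.
        self.buf.write(data)

    def close(self):
        # The size problem only surfaces when the archive is flushed
        # out on close/Dispose, long after the offending writes.
        if self.buf.tell() > self.cap:
            raise OverflowError("Stream was too long.")


buf = CappedBuffer(cap=8)
buf.write(b"0123456789")   # succeeds: data is simply buffered in memory
try:
    buf.close()            # only now does the limit bite
except OverflowError as e:
    print(e)               # prints: Stream was too long.
```

The point of the sketch is the timing: by the time the error is raised, the user has already paid the full cost of producing and buffering the oversized entry.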
We should explore two improvements:
- Throw a more helpful error message when this combination of conditions is met (stream in Create mode, ZipArchive in Update mode, file too large).
- Try to detect this case much earlier, instead of waiting for it to happen on Dispose.
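The second improvement could look roughly like the following Python sketch (hypothetical; `EagerlyCappedStream` and its `cap` parameter are illustration only, not the actual .NET implementation). Instead of deferring the check to Dispose, the cap is enforced on every write, so the first write that would cross the limit fails immediately with an actionable message:

```python
import io

INT32_MAX = 2**31 - 1  # the documented per-entry cap in Update mode

class EagerlyCappedStream:
    """Hypothetical sketch of early detection: enforce the size cap at
    write time so the failure surfaces at the offending write rather
    than on Dispose. The cap is configurable for demonstration."""

    def __init__(self, cap=INT32_MAX):
        self.cap = cap
        self.buf = io.BytesIO()

    def write(self, data):
        if self.buf.tell() + len(data) > self.cap:
            raise ValueError(
                "Entries in a ZipArchive opened in Update mode are "
                f"buffered in memory and cannot exceed {self.cap} bytes; "
                "open the archive in Create mode or split the entry.")
        self.buf.write(data)


s = EagerlyCappedStream(cap=8)
s.write(b"0123")    # fine: 4 bytes buffered
try:
    s.write(b"01234")  # 4 + 5 > 8: fails right here, not on close
except ValueError as e:
    print(e)
```

Failing at the write also makes it possible to include the helpful, condition-specific message from the first bullet, since the code knows exactly which limit was crossed and why.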
I also opened #35815 to suggest a Roslyn analyzer that would help users avoid falling into this.
This has been reported a few times already:
- System.IO.Packaging.ZipPackage returns "Stream too long" exception with big files #42855
- System.IO.Packaging part stream has a memory leak when writing a large stream #23750
- System.IO.Compression: ZipArchive loads entire file in memory on .Dispose #1543
- System.IO.Compression: ZipArchiveEntry always stores uncompressed data in memory #1544
- Add support for Zip64 for archives > 4GB PowerShell/Microsoft.PowerShell.Archive#19
/cc @ericstj