-
Hello all, I have been using PHP to generate a streamed ZIP-archive backup of a nested folder tree. When I run this code locally (on my development machine), all is well: the complete folder (approx. 2.4 GB) is zipped and streamed to the browser. When I run the same code (with the same data folder and database records) on the shared hosting webspace, the stream gets aborted after approx. 600 MB. The browser shows HTTP status 200, but the streamed ZIP file is damaged/corrupt and incomplete. This is the current situation:
I have tried to narrow down the problem in the following ways, however without success:
Can anyone help me solve this issue? How can I debug the problem any further, when the webserver returns HTTP 200 but still serves an incomplete (aborted) stream? I have asked the hosting company for help, but they say they don't support custom-built scripts; I'd have to run a (virtual) private server to get full control of the webserver. Thanks, kind regards,
Replies: 7 comments 12 replies
-
Can you check if there's an error string at the end of the zip file? Since the file is streamed, the status is set to 200 at the start of the stream and can't be changed later, even if an error occurs.
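One way to act on this suggestion (a sketch; `example.zip` stands for the saved, possibly truncated download and is my name, not one from the thread): a complete ZIP ends with an End-of-Central-Directory record, whose signature is `PK\x05\x06`, while a PHP warning or error appended by the server would show up as readable text at the very end.

```python
def zip_tail_report(path: str, tail_bytes: int = 512) -> dict:
    """Report on the last bytes of a (possibly truncated) zip download."""
    with open(path, "rb") as f:
        f.seek(0, 2)                       # jump to end of file
        size = f.tell()
        f.seek(max(0, size - tail_bytes))
        tail = f.read()
    return {
        # A complete archive ends with the End-of-Central-Directory record.
        "has_eocd": b"PK\x05\x06" in tail,
        # Any appended PHP error/warning text is printable ASCII.
        "printable_tail": bytes(b for b in tail if 32 <= b < 127).decode("ascii"),
    }

# Usage: zip_tail_report("example.zip") -- has_eocd False means the stream
# was cut off; printable_tail may reveal an appended error message.
```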
-
One more frown upon my face... I created 1200 dummy files of 2 MB each and put them in the `dummy` subfolder:

```sh
for i in $( seq 1 1 1200 ); do
  dd if=/dev/zero of="$( printf '%04d' ${i} ).dummy" bs=2M count=1;
done
```

I then created a simple PHP script in the hosting's root folder:

```php
<?php
require 'vendor/autoload.php';

$zip = new ZipStream\ZipStream(
    outputName: 'example.zip',
    sendHttpHeaders: true,
);

$zip->addFile(
    fileName: 'hello.txt',
    data: 'This is the contents of hello.txt',
);

for ($i = 1; $i <= 1200; $i++) {
    $iformatted = sprintf('%04d.dummy', $i);
    $path = __DIR__.'/dummy/'.$iformatted;
    $zip->addFileFromPath(
        fileName: $iformatted,
        path: $path,
    );
}

$zip->finish();
```

This results in a perfectly streamed zipfile (2.6 MB in size, because of all the zeros), including `hello.txt` and all 1200 2-MB dummy files. What we learn: it isn't the number or total size of uncompressed files that causes the issue on this shared hosting. I will investigate further and report back.
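As a side note on the 2.6 MB figure: DEFLATE compresses all-zero input at roughly a 1000:1 ratio, which is why 1200 × 2 MB of `/dev/zero` dummies stream as only a couple of megabytes. A quick sketch with Python's zlib (the same DEFLATE algorithm ZipStream uses):

```python
import zlib

raw = b"\x00" * (2 * 1024 * 1024)    # one 2 MB all-zero dummy file
packed = zlib.compress(raw)          # default compression level
ratio = len(raw) // len(packed)
print(len(packed), "compressed bytes; ratio roughly", ratio, ": 1")
```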
-
I have retried this step, and it fails every time. Now the 'magic' threshold is around 750 MB. I uploaded a large file (Fedora Workstation installation ISO, 2.4 GB) to the hosting's root folder, commented out the dummy-file loop, and added a single call: `$zip->addFileFromPath(fileName: 'f.iso', path: __DIR__.'/f.iso');`. Still, the webserver logs are empty and the zip stream is not complete. I will experiment with different compression settings (ZipStream constructor parameters).
-
Found it! When I either set `$defaultCompressionMethod` to `::STORE`, or keep it `::DEFLATE` and set `$defaultDeflateLevel` to `0`, the zip successfully streams in its entirety. Both the dummy-script and my ArchiveExport script now work as expected. For me, the issue is solved. But I remain curious as to why, on this particular shared hosting, `deflate`-ing with level `-1` or `1`–`9` lets the script terminate without any sign of trouble. Oh well... Thanks for your wonderful work, reading and thinking along in solving this!
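A plausible explanation (my assumption; the thread never pins down the actual kill mechanism) is a per-request CPU-time limit on the shared host: `STORE` and deflate level `0` copy bytes with almost no work, while levels `1`–`9` burn far more CPU per input byte, so a multi-gigabyte archive can cross the limit mid-stream. The cost difference is easy to see with Python's zlib:

```python
import os
import time
import zlib

# Incompressible input (a worst case for deflate), standing in for an ISO.
data = os.urandom(8 * 1024 * 1024)

for level in (0, 6):
    start = time.process_time()
    zlib.compress(data, level)   # level 0 emits stored blocks: no real work
    cpu = time.process_time() - start
    print(f"level {level}: {cpu:.3f} CPU-seconds")
```

Level 0 finishes in milliseconds; level 6 takes orders of magnitude longer on the same input, which scales up badly at 2.4 GB.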
-
Perhaps it's related to a difference between the two environments. On my development machine:
On the shared hosting:
-
Sadly, this is a shared hosting environment; I have no control over, or insight into, the infrastructure 'behind the scenes'. I guess we'll never know exactly where the request gets killed, at least not without the hosting company's support. They write:
I'm sorry. Kind regards,
-
Good point... the project using ZipStream-PHP is not actually mine; I develop and maintain all things digital for this choir, including a digital music archive which is streamed from the PHP web application. Personally, I do have a VPS up and running (with that same hosting company) and am VERY happy running everything in Docker containers. Perhaps I should have a talk with them... :) Thanks for the food for thought, kind regards,
FWieP