
[6.x] Fix memory usage on downloading large files #31163

Merged (2 commits) on Jan 20, 2020
Conversation

@sebsel (Contributor) commented Jan 17, 2020

Fixes issue #31159

@sebsel (Contributor, Author) commented Jan 17, 2020

I have just tested it again to be sure, but no: this really works for me and solves the problem.

To be fair, I'm testing on localhost with the local driver, but as I understand it other drivers shouldn't really matter.

@sebsel (Contributor, Author) commented Jan 17, 2020

Yes, as you understand it. Have you run the code?

The following:

            $stream = $this->readStream($path);

            // Stream the file to the client in 2 KB chunks instead of handing
            // the whole stream to fpassthru() at once.
            while (! feof($stream)) {
                echo fread($stream, 2048);
            }
            fclose($stream);

            // Log peak memory usage after the download has been streamed.
            \Log::info(memory_get_peak_usage());

... with a file download of 1.04 GB gives me a peak of about 21 MB in the log:

[2020-01-17 18:41:54] local.INFO: 21088296

Do you have an example in which it does not work? (I want to learn here too.)

@sebsel (Contributor, Author) commented Jan 17, 2020

Very interesting, thanks!

The way I tested it was through a Nova File field download. I can reproduce that error on my Mac here too, when I run it in a Tinker session: it streams the first part and then blows up midway.

From the context I get here, however, I think it's Tinker itself that blows up. See the reference to 'php-console-highlighter':

   Symfony\Component\Debug\Exception\FatalErrorException  : Allowed memory size of 134217728 bytes exhausted (tried to allocate 109051936 bytes)

  at /Users/sebastiaan/code/orca/vendor/psy/psysh/src/Shell.php:936
PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 266240 bytes) in /Users/sebastiaan/code/orca/vendor/jakub-onderka/php-console-highlighter/src/Highlighter.php on line 113
PHP Fatal error:  Unknown: Cannot use output buffering in output buffering display handlers in Unknown on line 0

The crash in the console is also in the log, but there is no mention of crashes in the logs when I go through the browser, so I am still convinced my fix works.

@sebsel (Contributor, Author) commented Jan 17, 2020

@voyula It's not bad to be wrong. I was wrong too: I thought you had not run the code, but you did, and you had a good point. Don't delete your comments. Thanks for the research!

@driesvints driesvints changed the title from "fix memory usage on downloading large files" to "[6.x] Fix memory usage on downloading large files" on Jan 17, 2020
@ghost commented Jan 17, 2020

@sebsel Yes, it seems to be working now, somewhat magically. But output buffering can still affect it, just as it affects fpassthru; I'm personally not sure about this solution.

I did delete my uncertain suggestions; the Laravel core members can give feedback on this issue. Thanks for your explanations.

A Benchmarking Source:
https://www.garfieldtech.com/blog/readfile-memory

@@ -162,7 +162,9 @@ public function response($path, $name = null, array $headers = [], $disposition

         $response->setCallback(function () use ($path) {
             $stream = $this->readStream($path);
-            fpassthru($stream);
+            while (! feof($stream)) {
+                echo fread($stream, 2048);
+            }
A Contributor commented on the diff:

Do we know why this "dark magic hackery" fixes it 🧙‍♂️ ?

@ghost commented Jan 20, 2020

It seems like output buffering tries to chunk the data within PHP's free memory. If you give it more data than the free memory can hold, those chunking attempts fail. The while loop hands the output buffer small pieces, which lets that chunking succeed within the available memory.

This issue was soo interesting for me. :D
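A rough sketch of that idea with the flushing made explicit (the streamFile() helper and its 2 KB chunk size are illustrative only, not code from this PR): each chunk is pushed out of PHP's buffers before the next one is read, so the buffers never have to hold more than one chunk at a time.

    function streamFile(string $path, int $chunkSize = 2048): void
    {
        $stream = fopen($path, 'rb');

        while (! feof($stream)) {
            // Send one small chunk...
            echo fread($stream, $chunkSize);

            // ...and flush it through any userland output buffer and the SAPI
            // before reading the next chunk.
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }

        fclose($stream);
    }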


@ghost commented Feb 25, 2020

Why not stream_copy_to_stream? E.g. https://github.com/symfony/http-foundation/blob/0f977dd6a6ab22789f49cefccdb623734da2ba3c/BinaryFileResponse.php#L297-L303

Actually, it's functionally the same thing. The only difference is that the while loop does the chunking for output buffering explicitly.
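For reference, a sketch of what that alternative could look like inside the same setCallback() closure (this mirrors the Symfony approach linked above and is not the code this PR adopted; readStream() is the same helper used in the diff):

    $response->setCallback(function () use ($path) {
        // Open the response body as a stream and copy the file straight into it,
        // letting PHP do the chunking internally instead of building up a string.
        $out = fopen('php://output', 'wb');
        $file = $this->readStream($path);

        stream_copy_to_stream($file, $out);

        fclose($file);
        fclose($out);
    });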

@taylorotwell (Member) commented

Reverting for breaking downloads.

@LucaColombi commented

Reverting for breaking downloads.

Hello, I had the same problem, a memory overflow while handling a large set of data, in
......... /php-console-highlighter/src/Highlighter.php on line 113

using "laravel/framework": "5.8"

So, does this fix not work because it breaks something else? Or can I apply it? Do I have to apply it manually in the sources?

@driesvints (Member) commented

@LucaColombi 5.8 isn't supported anymore, please upgrade.

@rtrudel commented Jul 3, 2020

Same problem with Laravel 7. No fix yet?

@driesvints (Member) commented

@rtrudel you're free to do another attempt if you like.

@ricksonoliveira commented

Going through the same problem; because of the revert it hasn't been fixed yet. Any update on whether it's going to be fixed in upcoming versions? I'm thinking of overriding the method for now. Has anyone got a workaround?

@ricksonoliveira commented

I'd like to share a workaround I got, based on the "fix" @sebsel introduced.
If you're using the streamDownload(), stream() or even setCallback() methods for downloading big files, instead of just echoing the whole file in the download method's callback, use something like this inside the callback of the methods mentioned:

return response()->streamDownload(function () use ($filePath) {
    // Read the file in 2 KB chunks so it never sits in memory all at once.
    $fd = fopen($filePath, 'rb');

    while (! feof($fd)) {
        echo fread($fd, 2048);
    }

    fclose($fd);
}, $fileName);

The above chunks the file stream instead of loading the whole file into PHP memory, as mentioned before by @sebsel.
I'm just sharing this so people won't think it's a Laravel bug we can't do anything about.
Using the above means you're isolating the chunked download to just the method where you need it.
Just trying to help.

Got this fix here: https://www.php.net/manual/en/function.fpassthru.php#29162
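If the file lives on a disk other than the local one, a variant of the same workaround can go through the Storage facade's readStream() (the 's3' disk name and the $filePath/$fileName variables below are placeholders), so the chunked loop works against remote filesystems as well:

return response()->streamDownload(function () use ($filePath) {
    // readStream() returns a plain PHP stream for any configured disk.
    $stream = \Storage::disk('s3')->readStream($filePath);

    while (! feof($stream)) {
        echo fread($stream, 2048);
    }

    fclose($stream);
}, $fileName);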

@BlackbitDevs commented Jan 5, 2023

Reverting for breaking downloads.

What exactly did this change break, @taylorotwell?
