Retrying failed jobs causes PHP memory exhaustion errors when dealing with thousands of failed jobs  #49185

Closed
@arharp

Laravel Version

10.34.2

PHP Version

8.1.26

Database Driver & Version

No response

Description

A project I'm working on has 300k failed jobs. When I try to run artisan queue:retry, it fails with:

PHP Fatal error: Allowed memory size of 536870912 bytes exhausted

After looking into how the RetryCommand works, I realized that rather than querying just the IDs from the failed_jobs table, it loads the entire table into memory and then plucks the id column. No matter how high I set the memory limit, the server can't handle loading all 300k failed jobs at once.

This is a very inefficient way to load the failed job IDs. Any thoughts on how this could be improved without breaking anything?
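For context, here is a sketch of a memory-friendlier approach. It is illustrative only, not the framework's actual code, and the table and column names assume the default database failed-job driver (failed_jobs table with an id column):

```php
<?php

use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\DB;

// Query only the id column instead of hydrating every failed-job record.
$ids = DB::table('failed_jobs')->pluck('id')->all();

// To bound memory further, walk the table in fixed-size chunks and retry
// each batch of IDs as it is read, so at most 1000 rows are held at once.
DB::table('failed_jobs')
    ->select('id')
    ->orderBy('id')
    ->chunkById(1000, function ($jobs) {
        Artisan::call('queue:retry', [
            'id' => $jobs->pluck('id')->all(),
        ]);
    });
```

The chunkById call keeps memory usage flat regardless of table size, since each batch of rows is discarded before the next is fetched.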

Steps To Reproduce

  • Populate the failed_jobs table with 300k failed jobs
  • Run php artisan queue:retry
