Describe the Bug
Memory overflow occurs when ingesting large amounts of data.
When uploading large files through the Rabbit Hole, the container's memory is not released properly. Eventually the container crashes, or the host does if no memory limit is set. My Cashier Cat instance uses an external Qdrant DB.
To Reproduce
Steps to reproduce the behavior:
Set the memory limit of the container to, for example, 2 GB (a minimal docker command is shown after these steps).
Navigate to Cashier Cat's home page.
Upload one document (e.g., a 300MB PDF).
Check the logs to ensure everything is functioning correctly.
Upload one or more additional documents (four 300 MB PDFs were enough to trigger the crash for me).
Monitor the RAM consumption of the container and review the logs.
Wait until the process crashes.
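For reference, this is roughly how I constrain and watch the container. The image name and port mapping are placeholders from my local setup, so adjust them to yours:

```bash
# Start the core container with a hard 2 GB memory limit
# (image name and port mapping are placeholders for my setup).
docker run --rm --memory=2g -p 1865:80 cashier-cat-core:latest

# In a second terminal, watch the container's RAM usage while uploading.
docker stats
```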
Logs When Uploading Multiple Documents and Crashing
Below are the logs showing the latest successful upload before the crash:
Expected Behavior
Cashier Cat should manage memory efficiently, releasing it appropriately to prevent memory leaks and crashes.
Additional Context
The issue persists regardless of whether the container's memory limit is set to 2 GB or 12 GB; with a higher limit the crash simply happens later. I attempted to optimize rabbit_hole.py, but I'm not sure the problem originates there. Attached is a graph showing memory consumption over time.
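For context, this is the general shape of the optimization I experimented with: processing the extracted text in small batches and releasing references between them instead of keeping every chunk in memory at once. The names iter_text_chunks, ingest_file, and store_points are placeholders for illustration, not the actual Rabbit Hole API:

```python
import gc

def iter_text_chunks(text, chunk_size=1000):
    # Yield the document piece by piece so only one chunk is alive at a time,
    # instead of building a full list of chunks up front.
    for start in range(0, len(text), chunk_size):
        yield text[start:start + chunk_size]

def ingest_file(text, store_points):
    # store_points stands in for whatever embeds the chunk and writes it to Qdrant.
    for i, chunk in enumerate(iter_text_chunks(text)):
        store_points(chunk)
        if i % 100 == 0:
            gc.collect()  # periodically ask Python to reclaim freed objects
```

The issue still occurred after this change, which is why I suspect the leak may not be in rabbit_hole.py at all.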
Workaround
Split large PDFs into smaller MD files (a rough splitting script is sketched after these steps).
Upload a batch of files, such as 15 files.
Restart Cashier Cat.
Repeat the process as needed.
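Roughly how I split the PDFs for this workaround. It assumes pypdf is installed and simply writes each page's extracted text to its own MD file; file names and paths are placeholders:

```python
from pathlib import Path
from pypdf import PdfReader

def split_pdf_to_md(pdf_path, out_dir):
    # Write one small Markdown file per PDF page so each upload stays tiny.
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    reader = PdfReader(pdf_path)
    for i, page in enumerate(reader.pages, start=1):
        text = page.extract_text() or ""
        (out / f"{Path(pdf_path).stem}_page_{i:04d}.md").write_text(text, encoding="utf-8")

split_pdf_to_md("big_manual.pdf", "split_md")
```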
Memory usage over time.