This repository has been archived by the owner on May 15, 2024. It is now read-only.
Closed
Description
In the current implementation, a pack job is only prepared by the dataset worker once its size exceeds a threshold (default 16GB).
This means that a pack job that never reaches 16GB is never prepared, which can happen if the client has less than 16GB of data in total, or if their last chunk of data is under 16GB and they are no longer pushing new blobs.
The fix is to pack the remaining files after a certain duration (e.g. 24h, either hardcoded or configurable), even if the size threshold has not been reached.
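A minimal sketch of what such a readiness check could look like, in Go. All names here (`PackJob`, `ReadyToPack`, the constants) are hypothetical illustrations, not the project's actual types or API:

```go
package packjob

import "time"

const (
	// Current default size threshold (16GB) described in the issue.
	defaultSizeThreshold = int64(16) << 30
	// Proposed timeout after which leftover files are packed anyway.
	defaultMaxPackDelay = 24 * time.Hour
)

// PackJob is a hypothetical stand-in for the real pack job record.
type PackJob struct {
	TotalSize int64     // combined size of all queued files
	CreatedAt time.Time // when the first file was queued into this job
}

// ReadyToPack reports whether the dataset worker should prepare the job.
// Previously only the size check existed, so a job under the threshold
// could wait forever; the added age check packs the remaining files
// once maxDelay has elapsed, even if the threshold was never reached.
func ReadyToPack(job PackJob, now time.Time, threshold int64, maxDelay time.Duration) bool {
	if job.TotalSize >= threshold {
		return true
	}
	return job.TotalSize > 0 && now.Sub(job.CreatedAt) >= maxDelay
}
```

With a check like this in the worker's scan loop, a sub-threshold job would still be picked up once it is 24 hours old, so a client's final partial chunk eventually gets packed.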