Too much CPU used when using multipart - should have a way to throttle upload speed? #32

@farialima

Description

This may not be easy to fix, but feedback is never bad to give...

I'm using glacier-cmd-interface to upload from DreamHost shared hosting to Amazon. However, for files that are bigger than 100 MB I get:

(ve)[tigers]$ amazon-glacier-cmd-interface/glacier/glacier.py upload my_backups my_backup_file


Yikes! One of your processes (python, pid 14861) was just killed for excessive resource usage.                                                                                  
Please contact DreamHost Support for details.


Killed
(ve)[tigers]$ 

If the file is less than 100 MB, things are OK.

The process is killed while in:


    def make_request(self, method, path, headers=None, data='', host=None,
                     auth_path=None, sender=None, override_num_retries=None):
        headers = headers or {}
        headers.setdefault("x-amz-glacier-version", "2012-06-01")
        return super(GlacierConnection, self).make_request(method, path, headers,
                                                           data, host, auth_path,
                                                           sender, override_num_retries)

So it may be that we are sending too much data, too fast. I've tried to throttle CPU usage, but to no avail.

I would suggest adding a way to throttle the upload speed (as an option): I suppose it would fix this, and it would be useful for many people (you don't want a backup upload to take all the bandwidth...)

Probably not easy to implement, but who knows...
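
For illustration only, here is a minimal sketch of the kind of throttling I have in mind: a file-like wrapper that sleeps between reads so the data handed to the uploader never exceeds a configured rate. The class name (ThrottledReader) and the place it would be hooked in are my assumptions, not anything that exists in this project today.

    import time

    class ThrottledReader(object):
        """Wrap a file object and cap the rate at which it is read.

        Hypothetical sketch: whatever code reads the part data before
        sending it (e.g. the sender passed to make_request) would read
        from this wrapper instead of the raw file.
        """

        def __init__(self, fileobj, max_bytes_per_sec):
            self._fileobj = fileobj
            self._max_bps = float(max_bytes_per_sec)
            self._start = time.time()
            self._sent = 0

        def read(self, size=-1):
            data = self._fileobj.read(size)
            self._sent += len(data)
            # Sleep just long enough so that total bytes / elapsed time
            # stays at or below the configured rate.
            expected = self._sent / self._max_bps
            elapsed = time.time() - self._start
            if expected > elapsed:
                time.sleep(expected - elapsed)
            return data

    # Hypothetical usage: cap uploads at 512 KB/s.
    # part = ThrottledReader(open("my_backup_file", "rb"), 512 * 1024)

Something along these lines, exposed as a command-line option, would also indirectly reduce CPU pressure, since the process spends most of its time sleeping instead of hashing and sending at full speed.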

Since this library seems very useful, I thought it was worth reporting any issue I have! Thank you for this lib.
