Copying big files to S3

I was backing up a server before an upgrade and wanted to upload the bzip2 archive to S3. I got some strange throttle errors from s3cmd, my go-to command-line client.

Found a helpful link on Stack Overflow about a tool bundle provided by AWS.

Synopsis is here:

$ wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
$ unzip awscli-bundle.zip
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
$ aws configure
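Once the bundle is installed and configured, the actual copy is just an `aws s3 cp`. A minimal sketch (the bucket name and archive filename below are placeholders, not from my setup):

$ aws s3 cp backup.tar.bz2 s3://my-backup-bucket/backup.tar.bz2

For large files the CLI splits the transfer into multipart chunks automatically, which is presumably why it sidesteps the throttling s3cmd was hitting.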

source: http://stackoverflow.com/questions/5774808/s3cmd-failed-too-many-times
