
s3-parallel-put's Issues

S3ResponseError

Hello, I'm using s3-parallel-put and running into a problem:

python s3-parallel-put --bucket=reocar-test --host=192.168.0.191:7480 --log-filename=/tmp/s3pp.log --dry-run --limit=1 .
Traceback (most recent call last):
  File "s3-parallel-put", line 459, in <module>
    sys.exit(main(sys.argv))
  File "s3-parallel-put", line 430, in main
    bucket = connection.get_bucket(options.bucket)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 509, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 542, in head_bucket
    raise err
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

But the bucket exists, and I can upload and download files to it. I have set the access key and secret key in the environment.

root@ceph1:~/s3-parallel-put-master# s3cmd ls
2018-08-02 05:46 s3://reocar-test

When I use:

python s3-parallel-put --bucket=s3://reocar-test --host=192.168.0.191:7480 --log-filename=/tmp/s3pp.log --dry-run --limit=1 .

Traceback (most recent call last):
  File "s3-parallel-put", line 459, in <module>
    sys.exit(main(sys.argv))
  File "s3-parallel-put", line 430, in main
    bucket = connection.get_bucket(options.bucket)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 509, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 556, in head_bucket
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 400 Bad Request

My host is a Ceph RGW S3 endpoint, not Amazon AWS S3.

Did I miss something? Thank you!
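
A 403 or 400 from a Ceph RGW endpoint is often an addressing problem: boto defaults to virtual-hosted-style URLs (`bucket.host`), which a bare RGW host typically cannot serve, and `--bucket=s3://reocar-test` passes the `s3://` prefix as part of the bucket name, which likely explains the 400. A minimal sketch of the two addressing styles (illustrative helper, not part of s3-parallel-put):

```python
def object_url(host, bucket, key, path_style=True):
    """Build an S3 object URL in path-style or virtual-hosted style."""
    if path_style:
        # Path-style: works against any endpoint, e.g. a Ceph RGW host.
        return "http://%s/%s/%s" % (host, bucket, key)
    # Virtual-hosted style: requires DNS for bucket.host to resolve.
    return "http://%s.%s/%s" % (bucket, host, key)

print(object_url("192.168.0.191:7480", "reocar-test", "file.txt"))
```

With boto this corresponds to passing `calling_format=OrdinaryCallingFormat()` along with `host=` when creating the `S3Connection` (assuming the RGW credentials are already exported in the environment).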

301 Moved Permanently

Executing the command against an S3 bucket located in Sydney, Australia (ap-southeast-2) throws an error.

root@vmd001 [/path/to/folder/test]# /path/to/folder/s3-parallel-put --bucket=vmd001 --put=add --insecure --dry-run --limit=1 .
Traceback (most recent call last):
  File "/path/to/folder/s3-parallel-put", line 420, in <module>
    sys.exit(main(sys.argv))
  File "/path/to/folder/s3-parallel-put", line 391, in main
    bucket = connection.get_bucket(options.bucket)
  File "/usr/lib/python2.6/site-packages/boto/s3/connection.py", line 502, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/lib/python2.6/site-packages/boto/s3/connection.py", line 549, in head_bucket
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 301 Moved Permanently
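
A 301 from S3 usually means the request hit the global endpoint while the bucket lives in another region; the fix is to connect to the bucket's regional endpoint (with boto, `boto.s3.connect_to_region(...)` instead of a plain `S3Connection`). A sketch of the legacy regional endpoint naming (illustrative helper; the dashed form is the legacy scheme):

```python
def s3_endpoint(region):
    """Return the legacy regional S3 endpoint hostname."""
    if region == "us-east-1":
        return "s3.amazonaws.com"  # the classic/global endpoint
    return "s3-%s.amazonaws.com" % region

print(s3_endpoint("ap-southeast-2"))  # Sydney
```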

Syntax error

Hey, I'm trying to run it on our CI and I get:

...
tools/s3-parallel-put: 1: tools/s3-parallel-put: --2019-07-24: not found
tools/s3-parallel-put: 2: tools/s3-parallel-put: Syntax error: "(" unexpected
Command exited with non-zero status 2

Any ideas?
The command:

sudo time tools/s3-parallel-put --quiet --processes=64 --put=stupid \
                    --bucket_region=s3-eu-central-1 --bucket=circleci-mim-results --prefix=test test.txt --verbose

Python 2.7.12

Add python-magic support

It would be useful to support python-magic. The builtin mimetypes library determines the type solely from the filename extension, and may fail if the extension is somehow abnormal.

In our case, we have a fairly large static website generated from a dynamic site, and a number of files have trailing GET query strings in their names.
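
A sketch of what such support could look like: sniff file contents with python-magic when it is importable, and fall back to extension-based `mimetypes` otherwise (hypothetical helper, not the tool's actual code):

```python
import mimetypes

try:
    import magic  # python-magic, optional dependency
except ImportError:
    magic = None

def guess_content_type(filename, data=None):
    """Guess a MIME type: sniff contents when python-magic is available,
    otherwise fall back to the extension-based mimetypes lookup."""
    if data is not None and magic is not None and hasattr(magic, "from_buffer"):
        return magic.from_buffer(data, mime=True)
    return mimetypes.guess_type(filename)[0] or "application/octet-stream"
```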

--prefix does not work with absolute path

If I run:
s3-parallel-put --bucket=mybucket --prefix=myfiles data.txt
then s3://mybucket/myfiles/data.txt is present (as expected)

but if I run this (with an absolute path)
s3-parallel-put --bucket=mybucket --prefix=myfiles /data.txt
then I expect
s3://mybucket/myfiles/data.txt
but I get
s3://mybucket/data.txt

As a workaround, I have to use a relative path whenever I want to use a prefix.
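
This behaviour is consistent with Python's path joining: when the second argument is absolute, `os.path.join` discards everything before it, so the prefix is dropped. A demonstration, plus the obvious workaround of stripping the leading slash (illustrative; the tool's actual key construction may differ):

```python
import posixpath

# An absolute second argument discards the prefix entirely.
print(posixpath.join("myfiles", "/data.txt"))               # /data.txt

# Stripping the leading slash keeps the prefix.
print(posixpath.join("myfiles", "/data.txt".lstrip("/")))   # myfiles/data.txt
```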

Connection reset by peer error

I'm getting this error
error: [Errno 104] Connection reset by peer
when trying to upload a file greater than 5GB.

For files smaller than 5GB it works fine.

Does the --processes option upload different files in parallel, or is the same file broken into chunks and uploaded in parallel?
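
The 5GB boundary is not incidental: S3 rejects a single PUT request larger than 5 GB, so anything bigger must go through multipart upload. Whether a given version of the tool splits a single file into parts (as opposed to only parallelizing across files) is worth checking in the source. A sketch of the part arithmetic a multipart upload needs (illustrative helper; the 100 MB part size is an arbitrary assumption):

```python
def plan_parts(size, part_size=100 * 1024 * 1024):
    """Split an object of `size` bytes into (offset, length) parts."""
    parts = []
    offset = 0
    while offset < size:
        length = min(part_size, size - offset)
        parts.append((offset, length))
        offset += length
    return parts

# Just over the 5 GB single-PUT limit: needs multiple parts.
parts = plan_parts(5 * 1024**3 + 1)
```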

"Broken pipe" when uploading

There seems to be a common issue of getting broken pipes when uploading to S3. Although people usually mention it for large files, I've experienced it consistently with a 91k file (perhaps it's the filename, who knows).

The fix seems to be to pass the 'host' parameter in the S3 connection.
Changing

            if connection is None:
                connection = S3Connection(is_secure=options.secure)

to

HOST='s3-us-west-2.amazonaws.com'
<snip>
            if connection is None:
                connection = S3Connection(is_secure=options.secure, host=HOST)

has resolved the problem for me, albeit obviously not the proper fix.

This may not be an issue for your project, I'm just posting here so you can determine what action, if any, to take.

References:
fog/fog#824
boto/boto#621
http://reterwebber.wordpress.com/2013/08/22/broken-pipe-error-when-using-boto-s3/
boto/boto@75d5c7b#L0R340

Exception running s3-parallel-put

I'm getting this exception when I run:

./s3-parallel-put --bucket-region=us-west-2 --bucket=my.bucket.name localfolder

Exception

  File "/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 1212, in connect
    server_hostname=server_hostname)
  File "/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 350, in wrap_socket
    _context=self)
  File "/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 566, in __init__
    self.do_handshake()
  File "/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 796, in do_handshake
    match_hostname(self.getpeercert(), self.server_hostname)
  File "/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 269, in match_hostname
    % (hostname, ', '.join(map(repr, dnsnames))))
CertificateError: hostname 'my.bucket.name.s3.amazonaws.com' doesn't match either of '*.s3.amazonaws.com', 's3.amazonaws.com'
INFO:s3-parallel-put[statter-8316]:put 0 bytes in 0 files in 0.8 seconds (0 bytes/s, 0.0 files/s)
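
The failure comes from the dots in the bucket name: a certificate wildcard such as `*.s3.amazonaws.com` matches exactly one DNS label, so `my.bucket.name.s3.amazonaws.com` does not match. A toy matcher showing that rule (illustrative only; real hostname checking lives in the `ssl` module):

```python
def wildcard_matches(pattern, hostname):
    """Match a '*' wildcard against exactly one DNS label (simplified)."""
    pat_labels = pattern.split(".")
    host_labels = hostname.split(".")
    if len(pat_labels) != len(host_labels):
        return False  # '*' never spans multiple labels
    return all(p == "*" or p == h for p, h in zip(pat_labels, host_labels))

wildcard_matches("*.s3.amazonaws.com", "mybucket.s3.amazonaws.com")        # matches
wildcard_matches("*.s3.amazonaws.com", "my.bucket.name.s3.amazonaws.com")  # does not
```

Practical workarounds are bucket names without dots, or forcing path-style addressing (boto's `OrdinaryCallingFormat`) so the bucket never appears in the hostname.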

ERROR:s3-parallel-put:missing source operand

I'm getting the following error when I execute this command:

/home/user/s3-parallel-put-master/s3-parallel-put --bucket=photos --put=stupid --insecure --dry-run --limit=1

ERROR:s3-parallel-put:missing source operand

What am I missing?

DEBUG:boto:encountered error exception, reconnecting

Hey @mishudark ,

I'm getting this error when I try to upload files from a mounted directory:

`Thu, 03 Sep 2020 18:12:34 GMT
/s3-bucket-name/s3-bucket-sub/1-12750/Basel%20Images/JPC/1968Simplicity_005_BoxVI%2000054.jpg
DEBUG:boto:Signature:
AWS ********
DEBUG:boto:Final headers: {'Content-Length': '12009858', 'Content-MD5': 'nhtriFT3wzWRS9lpDxtRAQ==', 'Expect': '100-Continue', 'Date': 'Thu, 03 Sep 2020 18:12:34 GMT', 'User-Agent': 'Boto/2.49.0 Python/2.7.5 Linux/3.10.0-1127.19.1.el7.x86_64', 'Content-Type': 'image/jpeg', 'Authorization': u'AWS *******}
DEBUG:boto:encountered error exception, reconnecting
DEBUG:boto:establishing HTTPS connection: host=*****.s3.amazonaws.com, kwargs={'port': 443, 'timeout': 70}
DEBUG:boto:Token: None
DEBUG:boto:StringToSign:
PUT


image/jpeg`

However, I can upload a test folder from the mounted directory, so I know the tool itself can push files up.

`content-type` should default to guess

By now, I've accidentally uploaded multiple days' worth of data to S3, only to find it unusable because the content-type was application/octet-stream. This renders images, HTML, CSS, etc. unusable by default.

The default logic should be safe to use; therefore the default content-type option should be guess.

This will need to take into account the gzip options logic to ensure it doesn't break gzip when no content-type is specified.

Unbounded memory usage

Tried to push one of my large-ish repositories to test (2 million-ish files, about 8.5TB). It made it about 2TB in and the server (a Sun Fire X4140 w/ 12 CPU cores and 16GB of RAM) ran out of memory and the push was killed by the kernel. Nothing else was running on the system, and system logs show that all physical memory and swap was eaten up by python.

This was using the "add" mode.
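
One plausible cause is an unbounded work queue between the directory walker and the putter processes: if the walker enumerates files faster than the uploads drain, queued items accumulate without limit. Giving the queue a maxsize makes the producer block instead of growing memory. A sketch with the stdlib `queue` module (the tool itself uses multiprocessing queues, but the backpressure idea is the same):

```python
import queue

# A bounded queue applies backpressure: put() blocks once it is full,
# instead of letting the producer run arbitrarily far ahead.
q = queue.Queue(maxsize=2)
q.put("file-1")
q.put("file-2")
print(q.full())   # the walker would now block on the next put()
q.get()           # a putter drains one item...
print(q.full())   # ...and the walker can proceed
```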

TypeError on iterating over headers

I'm getting a:

TypeError: 'NoneType' object is not iterable

on line 202 when I called:

./s3-parallel-put --bucket=bucketname --prefix=foo somefolder/*

Apparently, the options.headers variable is None when I call it this way. I can still make it work if I pass a phony header:

./s3-parallel-put --bucket=bucketname --prefix=foo --header=a:a somefolder/*
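
The workaround suggests that `options.headers` defaults to None rather than an empty collection, so iterating it raises. The usual defensive fix is to normalize before iterating (illustrative helper; the name and the 'Name:Value' format are assumptions based on the report):

```python
def build_headers(option_headers):
    """Turn a possibly-None list of 'Name:Value' strings into a dict."""
    headers = {}
    for header in option_headers or []:  # `or []` absorbs the None case
        name, _, value = header.partition(":")
        headers[name.strip()] = value.strip()
    return headers

build_headers(None)               # no --header flags given
build_headers(["a:a", "X-Y: z"])  # phony header no longer needed
```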

Python 3 Availability?

Python 2 is becoming more and more obsolete; services such as AWS CodeBuild are starting to drop Python 2 support. Will this get adapted to Python 3?

socket.gaierror: [Errno -2] Name or service not known

Installed: Python 2.7.6, boto.

I am getting this error:

Traceback (most recent call last):
  File "/s3-parallel-put", line 410, in <module>
    sys.exit(main(sys.argv))
  File "/s3-parallel-put", line 381, in main
    bucket = connection.get_bucket(options.bucket)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 503, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 522, in head_bucket
    response = self.make_request('HEAD', bucket_name, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 665, in make_request
    retry_handler=retry_handler
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 1071, in make_request
    retry_handler=retry_handler)
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 1030, in _mexe
    raise ex
socket.gaierror: [Errno -2] Name or service not known

How can I resolve this? Thanks.

s3-parallel-put needs a new maintainer

I have not personally used s3-parallel-put for several years. Although I'm happy to merge pull requests, I do not have the time or resources to test them or do further development.

If you would like to take over maintenance of the project, please comment here and I'll give you write access to the repository.

AWS_SECURITY_TOKEN support

Hi @mishudark ,

I've added an if/else clause to support connecting with tokens. This allows s3-parallel-put to work with AWS setups using the Okta IdP, which supports authentication only via ephemeral token auth. If no session-token variable is present, it creates the connection with the id/secret credentials as usual.

PR here: #55

How to configure?

How do I set up the S3 access key, secret key, and bucket? Also, is there a command to copy from one folder to another?

Can anyone please help me?

How does the '--put=add' parameter work?

Hi. I am using s3-parallel-put and it's working very well, thank you. I have a question about the --put=add parameter, which avoids re-uploading a file that is already in S3. How sensitive is this check? If I try to upload a file and a file with the same name already exists, will it upload anyway, or does it also check attributes such as size?

Copy filename without directory structure

I'm attempting to copy files from various subdirectories to the base of my S3 bucket. Here's an example:

~/s3-parallel-put --bucket=mybackups --prefix /backups/12413412/mysql-backup/backups/myappdb-02/xtrabackups /mydb-server-01-backup-20160926-091734.tar.gz

In this example I want the file mydb-server-01-backup-20160926-091734.tar.gz copied to the base of my bucket.

When I run s3-parallel-put I'm getting this:

INFO:s3-parallel-put[statter-46529]:put 0 bytes in 0 files in 0.0 seconds (0 bytes/s, 0.0 files/s)

Am I doing something wrong here?

SyntaxError: invalid syntax

I get the following error when I try to run s3-parallel-put on CentOS release 5.6 (Final). (It works well for me on CentOS 6.3.)

File "/bin/s3-parallel-put", line 81
with self.file_object_cache.open(self.filename) as file_object:
