azure / azure-storage-azcopy
The new Azure Storage data transfer utility - AzCopy v10
License: MIT License
10.0.1-Preview
Windows & Linux
azcopy /SetContentType
ContentType detection uses Go's built-in http.DetectContentType; however, this does not correctly identify the content type of JSON files (nor of JavaScript files and SVG images). This is particularly problematic when uploading material to be used in a static website. See the following for details of the problem caused and some workarounds:
https://liftcodeplay.com/2017/11/28/how-to-fix-azure-storage-blob-content-types/
Use azcopy on a test.json file and inspect the content type: it is set to application/octet-stream, which prevents the JSON from loading correctly in a client browser. Copy the same file using Azure Storage Explorer and the correct content type is detected.
Two approaches to mitigation:
"AzCopy\AzCopy.exe" /Source:"" /Dest:"https://my.blob.core.windows.net/$web" /Pattern:"*.json" /S /Y /SetContentType:"application/json" /Z /V
A better solution for customers would be to add extension-based ContentType heuristics to AzCopy for commonly misdetected types. The JSON MIME-type issue is particularly problematic and hard for end users to debug as a content-type problem.
My plan is to use azcopy as part of CI jobs to upload release assets, etc.
I'd like to be able to use this in scenarios where the Azure storage container or account might be abstracted or parameterized. In this case, it would be ideal to be able to say something along the lines of: azcopy ensure-container --account-name=strgacct1 --container-name=release-assets.
Possible downside: I'm guessing that this tool only deals with Storage auth right now, rather than generic ARM auth. Maybe this isn't a full downside given the changes to Azure's Storage auth with RBAC, etc.
AzCopy 10.0.2 Preview - Win 7 - PowerShell
The following string works for a "copy", but when using sync it throws the error below:
.\azcopy sync "C:\GCDS_dev" "https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?<retracted_key>" --recursive
error parsing the input given by the user. Failed with error source 'C:\GCDS_dev' / destination 'https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?<retracted_key>' combination 'LocalFile' not supported for sync command
Any ideas?
Hello,
Is it possible for the sync command to support synchronizing contents between two storage accounts?
Thank you.
Considering using v10
Linux
This is a question.
I want to use a proxy, as on Windows
(i.e. https://blogs.msdn.microsoft.com/azure4fun/2016/10/17/azcopy-unable-to-connect-to-the-remote-server/ )
Looking at the code, it seems to use the system proxy. Can't users change it as they want?
https://github.com/Azure/azure-storage-azcopy/blob/master/common/oauthTokenManager.go#L69
If there is any way to set my own proxy, I really want to know how. Thanks.
/CheckMD5 is part of AzCopy v8.1 and helps us avoid data corruption when downloading very large files from Azure blob storage.
It would be great to have this feature added to the next AzCopy command line.
Hey,
it would be nice to transfer files with azcopy that are read from STDIN, like:
cat MYFILE | azcopy cp "https://myaccount.blob.core.windows.net/mycontainer/1file.txt?sastokenhere"
This is already implemented in azcopy v7.
Best regards,
Jonas
10.0.2 Preview - Win 7
Copy fails when I try a local file copy to Azure File storage with the 10.0.2 preview. It works with the current 8.1.0.
This is the 8.1.0 line:
azcopy /source:C:\GCDS_dev /dest:https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest /DestKey:<retracted> /s /BlockSizeInMB:32
Converted to the 10.0.2 preview:
azcopy copy C:\GCDS_dev https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?<retracted_sas> --recursive=true --block-size=32
The result is that all files fail with 404 errors in the log... any ideas?
RESPONSE Status: 404 The specified resource does not exist.
Content-Length: [223]
Content-Type: [application/xml]
Date: [Mon, 08 Oct 2018 14:18:06 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [ResourceNotFound]
X-Ms-Request-Id: [7ece6fe9-501e-0107-1c11-5f316f000000]
X-Ms-Version: [2018-03-28]
2018/10/08 14:18:07 JobID=5d651ae0-11f9-284d-4ea1-9e641861b502, Part#=0, TransfersDone=24 of 25
2018/10/08 14:18:09 ==> REQUEST/RESPONSE (Try=1/8.618s[SLOW >3s], OpTime=8.624s) -- RESPONSE SUCCESSFULLY RECEIVED
DELETE https://azgcdsdevst1.blob.core.windows.net/gcdstest2/GQtest/GCDS_dev/Plotters/PLT_CAN_Bdy.dwg?"retracted_sas"%3D&timeout=901
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [a661a951-830b-41ce-4a8a-7758ddfc5812]
X-Ms-Version: [2018-03-28]
Command: azcopy cp "source" "destination?SAS" --recursive=true
Where the SAS was incorrect.
The command enumerated the local files and then began racking up failure after failure. If the SAS key isn't correct on the first PUT, why continue to process the list?
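The behavior being requested, stopping the whole job on the first authentication failure, maps naturally onto a shared context in Go: the first 403 cancels the context, and the remaining queued transfers bail out without hitting the network. A sketch with a hypothetical put stand-in (not AzCopy's real transfer code):

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"sync"
)

var errAuth = errors.New("403: server failed to authenticate the request")

// put stands in for one transfer; with a bad SAS, every attempt would 403.
func put(ctx context.Context, id int) error {
	select {
	case <-ctx.Done():
		return ctx.Err() // job already cancelled: skip the network call
	default:
		return errAuth
	}
}

// runJob cancels the whole job on the first auth failure instead of
// letting every queued transfer fail on its own.
func runJob(transfers int) (authFailures int64) {
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()
	var wg sync.WaitGroup
	var mu sync.Mutex
	for i := 0; i < transfers; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			if errors.Is(put(ctx, id), errAuth) {
				mu.Lock()
				authFailures++
				mu.Unlock()
				cancel() // fail fast: stop scheduling the rest
			}
		}(i)
	}
	wg.Wait()
	return authFailures
}

func main() {
	fmt.Println("auth failures before cancellation:", runJob(50))
}
```

A few in-flight transfers may still fail before the cancellation propagates, but the bulk of the list is skipped rather than retried.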
Ran a copy command that ended in thousands of errors; an example from the log follows. I assume it was due to a slow upload network (from a home machine), so I changed to AZCOPY_CONCURRENCY_VALUE=10 and so far haven't reproed. Given that the concurrency is an optional env var, I'd expect AzCopy to dynamically reduce the number of threads (or determine the upload throughput first), or to provide a clear error message so the user can set it correctly by hand.
Summary:
.log file created in C:\Users\klaas/.azcopy
519 Done, 904 Failed, 4322 Pending, 0 Skipped, 5745 Total, 2-sec Throughput (Mb/s): 5.7147
Job 42742d06-ee4e-a143-7458-16f358102298 summary
Elapsed Time (Minutes): 1009.0905
Total Number Of Transfers: 5745
Number of Transfers Completed: 519
Number of Transfers Failed: 904
Number of Transfers Skipped: 0
TotalBytesTransferred: 798344735
Final Job Status: Cancelled
Put https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=MTM1ODQyMmItYjYyYS00MzQ5LTZiOTUtZWJkYWQ0MjFjNmQ1&comp=block&se=2018-10-15T09%3A50%3A29Z&sig=3W7BpmoZYi8i6OXRlFg3mXhxQMIiEjLtUlR1ZR6jCN8%3D&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15T01%3A50%3A29Z&sv=2017-11-09&timeout=601: net/http: HTTP/1.x transport connection broken: write tcp 192.168.1.48:55519->52.183.104.36:443: wsasend: An existing connection was forcibly closed by the remote host.
PUT https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=mtm1odqymmityjyyys00mzq5ltziotutzwjkywq0mjfjnmq1&comp=block&se=2018-10-15t09%3A50%3A29z&sig=REDACTED&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15t01%3A50%3A29z&sv=2017-11-09&timeout=601
PUT https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=mtm1odqymmityjyyys00mzq5ltziotutzwjkywq0mjfjnmq1&comp=block&se=2018-10-15t09%3A50%3A29z&sig=REDACTED&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15t01%3A50%3A29z&sv=2017-11-09&timeout=601
Put https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=MTM1ODQyMmItYjYyYS00MzQ5LTZiOTUtZWJkYWQ0MjFjNmQ1&comp=block&se=2018-10-15T09%3A50%3A29Z&sig=3W7BpmoZYi8i6OXRlFg3mXhxQMIiEjLtUlR1ZR6jCN8%3D&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15T01%3A50%3A29Z&sv=2017-11-09&timeout=601: net/http: HTTP/1.x transport connection broken: write tcp 192.168.1.48:55653->52.183.104.36:443: wsasend: An existing connection was forcibly closed by the remote host.
2018/09/15 03:59:27 ERR: [P#0-T#169] UPLOADFAILED: d:///Users/klaas/My%20Pictures/2018/JUne/DSCF1731.JPG : 000 : Chunk Upload Failed -> github.com/Azure/azure-storage-azcopy/ste.newAzcopyHTTPClientFactory.func1.1, /go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:95
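The AZCOPY_CONCURRENCY_VALUE knob the reporter ended up tuning is essentially a cap on in-flight chunk uploads. The usual Go shape for such a cap is a counting semaphore; here is a sketch with a hypothetical uploadChunk stand-in (not AzCopy's real transfer code):

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// uploadChunk stands in for one block PUT; hypothetical, always succeeds here.
func uploadChunk(id int) error { return nil }

// uploadAll schedules 'total' chunk uploads but never lets more than
// 'concurrency' run at once -- the role AZCOPY_CONCURRENCY_VALUE plays.
func uploadAll(total, concurrency int) int64 {
	sem := make(chan struct{}, concurrency) // counting semaphore
	var wg sync.WaitGroup
	var done int64
	for i := 0; i < total; i++ {
		wg.Add(1)
		sem <- struct{}{} // blocks while 'concurrency' uploads are in flight
		go func(id int) {
			defer wg.Done()
			defer func() { <-sem }()
			if uploadChunk(id) == nil {
				atomic.AddInt64(&done, 1)
			}
		}(i)
	}
	wg.Wait()
	return done
}

func main() {
	fmt.Println("chunks uploaded:", uploadAll(100, 10))
}
```

A dynamic version of this, shrinking the semaphore when connection resets pile up, is essentially what the reporter is asking AzCopy to do automatically.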
PS C:\Program Files\azcopy_windows_amd64_10.0.2> .\azcopy.exe cp 'https://testac1.file.core.windows.net/master' 'https://testac1.blob.core.windows.net/master' --recursive=true
failed to parse user input due to error: Unable to infer the source 'https://testac1.file.core.windows.net/master' / destination 'https://testac1.blob.core.windows.net/master' combination. Please use the --FromTo switch
I'm assuming that --overwrite=false will not overwrite matching blobs in the destination, but the /XO option of the previous AzCopy was better (for instance, when you are doing multiple copies over time with the same source and destination path).
AzCopy 10.0.3-Preview
./azcopy cp https://premiumblobfuse.blob.core.windows.net/bigdata/1GBs/myfile1 /mnt
Scanning...
Using OAuth token for authentication.
failed to perform copy command due to error: cannot start job due to error: cannot download the enitre container / virtual directory. Please use --recursive flag.
Hi
I first reported this here
MicrosoftDocs/azure-docs#13735 (comment)
And am now moving it to this repo as it is more appropriate.
The readme suggests that there should be links for 3 downloads:
Download the AzCopy executable using one of the following links:
Windows x64, Linux x64, MacOS x64
But there are no links, so I attempted to build on MacOS myself.
Having no experience with Go, I installed Go and added the respective dependencies one by one
until I encountered the following error:
common/credCache_darwin.go:28:2: cannot find package "github.com/jiacfan/keychain" in any of:
/usr/local/opt/go/libexec/src/github.com/jiacfan/keychain (from $GOROOT)
/Users/hrant/go-workspace/src/github.com/jiacfan/keychain (from $GOPATH)
cmd/cancel.go:28:2: cannot find package "github.com/spf13/cobra" in any of:
/usr/local/opt/go/libexec/src/github.com/spf13/cobra (from $GOROOT)
/Users/hrant/go-workspace/src/github.com/spf13/cobra (from $GOPATH)
common/credCache_linux.go:26:2: cannot find package "github.com/jiacfan/keyctl" in any of:
/usr/local/opt/go/libexec/src/github.com/jiacfan/keyctl (from $GOROOT)
/Users/hrant/go-workspace/src/github.com/jiacfan/keyctl (from $GOPATH)
I can't find github.com/jiacfan/keychain to add that dependency.
Please kindly either provide links for builds or advise on the proper build steps.
Many thanks in advance
Try to use the new Go version of azcopy.
Docker, Linux container
For now, I'm building my own Docker image based on this repository, but it's not automated.
Hello,
This is not a bug report. We faced an issue on Ubuntu with v9 (MicrosoftDocs/azure-docs#12554) and I don't believe it will be fixed.
As a result, it's interesting: should we already try v10, or is it at too early a stage to talk about GA dates?
I am brand new to AzCopy, so forgive me, but I have reviewed the readme.md and really tried to figure this one out. Eventually I'd like to test sync,
but I am stuck at the starting gate.
ps> azcopy.exe login
Login succeeded.
ps> azcopy.exe list https://mystorage.blob.core.windows.net/backups
List is using OAuth token for authentication.
cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:db87066e-c01e-0173-67d6-5fb0b6000000
Time:2018-10-09T13:43:58.0479030Z, Details:
AuthenticationErrorDetail: Issuer validation failed. Issuer did not match.
GET https://mystorage.blob.core.windows.net/backups?comp=list&restype=container&timeout=901
Authorization: REDACTED
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [8a047954-e7bf-4d30-77b0-ce3ea113e728]
X-Ms-Version: [2018-03-28]
--------------------------------------------------------------------------------
RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [422]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 13:43:57 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [db87066e-c01e-0173-67d6-5fb0b6000000]
Is this expected behavior/output of the cat command without having grep in the pipeline?
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> cat "C:\Users\artek\.azcopy\ed3dc4a0-8949-9d4f-57dc-f8fccaa7bd4d.log"
2018/10/09 18:00:01 AzcopVersion 10.0.2-Preview
2018/10/09 18:00:01 OS-Environment windows
2018/10/09 18:00:01 OS-Architecture amd64
2018/10/09 18:10:14 AzcopVersion 10.0.2-Preview
2018/10/09 18:10:14 OS-Environment windows
2018/10/09 18:10:14 OS-Architecture amd64
V10.0.2 Preview - Win 7
.\azcopy sync "C:\GCDS_dev" "https://azgcdsdevst1.blob.core.windows.net/gcdstest2?--Key Retracted--" --recursive
When syncing larger amounts of data, >1 GB (local files to blob), sync seems to take a long time even to prep the job (i.e. syncing 1.4 GB of data seems to take more than 30 minutes to even start the job),
while the copy function seems to start almost straight away.
I know the sync command obviously has some file comparison work to do before it can do anything, but it still seems extraordinarily slow to begin.
Any idea what could be causing the delay?
Is it possible to report file conflict check progress to the command line with a flag?
10.0.1-Preview
Mac
./azcopy copy "/Users/mrayermann/Downloads/*" "https://REDACTED.blob.core.windows.net/testcontainer/?REDACTED" --recursive --overwrite=false --include "test.vhd;"
From the output, it looks like the job never finishes. But, if I look at the container via Storage Explorer, I can see the file in my container.
touch test.vhd
I quit AzCopy after a few minutes, on the assumption that the transfer had probably finished.
10.0
Windows
AzCopy sync now supports file system to blob; it should also support blob-to-blob sync.
10.0.2-preview
Windows
Currently, VHDs are always uploaded as page blobs, and everything else is always uploaded as block blobs. It'd be nice if there was a way to override those defaults.
V10.0.2 - Win 7
What are the possible string options for the azcopy sync --log-level flag?
"WARNING" is the default, but it seems that the logs stored in C:\Users\<username>\.azcopy contain more than just warnings?
Some of these logs get quite large: a 1.4 GB copy made a 100 MB log!
So decreasing the verbosity would certainly be useful if azcopy were to run on a regular basis.
I see this on Linux and OSX.
[admin@ip-0A05000B ~]$ azcopy --version
azcopy version 10.0.4-Preview
[admin@ip-0A05000B ~]$ azcopy login
To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code BLC4LFLT9 to authenticate.
Login succeeded.
[admin@ip-0A05000B ~]$ azcopy ls "https://requawestus2.blob.core.windows.net/cyclecloud"
List is using OAuth token for authentication.
cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====
Description=This request is not authorized to perform this operation using this permission.
RequestId:62af0922-301e-0085-2004-7b7e29000000
Time:2018-11-13T03:54:05.3116844Z, Details:
Code: AuthorizationPermissionMismatch
GET https://requawestus2.blob.core.windows.net/cyclecloud?comp=list&restype=container&timeout=901
Authorization: REDACTED
User-Agent: [AzCopy/10.0.4-Preview Azure-Storage/0.3 (go1.10.3; linux)]
X-Ms-Client-Request-Id: [99b8342d-573d-47a7-6e15-34888aa24380]
X-Ms-Version: [2018-03-28]
--------------------------------------------------------------------------------
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
Content-Length: [279]
Content-Type: [application/xml]
Date: [Tue, 13 Nov 2018 03:54:05 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthorizationPermissionMismatch]
X-Ms-Request-Id: [62af0922-301e-0085-2004-7b7e29000000]
X-Ms-Version: [2018-03-28]
This is a complete reproduction after installing on OSX and Linux. There are no other options on the ls command, such as a SAS key for auth.
10.0.1
Linux
go get github.com/Azure/azure-storage-azcopy
go get github.com/Azure/azure-storage-azcopy :(
# github.com/Azure/azure-storage-azcopy/ste
go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:336:36: not enough arguments in call to blockBlobUrl.StageBlock
have (context.Context, string, io.ReadSeeker, azblob.LeaseAccessConditions)
want (context.Context, string, io.ReadSeeker, azblob.LeaseAccessConditions, []byte)
go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:582:37: not enough arguments in call to pageBlobUrl.UploadPages
have (context.Context, int64, io.ReadSeeker, azblob.BlobAccessConditions)
want (context.Context, int64, io.ReadSeeker, azblob.PageBlobAccessConditions, []byte)
go get github.com/Azure/azure-storage-azcopy :(
10.0.0 worked fine for me
10.0.1
Linux
./azcopy_linux_amd64 cp 'https://<fileshare:.file.core.windows.net//?<redacted_sas>&sharesnapshot=' '<different_azure_file_share>' --recursive=true
000 : File Creation Error operation not supported
If I change the path from the CIFS share to a local directory, the command works (i.e. saving onto the VM itself). But if I make the path something on a new, empty Windows CIFS file share (i.e. restoring my snapshot onto a fresh share), I get errors.
No. But I believe it to be related to allocating memory for the file here:
azure-storage-azcopy/common/mmf_linux.go
Line 99 in 500f2ee
Is there a way to set properties on uploaded files, such as HTTP x-ms-blob-cache-control?
AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)
Windows 10
Cannot SSH to a CentOS 7.5 VM after installing azcopy 7.3.0-netcore.
Found that the installation changes the permissions of the user home directory from 700 to 775.
Create a VM using the Azure gallery image CentOS 7.5:
"imageReference": {
"publisher": "OpenLogic",
"offer": "CentOS",
"sku": "7.5",
"version": "latest"
SSH to the VM.
Before installing azcopy, the permissions of the user home directory are 700:
[holgo@azcopy home]$ date; ls -al
Wed Sep 26 08:12:12 UTC 2018
total 0
drwxr-xr-x. 3 root root 19 Sep 26 08:06 .
dr-xr-xr-x. 17 root root 224 Aug 15 20:04 ..
drwx------. 5 holgo holgo 124 Sep 26 08:10 holgo
Install azcopy
wget -O azcopy.tar.gz https://aka.ms/downloadazcopylinux64
tar -xf azcopy.tar.gz
sudo ./install.sh
sudo yum install -y libunwind icu
Now the issue can be reproduced: any further SSH attempt will fail with an error saying:
Permission denied (publickey,gssapi-keyex,gssapi-with-mic)
From /var/log/secure, found an error pointing to bad ownership or modes for directory /home/holgo:
[holgo@azcopy home]$ sudo tail -f /var/log/secure
...
Sep 26 08:16:05 localhost sshd[1831]: Authentication refused: bad ownership or modes for directory /home/holgo
....
[holgo@azcopy home]$ date ; ls -al
Wed Sep 26 08:14:52 UTC 2018
total 0
drwxr-xr-x. 3 root root 19 Sep 26 08:06 .
dr-xr-xr-x. 17 root root 224 Aug 15 20:04 ..
drwxrwxr-x. 6 holgo holgo 177 Jul 19 04:56 holgo
[holgo@azcopy home]$
"sudo chmod -R 700 holgo/" will mitigate the issue
Allows the user to see what transfers would happen without actually performing the transfers.
10.0.3-preview
Windows
.\azcopy.exe copy "..\..\1234567891234567981234567981321321654684354645313216843513213223111141445 45474567271271727231731324175417234173241732417324173aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaabbb" "https://REDACTED.blob.core.windows.net/uploadtohere?REDACTED"
On upload, AzCopy failed with the error: failed to perform copy command due to error: cannot start job due to error: cannot find source to upload.
For download, it seems like AzCopy can handle slightly longer paths, but there is still a point where it fails.
Create a file whose full path is at least 250-ish characters long, and try to transfer it with AzCopy.
No.
As the README.md exists today, I don't really understand the following:
cp vs sync. How is the sync achieved? Is it two-way? How is integrity checked? Are timestamps used, or only missing files?
Can I sync between two blob stores?
As an aside, what do I do if I do not want "job" behavior? I have things arranged such that uploads/syncs should be idempotent and hopefully shouldn't need to keep state around. Plus, I'm exclusively using this in CI scenarios where state won't be persisted between runs anyway.
C:\Users\seguler\Desktop\azcopy_windows_amd64_10.0.0\azcopy_windows_amd64_10.0.0>.\azcopy_windows_amd64.exe jobs resume 8f925120-9bb6-b540-707e-4be86899b4ff --destination-sas="REDACTED"
Job 8f925120-9bb6-b540-707e-4be86899b4ff has started
8f925120-9bb6-b540-707e-4be86899b4ff.log file created in C:\Users\seguler/.azcopy
1350 Done, 0 Failed, 0 Skipped, 58065 Pending, 59415 Total
So sync works correctly with a SAS token, but whatever is in the container prior to the first sync gets deleted/overridden. Can we append instead?
BTW: during the 1st run, sync detected 5 files in the folder although there were just 2 (which was correctly reflected in the container after the sync). I then deleted 1 file and ran the sync again, and that time it correctly detected 1 file and correctly deleted 1 file from the container.
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" "https://akzrsdemo.blob.core.windows.net/aktest?sv=RSv%2BW5Y%3D" --recursive=true
Job 08599522-c810-624a-6d48-faf60bd9ba7d has started
08599522-c810-624a-6d48-faf60bd9ba7d.log file created in C:\Users\artek/.azcopy
0 Done, 0 Failed, 5 Pending, 0 Skipped, 5 Total, 2-sec Throughput (Mb/s): 0
Job 08599522-c810-624a-6d48-faf60bd9ba7d summary
Elapsed Time (Minutes): 0.0334
Total Number Of Transfers: 5
Number of Transfers Completed: 5
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
TotalBytesTransferred: 15612824
Final Job Status: Completed
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" "https://akzrsdemo.blob.core.windows.net/aktest?sv=Y%3D" --recursive=true
Job 1f756979-1816-dd4f-4af4-db129ae90cc2 has started
1f756979-1816-dd4f-4af4-db129ae90cc2.log file created in C:\Users\artek/.azcopy
0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 0
Job 1f756979-1816-dd4f-4af4-db129ae90cc2 summary
Elapsed Time (Minutes): 0.0333
Total Number Of Transfers: 1
Number of Transfers Completed: 1
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
TotalBytesTransferred: 14944
Final Job Status: Completed
Running Win 7. Does the new AzCopy sync support (local files) to (File storage)? Or does it only support (Blob storage)?
Syncing from local files (C:\GQ) to an Azure File Storage file share.
So something like below should work?
azcopy sync C:\GQ https://<azure file storage path> --recursive
10.0.0.3
Linux
azcopy cp "https://CENSORED.blob.core.windows.net/CONTAINER1?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-10-25T21:22:33Z&st=2018-10-22T13:22:33Z&spr=https&sig=CENSORED" "https://CENSORED.blob.core.windows.net/CONTAINER2?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-10-25T21:22:33Z&st=2018-10-22T13:22:33Z&spr=https&sig=CENSORED" --recursive=true
The blob metadata for the original container had 2 keys:
encryptedvalue
filesize
After the transfer, the 2 keys were changed to:
Encryptedvalues
Filesize
Run the command to transfer the files as above, then compare the metadata fields. In the Azure portal the metadata fields look the same, but when you check in Azure Storage Explorer you can see they are different.
When you copy without the shared access signature and instead use the keys themselves, it doesn't occur.
AzCopy 8.1.0-netcore
Windows
The "Microsoft Azure Storage AzCopy" shortcut installed
When reviewing the updated path in the new "Microsoft Azure Storage AzCopy" command prompt window, the path is incorrect. It has an extra AzCopy at the end.
C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy
C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy>cd \
C:\>azcopy
'azcopy' is not recognized as an internal or external command,
operable program or batch file.
C:\>
Fix the 'set PATH' statement in C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\LaunchCmd.cmd
10.0.2-preview
Windows
I ran
copy "C:\Users\marayerm\Desktop\*" "https://redacted.blob.core.windows.net/one/?REDACTED" --overwrite=false --follow-symlinks --recursive --fromTo=LocalBlob --include "New Text Document.txt;" --output=json
when there is no file at C:\Users\marayerm\Desktop\New Text Document.txt
Although --output=json was specified, the output I received was: failed to perform copy command due to error: cannot start job due to error: nothing can be uploaded, please use --recursive to upload directories. This is not JSON. I have seen other situations where this happens, but this is the easiest to reproduce. Basically, if --output=json is used, then all output should be formatted as JSON objects.
Try to do a copy where the source does not exist.
No.
We are migrating to Azure, but we have a lot of files. I have a copy script for every file extension we want to copy to blob storage. We are using AzCopy 8. For example, PNG files:
For regular png files we get this result:
Finished 1252 of total 1252 file(s).
[2018/08/24 00:36:15] Transfer summary:
-----------------
Total files transferred: 1252
Transfer successfully: 1252
Transfer skipped: 0
Transfer failed: 0
Elapsed time: 00.00:00:26
This command, however,
.\AzCopy.exe /@:"C:\Path\inputParams.txt" /Pattern:"*.PNG" /V:"C:\LogPath\png-cap.log"
inputParams.txt is this:
/Source:"D:\RootPath"
/Dest:https://blip.blob.core.windows.net/root/
/DestKey:{omitted}==
/SetContentType
/NC:4
/S
/XO
/Y
prints this:
Finished 0 of total 0 file(s).
[2018/08/24 00:36:32] Transfer summary:
-----------------
Total files transferred: 0
Transfer successfully: 0
Transfer skipped: 0
Transfer failed: 0
Elapsed time: 00.00:00:17
The output of C:\LogPath\png-cap.log is:
[2018/08/24 00:36:15.362+02:00] >>>>>>>>>>>>>>>>
[2018/08/24 00:36:15.378+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:15.378+02:00][VERBOSE] 8.0.0 : AzCopy /@:C:\Path\inputParams.txt /Pattern:*.PNG /V:C:\LogPath\png-cap.log
[2018/08/24 00:36:15.471+02:00][VERBOSE] Attempt to parse address 'D:\RootPath' to a directory as a candidate location succeeded.
[2018/08/24 00:36:15.471+02:00][VERBOSE] Source is interpreted as a Local directory: D:\RootPath\.
[2018/08/24 00:36:15.502+02:00][VERBOSE] Attempt to parse address 'https://blip.blob.core.windows.net/root/' to a directory as a candidate location succeeded.
[2018/08/24 00:36:15.518+02:00][VERBOSE] Attempt to parse address 'https://blip.blob.core.windows.net/root/' to a single file location failed: Invalid location 'https://blip.blob.core.windows.net/root/', cannot get valid account, container and blob name.
[2018/08/24 00:36:15.518+02:00][VERBOSE] Destination is interpreted as a Cloud blob directory: https://blip.blob.core.windows.net/root/.
[2018/08/24 00:36:20.370+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:25.377+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:30.369+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:32.647+02:00] Transfer summary:
-----------------
Total files transferred: 0
Transfer successfully: 0
Transfer skipped: 0
Transfer failed: 0
Elapsed time: 00.00:00:17
Is there a way to have files with capital-letter extensions uploaded?
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/
Using OAuth token for authentication.
RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:39:52 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [418ad35a-c01e-0033-54f7-5f6b64000000]
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/newcontainer
Using OAuth token for authentication.
RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:40:52 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [968a0fa4-101e-009e-7ff7-5f7219000000]
PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/aktest
Using OAuth token for authentication.
RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:41:15 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [534b0f92-701e-0020-34f7-5fa640000000]
./azcopy cp "___/backup.pst" "https://___.blob.core.windows.net/ingestiondata?sv=___&sr=___&si=___&sig=___&se=___"
Note that the instructions for Windows specify the following command:
AzCopy.exe /Source:<Location of PST files> /Dest:<SAS URL> /V:<Log file location> /Y
with the following comment regarding /Y:
This required switch allows the use of write-only SAS tokens when you upload the PST files to the Azure storage location. The SAS URL you obtained in step 1 (and specified in /Dest: parameter) is a write-only SAS URL, which is why you must include this switch. Note that a write-only SAS URL will not prevent you from using the Azure Storage Explorer to view a list of the PST files uploaded to the Azure storage location.
I am unsure how to add the equivalent flag here, as I have found no pointers in the documentation or help pages.
Job a0eef242-719d-c04f-4cd7-8765dcf1b232 has started
a0eef242-719d-c04f-4cd7-8765dcf1b232.log file created in /___/.azcopy
0 Done, 1 Failed, 0 Pending, 0 Skipped, 1 Total
In the log file I have the following header:
2018/10/12 03:31:09 AzcopVersion 10.0.2-Preview
2018/10/12 03:31:09 OS-Environment darwin
2018/10/12 03:31:09 OS-Architecture amd64
2018/10/12 03:31:09 Job-Command cp ___/backup.pst https://___.blob.core.windows.net/ingestiondata?se=___&si=___&sig=___&sr=___&sv=___
2018/10/12 03:31:09 JobID=dfb0ba05-f63c-5748-62e2-8b5f5aa5745a, credential type: Anonymous
2018/10/12 03:31:09 scheduling JobID=dfb0ba05-f63c-5748-62e2-8b5f5aa5745a, Part#=0, Transfer#=0, priority=0
2018/10/12 03:31:09 INFO: [P#0-T#0] has worker 213 which is processing TRANSFER
2018/10/12 03:31:09 INFO: [P#0-T#0] Starting transfer: Source "___/backup.pst" Destination "https://___.blob.core.windows.net/ingestiondata/backup.pst?se=___&si=___&sig=___&sr=___&sv=___". Specified chunk size 8388608
Followed by a BUNCH of these that all have the exact same timestamp and "try" number (Try=1):
2018/10/12 03:31:09 ==> OUTGOING REQUEST (Try=1)
PUT https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901
Content-Length: [8388608]
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; darwin)]
X-Ms-Client-Request-Id: [7cb4a66d-c7d6-4b5e-7d91-6557c2855b95]
X-Ms-Version: [2018-03-28]
And then a bunch of these (whitespace may be wonky, as I have trouble copying from vim):
2018/10/12 03:35:27 ==> REQUEST/RESPONSE (Try=1/4m17.34582974s[SLOW >3s], OpTime=4m17.345889972s) -- REQUEST ERROR
PUT https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901
Content-Length: [8388608]
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; darwin)]
X-Ms-Client-Request-Id: [1db006dc-c187-4242-6a08-d290227a7d60]
X-Ms-Version: [2018-03-28]
--------------------------------------------------------------------------------
ERROR:
-> github.com/Azure/azure-storage-azcopy/ste.newAzcopyHTTPClientFactory.func1.1, /go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:95
HTTP request failed
Put https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901: read tcp 10.254.246.200:52028->52.239.148.74:443: read: connection reset by peer
goroutine 264 [running]:
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.stack(0xc42014e310, 0xc420a52100, 0x0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_request_log.go:146 +0xa7
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewRequestLogPolicyFactory.func1.1(0x461c020, 0xc4209828a0, 0xc420996800, 0x45b502b, 0xc, 0x45b4175, 0xa)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_request_log.go:96 +0x665
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc42096c8c0, 0x461c020, 0xc4209828a0, 0xc420996800, 0xa, 0x452fa80, 0xc420044800, 0xc42072f8d0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/ste.NewVersionPolicyFactory.func1.1(0x461c020, 0xc4209828a0, 0xc420996800, 0x0, 0xc4209b7118, 0x402bcd4, 0x45d5b90)
/go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:59 +0xe8
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4209053a0, 0x461c020, 0xc4209828a0, 0xc420996800, 0xc4209b7128, 0xc4209b71e8, 0x4119083, 0xc4209828b0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.responderPolicy.Do(0x46186a0, 0xc4209053a0, 0xc420744d20, 0x461c020, 0xc4209828a0, 0xc420996800, 0xc4209828b0, 0x461bfa0, 0xc42072f8c0, 0x0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_responder_policy.go:33 +0x53
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.anonymousCredentialPolicy.Do(0x46186e0, 0xc4209053c0, 0x461c020, 0xc4209828a0, 0xc420996800, 0x4913da0, 0x461c020, 0xc4209828a0, 0xc420953660)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_credential_anonymous.go:54 +0x4f
github.com/Azure/azure-storage-azcopy/ste.NewBlobXferRetryPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x45b9940, 0x16, 0xc4208d39b0, 0x24)
/go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-retrypolicy.go:362 +0x6c1
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc42096c910, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x24, 0x41f7870, 0x454f900, 0xc4208ef6b0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewUniqueRequestIDPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x45b432d, 0xa, 0xc420768180, 0x3b)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_unique_request_id.go:19 +0x9c
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4209053e0, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x3b, 0xc42000e450, 0xc4208ef710, 0xc4209b7608)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewTelemetryPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x1, 0x0, 0x1, 0xc42000e450)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_telemetry.go:34 +0x9e
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4208ef710, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0xc4208ef710, 0x45b24df, 0xc4209b7678, 0x4013318)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.(*pipeline).Do(0xc42072f880, 0x461bfa0, 0xc42072f8c0, 0x4618700, 0xc420744d20, 0xc420996700, 0x2d, 0xc420792035, 0x19, 0x0)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:128 +0x81
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.blockBlobClient.StageBlock(0xc420792000, 0x5, 0x0, 0x0, 0x0, 0xc420792008, 0x2d, 0xc420792035, 0x19, 0x0, ...)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_block_blob.go:262 +0x4a9
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.BlockBlobURL.StageBlock(0xc420792000, 0x5, 0x0, 0x0, 0x0, 0xc420792008, 0x2d, 0xc420792035, 0x19, 0x0, ...)
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/url_block_blob.go:74 +0x131
github.com/Azure/azure-storage-azcopy/ste.(*blockBlobUpload).blockBlobUploadFunc.func1(0xdf)
/go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:338 +0x5a1
github.com/Azure/azure-storage-azcopy/ste.(*jobsAdmin).transferAndChunkProcessor(0xc4201340c0, 0xdf)
/go/src/github.com/Azure/azure-storage-azcopy/ste/JobsAdmin.go:216 +0xa6
created by github.com/Azure/azure-storage-azcopy/ste.initJobsAdmin
/go/src/github.com/Azure/azure-storage-azcopy/ste/JobsAdmin.go:160 +0x501
... but yeah, it's a >12MB log file. Is it safe to share all its contents? I can't tell what's sensitive and what's not haha.
azcopyV10-win.exe ls https://myadlsgen2.dfs.core.windows.net/test
invalid path passed for listing. given source is of type 5 while expect is container / container path
10.0.2
Linux 16.04
./azcopy cp "https://cjwstorage.blob.core.windows.net/dltest/" /mnt --recursive
The log said 'No such file or directory'. I expected it to say 'no permission to write to destination' or something similar.
2018/10/03 20:43:26 ERR: [P#0-T#2096] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/2631.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/2631.bin: no such file or directory
Dst: /mnt/dltest/2631.bin
2018/10/03 20:43:26 ERR: [P#0-T#2100] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/8732.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/8732.bin: no such file or directory
Dst: /mnt/dltest/8732.bin
2018/10/03 20:43:26 ERR: [P#0-T#2100] /mnt/dltest/8732.bin: 0: Delete File Error -remove /mnt/dltest/8732.bin: no such file or directory
2018/10/03 20:43:26 JobID=8ee131d6-d92a-494e-4faa-9230f365db52, Part#=0, TransfersDone=2053 of 10000
2018/10/03 20:43:26 INFO: [P#0-T#2114] has worker 104 which is processing TRANSFER
2018/10/03 20:43:26 ERR: [P#0-T#2114] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/8403.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/8403.bin: no such file or directory
Dst: /mnt/dltest/8403.bin
I didn't have access to the /mnt folder. After a chown, the command worked fine.
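The failure mode above (an unwritable destination surfacing as per-file "File Creation Error" messages) can be caught before starting a transfer. A minimal pre-flight sketch, using a scratch directory in place of /mnt:

```shell
# Pre-flight check: make sure the destination directory exists and is
# writable before kicking off a large download. /tmp/azcopy-dest is an
# illustrative stand-in for whatever destination you would pass to azcopy.
dest=/tmp/azcopy-dest
mkdir -p "$dest"
if [ -w "$dest" ]; then
    echo "destination writable: $dest"
else
    echo "destination not writable: $dest (try: sudo chown \$(whoami) $dest)"
fi
```

Running a check like this up front would have reported the /mnt ownership problem once, instead of once per failed transfer.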
Using a stored access policy when generating a SAS token results in a 403 error, while a SAS token built without a policy works.
AzCopy 10.0.4-Preview on Ubuntu 14.04
Command:
azcopy sync 'https://production8858.blob.core.windows.net/wad-iis-logfiles?se=2030-01-01&sp=rl&sv=2018-03-28&sr=c&sig=REDACTED' /backup --recursive
SAS was generated using:
az storage container generate-sas \
--account-name production8858 \
--account-key 'redacted' \
--name wad-iis-logfiles \
--permissions rl \
--expiry 2030-01-01
The "se" value is definitely included in the SAS, but azcopy fails; note that the outgoing GET request in the log no longer carries the se parameter.
Result:
Failed with error error starting the sync between source https://production8858.blob.core.windows.net/wad-iis-logfiles and destination /backup.
Failed with error cannot list blobs for download.
Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:c18abdb2-801e-0049-2f14-7c2fba000000
Time:2018-11-14T12:19:11.8402297Z, Details:
AuthenticationErrorDetail: se is mandatory. Cannot be empty
Code: AuthenticationFailed
GET https://production8858.blob.core.windows.net/wad-iis-logfiles?comp=list&restype=container&sig=REDACTED&sp=rl&sr=c&sv=2018-03-28&timeout=901
User-Agent: [AzCopy/10.0.4-Preview Azure-Storage/0.3 (go1.10.3; linux)]
X-Ms-Client-Request-Id: [41df3c4f-57ec-4ac7-7e35-7af1f4d305f0]
X-Ms-Version: [2018-03-28]
--------------------------------------------------------------------------------
RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Access-Control-Allow-Origin: [*]
Access-Control-Expose-Headers: [Access-Control-Allow-Origin]
Content-Length: [407]
Content-Type: [application/xml]
Date: [Wed, 14 Nov 2018 12:19:11 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [c18abdb2-801e-0049-2f14-7c2fba000000]
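Since the server's complaint is "se is mandatory" and the outgoing GET above has lost the se parameter that the original SAS contained, a quick diagnostic is to list which query parameters a SAS URL actually carries before handing it to azcopy. A sketch, using a dummy URL of the same shape as the one in this report:

```shell
# Print the query-parameter names present in a SAS URL, one per line,
# so a missing required field (se, sig, sp, sv) stands out immediately.
sas_url='https://production8858.blob.core.windows.net/wad-iis-logfiles?se=2030-01-01&sp=rl&sv=2018-03-28&sr=c&sig=REDACTED'
echo "${sas_url#*\?}" | tr '&' '\n' | cut -d= -f1
# prints: se sp sv sr sig (one name per line)
```

Comparing this list against the parameters shown in azcopy's logged request makes it obvious when the tool, rather than the SAS, dropped a field.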
Currently, throughput is the only progress indicator displayed when downloading a file from a blob.
Adding the option --output=json provides a bit more information, but it is not readable from an end-user perspective.
We would also need the size of the file being downloaded, the time remaining, the amount of data remaining, etc.