
gcs-resumable-upload's People

Contributors

alexander-fenster, avaksman, bcoe, callmehiphop, danielbankhead, danielduhh, ddelgrosso1, dpebot, fhinkel, gcf-merge-on-green[bot], gcf-owl-bot[bot], gcochard, google-cloud-policy-bot[bot], jiren, jkwlui, justinbeckwith, laljikanjareeya, ofrobots, praveenqlogic01, release-please[bot], renovate-bot, renovate[bot], shaffeeullah, sofisl, stephenplusplus, summer-ji-eng, surferjeffatgoogle, yoshi-automation, zbjornson


gcs-resumable-upload's Issues

Streams, pumps, corks, OH MY!

Ohai! 👋 I'm learning a lot about streams while using your libraries. I have a few questions about why things work the way they do, and figured this was as good a medium as any to ask :)

Duplex Streams

I was looking at pumpify, and trying to figure out its place in this library. I see that this code is really a bit of magic:

this.setPipeline(bufferStream, offsetStream, requestStream, delayStream);

As far as I understand, this means the upload() method itself returns an object that is a Duplex stream and lets data flow through these pipes. It got me wondering though....

Why is upload() a duplex stream? As far as I can tell - this is only something I'd expect to write to. I wouldn't expect to read from the stream. This is causing some heartburn as I try to move from request to axios, and I just want to better understand what's the what :)

Delay Stream

Very much related to the previous question, I noticed that the requestStream is piping into the delayStream. The delayStream seems to be a simple transform that just does this:

// wait for "complete" from request before letting the stream finish
delayStream.on('prefinish', () => {
  this.cork();
});

From reading the docs, I now have a reasonable understanding of what cork and uncork do. My question is.... why?

Why do we need to explicitly cork and uncork the delayStream? I would expect the requestStream to handle sending the finish or end events, and standard pipe() semantics to do what one would expect. That having been said - a streams expert I ain't :)
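
For anyone else reading along, here's a minimal, self-contained illustration of what cork()/uncork() do on an ordinary Writable (not this library's code, just the primitive the delayStream trick leans on):

const { Writable } = require('stream');

const sink = new Writable({
  write(chunk, encoding, callback) {
    console.log('flushed:', chunk.toString());
    callback();
  },
});

sink.cork();                        // buffer writes in memory instead of flushing them
sink.write('a');                    // held back
sink.write('b');                    // held back
setImmediate(() => sink.uncork());  // only now do both chunks reach write()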

I'm really just trying to piece this all together. Any thoughts would be appreciated :) Thanks!

Allow passing in query parameters when creating the upload URI

The GCS API supports a number of query parameters when creating a resumable upload URI (https://cloud.google.com/storage/docs/json_api/v1/objects/insert#request). The most valuable one for my use case is the ability to set one of the predefined ACL strings -- I have not been able to find a way to pass this parameter as part of the metadata in the request body.

It would be great if this module (and by extension, gcloud) supported a way to pass these in. My suggestion would either be to add a new queryParameters object to the config, or to piggyback onto the existing cfg.metadata object -- if cfg.metadata.queryParameters exists, extract the parameters from it.

The first has the advantage of being cleaner, while the second has the advantage of working out-of-the-box with the existing gcloud code, since the metadata object is passed to this module as-is.
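
To make the suggestion concrete, here is roughly what the two shapes would look like from a caller's perspective. Neither option exists in the module today; this is purely a hypothetical sketch of the proposal above, using predefinedAcl from the linked API docs as the example parameter:

var upload = require('gcs-resumable-upload');

// Option 1: a dedicated queryParameters object in the config
upload({
  bucket: 'my-bucket',
  file: 'photo.jpg',
  queryParameters: { predefinedAcl: 'publicRead' }
});

// Option 2: piggyback on cfg.metadata, so existing gcloud code passes it through as-is
upload({
  bucket: 'my-bucket',
  file: 'photo.jpg',
  metadata: { queryParameters: { predefinedAcl: 'publicRead' } }
});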

If you consider this a valuable addition to the module, I'd be happy to submit a pull request for whichever method makes more sense to you.

Chunks get lost when resuming upload with different data.

I'm pretty unfamiliar with this module, so a second pair of eyes would be good here!

It looks like we compare chunks to see if the user decided to resume the upload with different data
https://github.com/googleapis/gcs-resumable-upload/blob/master/src/index.ts#L347

If that happens, we put the chunk back in the bufferStream, buuut, once we restart we just toss that stream away
https://github.com/googleapis/gcs-resumable-upload/blob/master/src/index.ts#L294

I believe this would result in lost chunks, maybe?

Add support for resuming interrupted uploads

The JSON API supports resuming an interrupted upload. It would be great to extend this library to support uploading file chunks that aren't the full file size.

The Content-Range header can be constructed to allow smaller file chunks:

  // Default to 0, the start byte, if offset isn't provided
  cfg.offset = parseInt(cfg.offset, 10);
  this.offset = isNaN(cfg.offset) ? 0 : cfg.offset;

  // Default to * if chunkSize isn't provided
  cfg.chunkSize = parseInt(cfg.chunkSize, 10);
  this.chunkSize = isNaN(cfg.chunkSize) ? '*' : cfg.chunkSize;

  // Default to * if the entire file size isn't provided
  var contentLength = 'contentLength' in cfg.metadata ? parseInt(cfg.metadata.contentLength, 10) : NaN;
  this.contentLength = isNaN(contentLength) ? '*' : contentLength;

  var reqOpts = {
    method: 'PUT',
    uri: this.uri,
    headers: {
      'Content-Range': 'bytes ' + this.offset + '-' + this.chunkSize + '/' + this.contentLength
    }
  }
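
As a worked example (hypothetical numbers, and reading chunkSize above as the index of the chunk's last byte rather than its length): uploading the first 262,144 bytes of a 1,000,000-byte file from offset 0 would send Content-Range: bytes 0-262143/1000000, and if the total size isn't known yet the header would end in /* instead.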

I'm happy to take a stab at it if you're cool with this strategy, and if this is the best route to get support into gcloud-node.

Stuck on `Error: Gone`

I'm deploying an Ember application to a GCS bucket using the ember-cli-deploy-gcloud-storage and ember-cli-deploy-gcs-index addons.

Everything was working correctly when I used a GC service account with Project Owner permissions, but this seemed like overkill, so I experimented with different service accounts that had lesser permissions. Somewhere along the line, deploying the Ember app went from returning Error: Unauthorized to Error: Gone. Now, no matter what I try, the bucket seems to be stuck in the state of trying to resume a failed upload. Below is the console output from running ember deploy.

I've tried:

  1. Uploading the files I thought were missing directly through the GC Storage UI, the goal being to trick the bucket into thinking everything was uploaded.
  2. Deleting everything from the bucket.

Neither of these worked, so I've resorted to destroying the bucket completely and starting fresh.

My question: Is there a way to clear the partially uploaded file cache on a bucket (or clear wherever it keeps track of uploads) without destroying the entire bucket?

- Error: Gone
- Error: Gone
    at Request._callback (/Users/uname/Code/union/portal/node_modules/ember-cli-deploy-gcloud-storage/node_modules/google-cloud/node_modules/@google-cloud/storage/node_modules/gcs-resumable-upload/index.js:274:25)
    at Request.self.callback (/Users/uname/Code/union/portal/node_modules/request/request.js:187:22)
    at emitTwo (events.js:100:13)
    at Request.emit (events.js:185:7)
    at Request.<anonymous> (/Users/uname/Code/union/portal/node_modules/request/request.js:1044:10)
    at emitOne (events.js:90:13)
    at Request.emit (events.js:182:7)
    at IncomingMessage.<anonymous> (/Users/uname/Code/union/portal/node_modules/request/request.js:965:12)
    at emitNone (events.js:85:20)
    at IncomingMessage.emit (events.js:179:7)
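
In case it helps anyone landing here: the partial-upload state isn't kept on the bucket at all. As the configstore issue further down this page notes, it lives in a local JSON file, so a sketch for resetting it (assuming the default configstore location mentioned there) looks like:

var os = require('os');
var path = require('path');
var fs = require('fs');

// Default location mentioned in the "configstore caused uploads to fail" issue below
var cache = path.join(os.homedir(), '.config', 'configstore', 'gcs-resumable-upload.json');

if (fs.existsSync(cache)) {
  fs.unlinkSync(cache); // the next upload starts a brand-new resumable session
}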

generation == 0 not respected

The fix is easy, but I'm not familiar enough with the code base or JavaScript to be able to easily add a test case.

In Upload, change

  if (this.generation) {
    reqOpts.qs.ifGenerationMatch = this.generation
  }

to

  if (is.defined(this.generation)) {
    reqOpts.qs.ifGenerationMatch = this.generation
  }

Cannot abort requests in Node < 8

While investigating googleapis/nodejs-storage#619, Node 6, 8, and 10 have different reactions to the same Storage system test:

Node 10: Good code, guys!
Node 8: Upload Client Broken Connection
Node 6: Cancellation of streamed requests with AbortSignal is not supported in node < 8

That last error is thrown by fetch: https://github.com/bitinn/node-fetch/blob/e996bdab73baf996cf2dbf25643c8fe2698c3249/src/request.js#L205

I'm not sure of a way around this, exactly. We're cancelling an active HTTP request in the event of an error. Fetch is the boss on how requests made with it can be cancelled, but they're shutting out Node 6, evidently. :\
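
For context, this is the generic cancellation mechanism in question: an AbortController wired into a node-fetch request with a streamed body (placeholder URL and file, not this library's internal code). Aborting this kind of request is exactly what node-fetch refuses to support for streamed bodies on Node < 8.

var fs = require('fs');
var fetch = require('node-fetch');                 // v2.x
var AbortController = require('abort-controller'); // polyfill package

var controller = new AbortController();

fetch('https://storage.googleapis.com/upload/some-session-uri', {  // placeholder URL
  method: 'PUT',
  body: fs.createReadStream('./chunk.bin'),        // streamed request body
  signal: controller.signal
}).catch(function (err) {
  console.error('request ended:', err.name);
});

// Cancelling the in-flight streamed request; node-fetch rejects this on Node < 8.
controller.abort();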

Uploads failing

Just today, I started receiving this error while attempting to upload:

this.configStore.del is not a function
at Upload.deleteConfig

It's never happened before. NO idea what it's all about.

Progress event

Is it possible to hook up a progress event somewhere in the onChunk method so we can have something similar to what Evaporate does for AWS?
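
For what it's worth, later snippets on this page attach a 'progress' listener to the stream returned by upload(), so assuming the installed version emits that event, the hookup could look like:

var upload = require('gcs-resumable-upload');
var fs = require('fs');

fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'my-bucket', file: 'titanic.mov' }))
  .on('progress', function (progress) {
    console.log('progress:', progress); // the event's shape depends on the installed version
  })
  .on('finish', function () {
    console.log('done');
  });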

Time for a release!

So I've been monkeying around in here quite a bit. I'd love to do a few smoke tests, and then get a release out there before I start doing even worse things.

Is there a set of libraries/tests I should check out before we cut a new release?

@stephenplusplus @ofrobots

Allow setting an origin

@janesconference - any interest in adding support for any options we're currently missing?

Add support for non authenticated users using signed URL

I want to be able to do resumable uploads to Google Cloud Storage in a node.js client application using this package, in conjunction with signed urls (since the client app is invoked by unauthenticated users).

My server generates a signed url and then sends a POST to the signed url with header { 'x-goog-resumable': 'start' } and an empty body, and receives a response with a location header that looks something like the following:

https://storage.googleapis.com/<bucket_name>/<file_path>?GoogleAccessId=<service_account>&Expires=<expiry_time>&Signature=<signature>&upload_id=<upload_id>

If I return the above location header to my client, I would like for the client to be able to perform a resumable upload using this package, without further authentication.

I was not able to figure out how to do so using this package. If it is possible, it would be great if an example could be added in the samples.
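
One thing that may help while waiting for a proper sample: once the client has the session URI from the location header, the bytes themselves can be sent without any further authentication. A generic single-shot sketch using plain Node HTTPS rather than this package (placeholder file name; the session URI is whatever your server returned):

var fs = require('fs');
var https = require('https');

function uploadToSession(sessionUri, filePath, done) {
  var size = fs.statSync(filePath).size;

  // Node 10.9+ https.request(url, options, callback) signature
  var req = https.request(sessionUri, {
    method: 'PUT',
    headers: { 'Content-Length': size }
  }, function (res) {
    done(null, res.statusCode); // 200/201 on success
  });

  req.on('error', done);
  fs.createReadStream(filePath).pipe(req);
}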

Thanks!

Synthesis failed for gcs-resumable-upload

Hello! Autosynth couldn't regenerate gcs-resumable-upload. 💔

Here's the output from running synth.py:

Cloning into 'working_repo'...
Switched to branch 'autosynth'
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 256, in <module>
    main()
  File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 196, in main
    last_synth_commit_hash = get_last_metadata_commit(args.metadata_path)
  File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 149, in get_last_metadata_commit
    text=True,
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 403, in run
    with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'text'

Google internal developers can see the full log here.

Error: Invalid request. The number of bytes uploaded is required to be equal or greater than 262144...

Error: Invalid request. The number of bytes uploaded is required to be equal or greater than 262144, except for the final request (it's recommended to be the exact multiple of 262144). The received request contained 16384 bytes, which does not meet this requirement.

(See here for the error trace: http://pastebin.com/PhukYCUq)

I'm trying to resumably upload a couple of videos to my Google Cloud Storage bucket. So far I've tested the resumable upload code from this package with three videos. The first video is ~16 MB, the second ~15 MB, and the final one ~150 MB. The error above occurs for the first video, and a similar error occurs for the second video, except that instead of saying the request is 16384 bytes, the request is apparently 81920 bytes. The error does not occur for the third video file, presumably because it passes the threshold.

I'm not sure why this error is occurring, and while I did see the solution posted here: googleapis/google-cloud-node#1044, I can't find the files that were deleted to fix the code functionality. I am assuming this issue was fixed for the current version of the package.

My upload code is almost identical to what is shown in the readme for this package, except that I included the external code for automatic authentication (which is also almost identical).

I thought it might be useful to post this error for the benefit of everyone. Any help would be much appreciated!

Merge Next => Master, cut a major release

So @stephenplusplus I dug to the bottom of what's happening with nodejs-storage and nodejs-common@18. The issue is a mismatch of authClient objects being passed between nodejs-common and gcs-resumable-upload. gcs-resumable-upload is still using google-auto-auth, while common is using google-auth-library.

This is a fix I've already made in the next branch. The bad news is that that change also comes with a slight breaking change to the API due to the es6 module upgrade.

What do you think of merging next into master, and cutting a semver major release?

@ofrobots curious about your thoughts here too.

configstore caused uploads to fail

Unfortunately I don't have any useful logs or error messages, but when using gcs-resumable-upload by way of gcloud-node I kept getting a "Not found" error when running as one user, but not as another user on the same machine. Without fully understanding what configstore is there for, I went ahead and deleted the ~/.config/configstore/gcs-resumable-upload.json file, and suddenly uploads started working again! I wish I had looked at the contents of the configstore file before I did it, but my guess is that some invalid config was saved and needed to be rejiggered for anything with gcs-resumable-upload to work.

I think it would be good if there were some sanity check before doing anything with the contents of that file. I have a cron job that runs every minute and overwrites the same file, so there was probably some old invalid data in configstore that kept getting reused because the job always uses the same filename.
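
A sketch of the kind of sanity check being asked for, sitting in front of the configstore reads (the field checked here is an assumption about what a cached entry contains, not the module's actual schema):

var Configstore = require('configstore');
var store = new Configstore('gcs-resumable-upload');

function getValidatedEntry(key) {
  var entry = store.get(key);
  if (!entry || typeof entry.uri !== 'string') {
    // stale or malformed: drop it and fall back to starting a fresh upload
    // (the stack traces elsewhere on this page call .del instead; the method
    // name depends on the configstore version)
    store.delete(key);
    return undefined;
  }
  return entry;
}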

Resumable upload with chunks

Hi,

I'm trying to use this package in my API, which receives chunks of a file and then sends them to GCS to reconstitute the whole file; however, I'm getting stuck.

The first chunk gets uploaded correctly and GCS returns a 308 response, as it's supposed to (the chunk is smaller than the contentLength specified), which makes gcs-resumable-upload throw an error. Is that normal? I don't think it should throw, since a 308 is normal behaviour for an incomplete but resumable upload.

I changed my code to not reject the promise on error (to work around the 308 issue), but then the upload gets stuck on the second chunk. I get progress logs up to a certain point, after which nothing happens. The request is retried until GCS returns a request timeout. I tried on several networks but it's always the same.

Below is the code I use for upload. Any clue would be greatly appreciated :)

const upConfig = {
    bucket: config.bucketName,
    authClient,
    file: filename,
    offset,  // Offset in bytes
    metadata: { contentLength }, // ContentLength of the whole file, not the chunk
    public: true,
  };

return new Promise((resolve, reject) => {
    readableStream
      .pipe(upload(upConfig))
      .on('progress', progress => {
        console.log('Progress', progress);
      })
      .on('response', res => {
        console.log('GCS Response', res);
      })
      .on('finish', () => {
        console.log('Done');
        resolve();
      })
      .on('error', function(err) {
        console.error('Upload chunk error', err);
        resolve(); // don't want to reject on status code 308
      });
  });

Synthesis failed for gcs-resumable-upload

Hello! Autosynth couldn't regenerate gcs-resumable-upload. 💔

Here's the output from running synth.py:

usage: synth.py [-h] [--github-user GITHUB_USER] [--github-email GITHUB_EMAIL]
                [--github-token GITHUB_TOKEN] --repository REPOSITORY
                [--synth-path SYNTH_PATH] [--metadata-path METADATA_PATH]
                [--deprecated-execution] [--branch-suffix BRANCH_SUFFIX]
                [--pr-title PR_TITLE]
                ...
synth.py: error: the following arguments are required: --repository

Google internal developers can see the full log here.

upload is not a function

I'm sorry if my question is stupid.
I've tried to use this module. It worked in PowerShell, but when I created the .js file like your example, it didn't work and returned "upload is not a function":
var upload = require('gcs-resumable-upload');
var fs = require('fs');

fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'upload-bucket208', file: 'titanic.mov' }))
  .on('finish', function () {
    // Uploaded!
  });
.pipe(upload({ bucket: 'upload-bucket208', file: 'titanic.mov' }))
              ^
TypeError: upload is not a function
    at Object.<anonymous> (D:\xampp\htdocs\gcs-resumable-upload\test.js:5:11)
at Module._compile (module.js:652:30)
at Object.Module._extensions..js (module.js:663:10)
at Module.load (module.js:565:32)
at tryModuleLoad (module.js:505:12)
at Function.Module._load (module.js:497:3)
at Function.Module.runMain (module.js:693:10)
at startup (bootstrap_node.js:191:16)
at bootstrap_node.js:612:3
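
If this is one of the newer releases (after the ES module rework mentioned elsewhere on this page), my guess, and it is only a guess, is that upload became a named export rather than the module itself, so the require would need destructuring:

const { upload } = require('gcs-resumable-upload'); // named export instead of module.exports
const fs = require('fs');

fs.createReadStream('titanic.mov')
  .pipe(upload({ bucket: 'upload-bucket208', file: 'titanic.mov' }))
  .on('finish', function () {
    // Uploaded!
  });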

Synthesis failed for gcs-resumable-upload

Hello! Autosynth couldn't regenerate gcs-resumable-upload. 💔

Here's the output from running synth.py:

2020-05-15 10:26:18,874 autosynth [INFO] > logs will be written to: /usr/local/google/home/rennie/gitrepos/synthtool/logs/googleapis/gcs-resumable-upload
2020-05-15 10:26:21,033 autosynth [DEBUG] > Running: git config --global core.excludesfile /usr/local/google/home/rennie/.autosynth-gitignore
2020-05-15 10:26:21,037 autosynth [DEBUG] > Running: git config user.name Jeffrey Rennie
2020-05-15 10:26:21,042 autosynth [DEBUG] > Running: git config user.email [email protected]
2020-05-15 10:26:21,048 autosynth [DEBUG] > Running: git config push.default simple
2020-05-15 10:26:21,053 autosynth [DEBUG] > Running: git branch -f autosynth
2020-05-15 10:26:21,060 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-05-15 10:26:21,473 autosynth [DEBUG] > Running: git rev-parse --show-toplevel
2020-05-15 10:26:21,483 autosynth [DEBUG] > Running: git log -1 --pretty=%H
2020-05-15 10:26:21,491 autosynth [DEBUG] > Running: git remote get-url origin
2020-05-15 10:26:22,003 autosynth [DEBUG] > Running: git log be74d3e532faa47eb59f1a0eaebde0860d1d8ab4..HEAD --pretty=%H --no-decorate
2020-05-15 10:26:22,009 autosynth [DEBUG] > Running: git log -1 --pretty=%at be74d3e532faa47eb59f1a0eaebde0860d1d8ab4
2020-05-15 10:26:22,017 autosynth [DEBUG] > Running: git log -1 --pretty=%at 4674113712c0c7ada19e6a8219d7963ff174b392
2020-05-15 10:26:22,026 autosynth [DEBUG] > Running: git log -1 --pretty=%at 5bbfd095faedfe273819d266f21e402192a29041
2020-05-15 10:26:22,034 autosynth [DEBUG] > Running: git log -1 --pretty=%at 4fa923bd3dafb91df8613accbe2230299cc5b98e
2020-05-15 10:26:22,038 autosynth [DEBUG] > Running: git log -1 --pretty=%at 55cdc844877d97139f25004229842624a6a86a02
2020-05-15 10:26:22,045 autosynth [DEBUG] > Running: git log -1 --pretty=%at 98c50772ec23295c64cf0d2ddf199ea52961fd4c
2020-05-15 10:26:22,050 autosynth [DEBUG] > Running: git log -1 --pretty=%at ba909fca409f6b38eae0fa735614e127d1fc0deb
2020-05-15 10:26:22,057 autosynth [DEBUG] > Running: git log -1 --pretty=%at 7482e79a82e353248769d819788adc1213e8c207
2020-05-15 10:26:22,064 autosynth [DEBUG] > Running: git log -1 --pretty=%at a7759f81c25396207d46532ed389ad4d34879857
2020-05-15 10:26:22,073 autosynth [DEBUG] > Running: git log -1 --pretty=%at 5b48b0716a36ca069db3038da7e205c87a22ed19
2020-05-15 10:26:22,079 autosynth [DEBUG] > Running: git log -1 --pretty=%at c585ac3b5eff5cd2097a5315ffd9cf4823cc1ed2
2020-05-15 10:26:22,087 autosynth [DEBUG] > Running: git log -1 --pretty=%at b0461724be19443075b08c10d4a345cb217002b5
2020-05-15 10:26:22,095 autosynth [DEBUG] > Running: git log -1 --pretty=%at 84c4156c49be9dcabacc8fd7b0585b6fd789ae47
2020-05-15 10:26:22,101 autosynth [DEBUG] > Running: git log -1 --pretty=%at f503622985e230a6792730bbc3b7746c11fce09e
2020-05-15 10:26:22,107 autosynth [DEBUG] > Running: git log -1 --pretty=%at 3d2a7d0e21387ed455c966da9f9897b0a4bc5bb8
2020-05-15 10:26:22,113 autosynth [DEBUG] > Running: git log -1 --pretty=%at 7b7f386b393947a542b87707499f4458136f4f61
2020-05-15 10:26:22,119 autosynth [DEBUG] > Running: git log -1 --pretty=%at f395615039665af6599f69305efcd886685e74f9
2020-05-15 10:26:22,126 autosynth [DEBUG] > Running: git log -1 --pretty=%at b6bdd4783f396f9252ce28af43f7215834a55c3c
2020-05-15 10:26:22,134 autosynth [DEBUG] > Running: git log -1 --pretty=%at 3593e3a995510c0570648d9a48fc756ab2bfc2cb
2020-05-15 10:26:22,142 autosynth [DEBUG] > Running: git log -1 --pretty=%at cb3433f7f554ea751584bdd3631d45ec56a32eb5
2020-05-15 10:26:22,151 autosynth [DEBUG] > Running: git checkout 18f5251f553b22c2c301eb198bd758fc25e49937
Note: switching to '18f5251f553b22c2c301eb198bd758fc25e49937'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at 18f5251 chore: release 3.0.0 (#334)
2020-05-15 10:26:22,160 autosynth [DEBUG] > Running: git checkout cb3433f7f554ea751584bdd3631d45ec56a32eb5
Note: switching to 'cb3433f7f554ea751584bdd3631d45ec56a32eb5'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at cb3433f fix: uses http for links to internal logs (#556)
2020-05-15 10:26:22,173 autosynth [DEBUG] > Running: git branch -f autosynth-20
2020-05-15 10:26:22,180 autosynth [DEBUG] > Running: git checkout autosynth-20
Switched to branch 'autosynth-20'
2020-05-15 10:26:22,188 autosynth [INFO] > Running synthtool
2020-05-15 10:26:22,188 autosynth [INFO] > ['/usr/local/google/home/rennie/env3.6/bin/python', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']
2020-05-15 10:26:22,192 autosynth [DEBUG] > Running: /usr/local/google/home/rennie/env3.6/bin/python -m synthtool --metadata synth.metadata synth.py --
/usr/local/google/home/rennie/env3.6/bin/python: No module named synthtool
2020-05-15 10:26:22,226 autosynth [ERROR] > Synthesis failed
2020-05-15 10:26:22,226 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 18f5251 chore: release 3.0.0 (#334)
2020-05-15 10:26:22,233 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-05-15 10:26:22,240 autosynth [DEBUG] > Running: git clean -fdx
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 612, in <module>
    main()
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 473, in main
    return _inner_main(temp_dir)
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 592, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 368, in synthesize_loop
    synthesize_inner_loop(toolbox, synthesizer)
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 378, in synthesize_inner_loop
    synthesizer, len(toolbox.versions) - 1
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/usr/local/google/home/rennie/gitrepos/synthtool/autosynth/synthesizer.py", line 119, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/usr/local/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/usr/local/google/home/rennie/env3.6/bin/python', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Creating a resumable upload from chunks

Hi @JustinBeckwith ,

I'm starting to go nuts!
After this comment, which you might have read, I was redirected to your Git repo, but I don't get how to use gcs-resumable-upload with bytes when you don't have direct access to the file!

Here's the use case:

  • I'm sending files (images) as base64 bytes from my mobile app to one of my API endpoints (hosted on Firebase Hosting)
  • Because Firebase requests are limited to 10MB, if the file size is less than 10MB, I send the entire file's bytes and do a regular upload using file.save()
const options = {
    contentType: "image/jpeg",
    offset: 0,
    resumable: false,
};
      
const buffer = Buffer.from(req.body.media_data, 'base64'); // Buffer.from instead of the deprecated new Buffer()
      
return file.save(buffer, options, async (error, success) => {
    if (!error) {
        return res.status(200).json({
            status: 200,
            data: success
         }).send();
    } else {
         return res.status(500).json({
            status: -1,
            data: `Error while saving data: ${error}`
         }).send();
    }
});

Until this point, everything is okay and I'm able to upload "small" files to my GCS bucket.

  • If the file size is larger than 10MB, my mobile app "cuts" the file to send chunks of maximum 10MB.

  • When I retrieve a chunk on the api endpoint, I tried two methods:

      • using file.save(): since the documentation says "Resumable uploads are automatically enabled and must be shut off explicitly by setting options.resumable to false", I use the same method as above with options.resumable = true until I receive the last chunk from the mobile app.
      • using file.createResumableStream() + file.createWriteStream(): createWriteStream will then receive an additional options.uri from createResumableStream.
  • With both methods I get errors messages:

      • first method returns Resumable uploads require write access to the $HOME directory. Through config-store, some metadata is stored. By default, if the directory is not writable, we will fall back to a simple upload. However, if you explicitly request a resumable upload, and we cannot write to the config directory, we will return a ResumableUploadError.
      • second method returns Error: Upload failed

What is wrong? I wanted to try your gcs-resumable-upload, but how do I use it with chunks? I don't have direct access to the file; the bytes are coming from my mobile app.
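
For whatever it's worth, a very rough sketch of the direction I'd expect with this package: create the resumable session once with createURI, then feed each decoded chunk into upload() with the stored session uri and the chunk's byte offset. The uri option and the behaviour on intermediate chunks are assumptions here (the "Resumable upload with chunks" issue above questions exactly that), so treat this as pseudocode-with-syntax rather than a working recipe:

var upload = require('gcs-resumable-upload');
var { PassThrough } = require('stream');

// 1. Once, when the first chunk announces the upload (bucket/file names are placeholders):
upload.createURI({ bucket: 'my-bucket', file: 'photo.jpg' }, function (err, uri) {
  if (err) throw err;
  // persist uri somewhere, keyed by the mobile app's upload id
});

// 2. For every chunk request from the mobile app:
function writeChunk(uri, offset, base64Chunk, totalSize, done) {
  var buffer = Buffer.from(base64Chunk, 'base64');
  var source = new PassThrough();
  source.end(buffer);

  source
    .pipe(upload({
      bucket: 'my-bucket',
      file: 'photo.jpg',
      uri: uri,                              // assumed: reuse the existing session
      offset: offset,                        // byte position of this chunk in the file
      metadata: { contentLength: totalSize } // total size of the whole file
    }))
    .on('error', done)
    .on('finish', function () { done(); });
}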

v0.14.0 is not on npm (faulty publish?)

Hi! According to the git history and changelog in the repo, the latest version, 0.14.0, was released earlier today.

There is a link in the changelog to the version history page on npm, but that page does not have an entry for 0.14.0. When I npm install the package, it installs version 0.13.0. I've tried manually specifying gcs-resumable-upload@0.14.0, but npm is unable to find that version. The npm API also does not have an entry for 0.14.0.

Is it possible something went wrong during the publish?

Synthesis failed for gcs-resumable-upload

Hello! Autosynth couldn't regenerate gcs-resumable-upload. 💔

Here's the output from running synth.py:

me=publish.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/release/publish.cfg', wd=29, mask=IN_ATTRIB, cookie=0, name=publish.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/release/docs.cfg', wd=29, mask=IN_MODIFY, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/release/docs.cfg', wd=29, mask=IN_ATTRIB, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/release/docs.cfg', wd=29, mask=IN_ATTRIB, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/samples-test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=samples-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/samples-test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=samples-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/samples-test.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=samples-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/lint.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=lint.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/lint.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=lint.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/lint.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=lint.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/test.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/common.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/common.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/common.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/common.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/system-test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=system-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/system-test.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=system-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/system-test.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=system-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/system-test.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=system-test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/docs.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/docs.cfg', wd=34, mask=IN_MODIFY, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/docs.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node10/docs.cfg', wd=34, mask=IN_ATTRIB, cookie=0, name=docs.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/test.cfg', wd=36, mask=IN_MODIFY, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/test.cfg', wd=36, mask=IN_ATTRIB, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/test.cfg', wd=36, mask=IN_ATTRIB, cookie=0, name=test.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/common.cfg', wd=36, mask=IN_MODIFY, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/common.cfg', wd=36, mask=IN_MODIFY, cookie=0, name=common.cfg>
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/common.cfg', wd=36, mask=IN_ATTRIB, cookie=0, name=common.cfg>
2020-08-06 04:09:17,040 synthtool [DEBUG] > Installing dependencies...
DEBUG:synthtool:Installing dependencies...
DEBUG:watchdog.observers.inotify_buffer:in-event <InotifyEvent: src_path=b'./.kokoro/continuous/node12/common.cfg', wd=36, mask=IN_ATTRIB, cookie=0, name=common.cfg>
npm ERR! code E404
npm ERR! 404 Not Found - GET https://registry.npmjs.org/@compodoc%2fcompodoc - Not found
npm ERR! 404 
npm ERR! 404  '@compodoc/compodoc@^1.1.7' is not in the npm registry.
npm ERR! 404 You should bug the author to publish it (or use the name yourself!)
npm ERR! 404 It was specified as a dependency of 'gcs-resumable-upload'
npm ERR! 404 
npm ERR! 404 Note that you can also install from a
npm ERR! 404 tarball, folder, http url, or git url.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/kbuilder/.npm/_logs/2020-08-06T11_09_21_603Z-debug.log
2020-08-06 04:09:21,617 synthtool [ERROR] > Failed executing npm install:

None
ERROR:synthtool:Failed executing npm install:

None
2020-08-06 04:09:21,637 synthtool [DEBUG] > Wrote metadata to synth.metadata.
DEBUG:synthtool:Wrote metadata to synth.metadata.
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/gcs-resumable-upload/synth.py", line 13, in <module>
    node.install()
  File "/tmpfs/src/github/synthtool/synthtool/languages/node.py", line 167, in install
    shell.run(["npm", "install"], hide_output=hide_output)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['npm', 'install']' returned non-zero exit status 1.
2020-08-06 04:09:21,685 autosynth [ERROR] > Synthesis failed
2020-08-06 04:09:21,685 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at a91a1dc chore(node): fix kokoro build path for cloud-rad (#374)
2020-08-06 04:09:21,692 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-08-06 04:09:21,698 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 670, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 273, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Error configStore.del is not defined

I got this error while trying to upload a file to GCS

/pipeline/source/node_modules/gcs-resumable-upload/index.js:326
  this.configStore.del(this.file)
                   ^

TypeError: this.configStore.del is not a function
    at Upload.deleteConfig (/pipeline/source/node_modules/gcs-resumable-upload/index.js:326:20)
    at Request.<anonymous> (/pipeline/source/node_modules/gcs-resumable-upload/index.js:172:12)
    at emitTwo (events.js:111:20)
    at Request.emit (events.js:191:7)
    at Request.<anonymous> (/pipeline/source/node_modules/request/request.js:1171:10)
    at emitOne (events.js:101:20)
    at Request.emit (events.js:188:7)
    at IncomingMessage.<anonymous> (/pipeline/source/node_modules/request/request.js:1091:12)
    at IncomingMessage.g (events.js:291:16)
    at emitNone (events.js:91:20)
    at IncomingMessage.emit (events.js:185:7)
    at endReadableNT (_stream_readable.js:974:12)
    at _combinedTickCallback (internal/process/next_tick.js:74:11)
    at process._tickCallback (internal/process/next_tick.js:98:9)
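
A hedged guess at the cause: the stack points at configstore, and newer configstore releases expose delete() without a del() method, so a mismatched configstore version in the dependency tree would make configStore.del undefined. Outside the library, a defensive call would look like:

var Configstore = require('configstore');
var store = new Configstore('gcs-resumable-upload');

function removeEntry(key) {
  // The method name differs across configstore versions; call whichever exists.
  if (typeof store.del === 'function') return store.del(key);
  return store.delete(key);
}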

Option to not rely on the local file system?

Is your feature request related to a problem? Please describe.

I'm confused as to the reasoning behind the use of configstore here. To me, it does not seem like the correct tool for the job. You're not persisting configuration. You're storing temporary data about an upload in progress.

I guess this might not be a big deal by itself, but the thing is that configstore relies on the local file system and does not seem to support using a different storage mechanism. This is a problem, because production Node apps typically have more than one instance behind a load balancer, each running in its own container with its own local file system. This will of course cause an upload to start over completely if a different instance serves the resumed upload than the one that started it. It will also cause the data in the original configstore entry for the upload to become orphaned, never being cleared out, since the original instance is never informed that the upload has completed.

Describe the solution you'd like
It would be nice to be able to store this information in something more friendly to typical production node architectures. Something like Redis, or even just temporary files in GCS itself. With the number of people downloading this, I have a hard time believing that there isn't some kind of solution for this problem, so perhaps I'm just missing something.

Describe alternatives you've considered

Again, unless I'm missing something, it doesn't seem like any alternatives are supported. There isn't really another library like this one, short of rolling my own against the base cloud storage node api.
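
Purely to illustrate what's being asked for (this is not an interface the module exposes today): the upload bookkeeping reduces to get/set/delete on a key, which could live in Redis just as easily as in a local JSON file.

var Redis = require('ioredis'); // any Redis client would do; ioredis is just an example

class RedisUploadStore {
  constructor(redisUrl) {
    this.redis = new Redis(redisUrl);
  }
  async get(key) {
    const raw = await this.redis.get('gcs-upload:' + key);
    return raw ? JSON.parse(raw) : undefined;
  }
  async set(key, value) {
    await this.redis.set('gcs-upload:' + key, JSON.stringify(value));
  }
  async delete(key) {
    await this.redis.del('gcs-upload:' + key);
  }
}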

Error when calling createURI

When I call createURI, I get the following error in my callback. It's pretty cryptic and I have no idea what's causing it.

    upload.createURI({
        bucket: 'mybucket',
        file: 'myfile.jpg',
        authConfig: {
            projectId: 'XXXXXX',
            keyFilename: 'XXXXXX'
        }
    }, function(err, uri) {
        console.log(err, uri);
    });
{ errors: [ { domain: 'global', reason: 'required', message: 'Required' } ],
  code: 400,
  message: 'Required' }

Should uploading not use async internally instead of sync?

I just ran my Node.js Express server with the --trace-sync-io flag, which prints synchronous calls, and I see tons of these when uploading a file through Firebase Storage:

(node:10320) WARNING: Detected use of sync API
    at openSync (fs.js:437:26)
    at readFileSync (fs.js:342:35)
    at get all (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/node_modules/configstore/index.js:34:25)
    at set (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/node_modules/configstore/index.js:81:23)
    at set (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/build/src/index.js:352:26)
    at onChunk (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/build/src/index.js:237:22)
    at Transform._read (_stream_transform.js:189:10)
    at Transform._write (_stream_transform.js:177:12)
    at doWrite (_stream_writable.js:431:12)
(node:10320) WARNING: Detected use of sync API
    at tryStatSync (fs.js:304:25)
    at readFileSync (fs.js:344:17)
    at get all (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/node_modules/configstore/index.js:34:25)
    at set (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/node_modules/configstore/index.js:81:23)
    at set (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/build/src/index.js:352:26)
    at onChunk (/home/alpha/workspace/rekordcloud-api/node_modules/gcs-resumable-upload/build/src/index.js:237:22)
    at Transform._read (_stream_transform.js:189:10)
    at Transform._write (_stream_transform.js:177:12)
    at doWrite (_stream_writable.js:431:12)

Should these calls not be async instead? Is there a reason that they are all sync?

Get the build passing

Right now it looks like the tests are broken. I'm not really familiar enough with the code base to see what's going on, but it would be rad if we could get them passing 👍

Broken builds

All older builds of gcloud-node are broken because of the latest push here. You can check by installing the latest build: gcs-resumable-upload/cli.js is missing.

Write d.ts for pumpify

We rely on pumpify round these parts. It would be awesome if we could submit a PR to DefinitelyTyped to add a d.ts for our benefit, and the masses :)
