
Comments (11)

renchap commented on June 19, 2024

You can see an example here : https://github.com/renchap/shrine-google_cloud_storage/blob/master/test/gcs_test.rb#L139

GCS requires you to specify an issuer and a signing key when generating a presigned URL, and those are different from your Google Cloud API key. So supporting the expires argument in url() means also adding the issuer and signing_key args, and this starts to resemble the presign method quite a lot.
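For illustration, here is a minimal sketch of what presigning involves, assuming the google-cloud-storage gem's Bucket#signed_url (the bucket, object path, and service account below are placeholders):

require "google/cloud/storage"
require "openssl"

# The issuer email and signing key belong to a service account; they are
# separate from a regular Google Cloud API key.
storage = Google::Cloud::Storage.new
bucket  = storage.bucket("my-bucket")

signed_url = bucket.signed_url(
  "path/to/object",
  method:      "GET",
  expires:     300, # seconds
  issuer:      "signer@my-project.iam.gserviceaccount.com",
  signing_key: OpenSSL::PKey::RSA.new(File.read("signer-key.pem"))
)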

@janko-m any advice here on which API would be best to keep it coherent with other Shrine storages?


rosskevin commented on June 19, 2024

So the current usage for an existing file is:

uploaded_file = agreement.document
presigned_url = uploaded_file.storage.presign(uploaded_file.id, options)

It seems like we might want #url to delegate to #presign based on the presence of :issuer, so that instead it would look like S3's implementation and become:

agreement.document.url(options)

If you are good with that I can PR, unless you want to do it.
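For concreteness, here is a rough sketch of the delegation I have in mind (not the gem's actual code; the helper name and the plain-URL format are assumptions):

def url(id, **options)
  if options[:issuer] # presign-specific options present (or perhaps :expires)
    signed_url_for(id, **options) # hypothetical helper built on the existing #presign logic
  else
    "https://storage.googleapis.com/#{@bucket}/#{id}" # assumed plain-URL format
  end
end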


rosskevin commented on June 19, 2024

In my case I run within GKE (Google Container Engine), so based on the API docs it looks like I don't need to provide any credentials, thanks to default credential discovery (I think).

Generating a URL requires service account credentials, either by connecting with a service account when calling Google::Cloud.storage, or by passing in the service account issuer and signing_key values.

@renchap Am I missing something here?

That also raises the question of whether #url should delegate to #presign based on the presence of :expires, or on something else like the :public option from the S3 API.
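A minimal sketch of what that default discovery looks like with google-cloud-storage (project and bucket names are placeholders):

require "google/cloud/storage"

# On GKE/GCE the client can pick up the node's service account via
# Application Default Credentials, so no keyfile needs to be passed.
storage = Google::Cloud::Storage.new(project_id: "my-project")
bucket  = storage.bucket("my-bucket")

# Per the quoted docs, signed URLs still need service-account credentials --
# either the discovered ones or an explicit issuer/signing_key pair.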


rosskevin commented on June 19, 2024

I notice now that you are not using gem 'google-cloud-storage' but gem 'google-api-client', which might change things regarding 1.6.0 and credential discovery. Any reason you went that direction? It looks to me like google-api-client is no longer the preferred gem and is in "maintenance mode".

I'm going to work on a conversion to google-cloud-storage and I'll submit a PR.


janko commented on June 19, 2024

@renchap Shrine doesn't have any convention for expiring URLs; the :expires_in option in Storage::S3#url is forwarded directly to the aws-sdk-s3 gem. The separate #presign method exists because expiring URLs generally don't have to be the same as URLs for direct uploads.

In the GCS case that seems to be true, so it might be a good idea for GoogleCloudStorage#url to support an :expires option and in that case call #presign. Alternatively, things could be simpler with the google-cloud-storage gem, if that's feasible now. FYI, I noticed a PR was merged some time ago which adds support for uploading generic IO objects (in addition to files).
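For reference, roughly what that forwarding looks like on the aws-sdk-s3 side (bucket and key are just examples; this is a simplified sketch rather than Shrine's exact code):

require "aws-sdk-s3"

s3     = Aws::S3::Resource.new(region: "us-east-1")
object = s3.bucket("my-bucket").object("some/key")

# :expires_in is handed straight to aws-sdk-s3's presigned URL generation
object.presigned_url(:get, expires_in: 900) # expiring GET URL (15 minutes)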


rosskevin commented on June 19, 2024

@janko-m the only thing I see missing at this point in 1.6 is batch delete. How important is it to retain batching here?

It appears that batching will not be supported: googleapis/google-cloud-ruby#1008

... batch is considered a legacy feature. It is not supported in new gRPC-native APIs.


janko commented on June 19, 2024

@rosskevin Implementing the #multi_delete method is relatively unimportant. It's nice to have for performance when using versions without backgrounding, but users can just use backgrounding and/or the parallelize plugin.

Though it would be nice to still be able to implement #clear!, which is meant to be used when GCS is also used for temporary storage.
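Without batching, #clear! could plausibly just list and delete objects one at a time; a sketch under that assumption (bucket and prefix are placeholders, not the gem's actual implementation):

require "google/cloud/storage"

storage = Google::Cloud::Storage.new
bucket  = storage.bucket("shrine-cache")

# Iterate over all matching objects (following pagination) and delete each one.
bucket.files(prefix: "cache/").all do |file|
  file.delete
end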


renchap commented on June 19, 2024

@rosskevin If you want to have a look at switching to google-cloud-storage, feel free to do so! When I started this project that gem was new and did not support handling IO objects; from what I remember this was the main reason for not depending on it.

I am fine with allowing presigned URLs in url(). I duplicated most of the presign method from google-cloud-storage, so it should not be hard to tie them back together.

The service account used to presign the URLs needs to be specifiable as a parameter to the presign/url call, for when you want to use a different account for presigning. It may also be a good idea to allow specifying it when creating the storage. When you run the code on your computer you usually want to use your gcloud credentials, but since you need a service account to presign URLs, it would be nice to be able to configure one when setting up the Shrine storage (something like a presign_sa option on the uploader). Then in production you would simply not set it, and the discovered GCP credentials would be used, as they will already be a service account.
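Something along these lines, perhaps (the issuer/signing_key storage options shown here are hypothetical, not the gem's current API):

require "shrine"
require "shrine/storage/google_cloud_storage"
require "openssl"

Shrine.storages[:store] = Shrine::Storage::GoogleCloudStorage.new(
  bucket:      "my-bucket",
  # hypothetical options: a dedicated signing service account for presigned
  # URLs, useful locally where the default credentials are user credentials
  issuer:      "signer@my-project.iam.gserviceaccount.com",
  signing_key: OpenSSL::PKey::RSA.new(File.read("signer-key.pem"))
)
# In production these would be omitted and the discovered service-account
# credentials used instead.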


rosskevin commented on June 19, 2024

PR submitted, I think it is complete/mergeable.


rosskevin commented on June 19, 2024

Ah @renchap, I just saw your note.

You can definitely still sign with your own credentials; I verified that. You can also override and specify your own project when instantiating the storage. As for the myriad of other credential options, I didn't try to do anything else, as I don't know the use cases. I don't think it will be hard to expand my implementation once someone raises specifics, though.


renchap commented on June 19, 2024

Fixed by #16

