
gopro's Introduction

gopro cloud management tools

I like my GoPros and have been using GoPro cloud since around the end of 2019. I wrote a blog post about it in more detail, but in summary, GoPro Plus isn't sufficiently useful to me without some additional tools. This package provides those tools.

Getting Started

Prerequisites:

In order to process streams out of video files, you'll need ffmpeg (most likely available in your system package manager) somewhere on your PATH.
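For example (these are the common package managers; yours may differ):

sudo apt install ffmpeg   # Debian/Ubuntu
brew install ffmpeg       # macOS with Homebrew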

You'll also need stack to build and install the software.

Installation

Install the tool (which requires stack):

git clone https://github.com/dustin/gopro
cd gopro
stack install

First Run

Now you've got the gopro tool available. Everything the tool does requires a database, which is gopro.db in the directory where you run it (i.e., you can have multiple databases by running it from different locations).

Pick a spot and authenticate:

mkdir ~/gopro
cd ~/gopro
gopro auth

This will prompt you for your GoPro Plus username and password. Once that's done, you're ready to go!

Claim Your Data

At this point, you can either use the commandline tools, e.g. gopro sync to pull down all the GoPro metadata into your local database, or you can run gopro serve and do it all via the web interface (assuming you tell it where its static content is).
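For example, a minimal first pass might look like this (the static path is illustrative and only matters for serve):

gopro sync
gopro --static ~/src/gopro/static serve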

Commandline Reference

All tools take the following arguments:

  --db ARG                 db path (default: "gopro.db")
  --static ARG             static asset path (default: "static")
  -v,--verbose             enable debug logging
  -u,--upload-concurrency ARG
                           Upload concurrency (default: 3)
  -d,--download-concurrency ARG
                           Download concurrency (default: 11)

Most of them should be obvious, but --static is the location of the web media assets, which you may need to provide if you aren't running the server from a directory containing them (the default is ./static).
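For instance, combining the global flags with a command (paths illustrative):

gopro --db /mnt/media/gopro.db -v sync
gopro --static /opt/gopro/static serve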

Several parameters may be configured with configuration files. The following paths are searched:

  • ~/.config/gopro/config.toml
  • .gopro.toml (in the current directory)

See the example config for how this works.

auth

The auth command, as mentioned in the "Getting Started" section, authenticates you with the service. The resulting authentication token lasts around a day or less.

You should generally only ever need to do this once per gopro.db. If you have multiple installations, each one will need to be authenticated.

reauth

The reauth command will use a stored token to refresh your credentials. You may need to do this periodically, but the software will automatically refresh known stale credentials. You do not need to supply your password again (i.e., you can just run this on a timer or something if you plan to mostly interact with the web service).
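For example, a crontab entry along these lines keeps the token fresh (the schedule and paths are just a sketch; stack typically installs binaries to ~/.local/bin):

# refresh GoPro credentials twice a day, from the directory holding gopro.db
0 */12 * * * cd $HOME/gopro && $HOME/.local/bin/gopro reauth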

This feature is available via the web interface by clicking on "🔒".

sync

The sync command finds all of your recently uploaded media that the local database does not have yet, and grabs that data.

It also extracts all the metadata and gives you some nice rich local data features.

This feature is available via the web interface by clicking on "🔃".

refresh

The refresh command updates the local metadata for specific MediumIDs. For example, if you've made changes to highlights or similar on the GoPro site, you'd use this to retrieve the latest changes.

This feature is available via the web interface when looking at specific item details.

upload

The upload command uploads media from your computer. This is similar functionality to what your camera might do, but doing it from your computer gives you a bit more control.

upload in particular does two distinct things:

  1. It creates GoPro-side media (tells them there's something coming and gets IDs) and associates those media with local files.
  2. It gets all the bits of those files uploaded and marks things done.

These two things are technically decoupled, but from a UX point of view, you can start an upload, interrupt it, and then start it again later and it will continue from where you left off.

Technically, if all you want to do is resume an upload, you can just type gopro upload and it will finish any uploads that it knows about but that are not done.
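A sketch of that flow (filenames illustrative):

gopro upload GOPR0123.MP4 GOPR0124.MP4   # interrupted partway through...
gopro upload                             # ...later, finishes whatever is pending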

See also: createupload, createmulti

download

The download command copies media from GoPro cloud to a local directory.

serve

The serve command runs a web server and lets you browse and search your media quickly and easily.

Less Common Commands

createupload

The createupload command only executes the first half of the upload sequence. This is useful when you want to define a bunch of uploads for local files and then execute them later. It's also useful when you have multi-part uploads you want to mix in with the upload batch and want to just prepare everything first.
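A sketch of that workflow, assuming createupload takes file paths the same way upload does (filenames illustrative; createmulti is described below):

gopro createupload GOPR0200.MP4 GOPR0201.MP4        # register uploads; no bytes transferred yet
gopro createmulti Video GH010042.MP4 GH020042.MP4   # mix in a multi-part upload
gopro upload                                        # later: push everything that's pending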

createmulti

The createmulti command creates uploads that span multiple files. There are two main (and a few minor) use cases for this:

  1. Video recordings that exceed the camera's maximum file size and were split into multiple files.
  2. Timelapse photos (as opposed to video) where a single button press spat out a large number of image files.

While this is less commonly used (at least for me), it's been extremely important functionality that isn't available anywhere other than the camera itself; it's helped me with the ~thousand-file time lapses I took with my Hero 4 back in the day, for example.

Usage is a bit tedious, but it's not too bad. First, you have to consider the type of media you're intending to upload. It must be one of the following (exact, case sensitive):

  • Photo
  • Video
  • TimeLapse
  • TimeLapseVideo
  • Burst

Then, you just run the command with the media you want to upload, in order (usually alphabetical, so * tends to work), e.g.:

gopro createmulti TimeLapse *.JPG

This will define an upload that includes all matching JPGs that will be considered a TimeLapse.

After you define your multipart uploads, you can use the regular gopro upload command to complete them.
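Chaptered videos work the same way; these filenames follow the camera's chaptered naming scheme but are illustrative:

gopro createmulti Video GH010042.MP4 GH020042.MP4 GH030042.MP4
gopro upload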

fetchall

The fetchall command is essentially sync, but without stopping when it sees something it's seen before. It will take longer than sync but should generally do the same thing unless you've rampage deleted some data from gopro.db.

cleanup

The cleanup command cleans up unprocessed data on the GoPro side as well as the local state regarding what is being uploaded.

This only includes files that aren't expected to be fully uploaded, i.e., items in registered, uploading, or failed states. It does not include items in transcoding or similar states where GoPro has the item in full and is processing it.

If you've ever tried to upload things from the web UI and had it tell you the media's already been uploaded (when it hasn't), or if you started an upload you don't intend to finish and want to get rid of the in-progress stuff, the cleanup command will delete all of these in-progress things.

config

The config command lets you view and update configuration parameters. It's got three modes of execution.

To list all configuration parameters and their values:

gopro config

To display the value of the configuration parameter bucket:

gopro config bucket

To set the value of the configuration parameter to some.bucket.name:

gopro config bucket some.bucket.name

wait

The wait command just waits for in-progress uploads and (GoPro-side) transcoding to finish. sync does this automatically, but if you want to wait for some other reason without syncing, this will do it.

reprocess

The reprocess command tells the GoPro cloud service to reprocess any uploads that fell into a failed state (as seen by the wait command).

fixup

The fixup command allows you to write a SQL query against your local metadata database that will update fields in GoPro's service for any matching rows.

This is a fairly advanced and potentially terribly destructive thing to do. Everything possible to do here is well beyond the scope of a README, but at a high level, you write a query that returns rows whose column names must include media_id for the item you wish to update, and then additional columns whose names match metadata field names in GoPro metadata. By the time you get to the point where you're doing this, you probably know all this.

As an example, when uploading from some source (maybe my phone?) or in some fashion I don't remember, GoPro seemed to not know which camera various media came from. But the local metadata knew, because it was read directly from the GPMF data. I wanted GoPro's metadata to be consistent, so I wrote the following query:

select m.media_id, g.camera_model
    from media m join meta g on (m.media_id = g.media_id)
    where m.camera_model is null
    and g.camera_model is not null

This query spat out every row where GoPro didn't know the camera model, but I was able to derive it from metadata. I fed that to the fixup command and all my metadata on their end was happily updated. (I'd still need to use refresh to update the local copies).
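Assuming the query is passed to the command as a single argument (check gopro fixup --help for the exact invocation), the run would look something like:

gopro fixup "select m.media_id, g.camera_model
    from media m join meta g on (m.media_id = g.media_id)
    where m.camera_model is null and g.camera_model is not null"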

backup

The backup command will orchestrate a move of all original media stored in GoPro to your own S3 bucket. Lots of stuff is involved in setup here, including that S3 bucket, an AWS Lambda copy function, and an SQS queue.

This does work, and should be able to move a huge amount of data in a short amount of time. The tl;dr on how to use it:

  1. Create an S3 bucket in us-west-2 (Oregon) to store all your stuff. Note that this is used for both metadata cache and backups.
  2. Create a lambda function (I called it download-to-s3, but it's configurable) in us-west-2 as a node.js runtime with index.js containing the contents from the file lambda/download-to-s3.js
  3. Set up an SQS queue, also in us-west-2 to capture the results.
  4. Configure destinations for both success and failure to this new SQS queue.
  5. Configure (using gopro config) s3copySQSQueue to point to your new SQS queue, bucket to point to your S3 bucket, and s3copyfunc to the correct function name (i.e., change it if it's not download-to-s3); see the sketch after this list.
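A sketch of that last step (the bucket name and queue URL are illustrative):

gopro config bucket my-gopro-backups
gopro config s3copySQSQueue https://sqs.us-west-2.amazonaws.com/123456789012/gopro-copy-results
gopro config s3copyfunc download-to-s3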

With this AWS-side infrastructure in place, gopro backup should copy all the things.

This primarily exists as a proof of concept that I hope I won't ever need, as I don't mind using GoPro's storage (it's S3) and paying for the whole thing to be reproduced in my own storage kind of makes the service less useful.

If I ever do need it, I suspect I should be able to move my >TB of storage from GoPro's buckets to my own with a tiny amount of bandwidth to my house.

backupall

The backupall command is similar to the backup command, but grabs all known media stored on your behalf, including derivatives.

backuplocal

The backuplocal command is similar to the backup command, except it copies data locally instead of to an S3 bucket (and runs entirely locally).

Given an argument for the destination path, it will attempt to download all original artifacts for a given medium and, once complete, will move the destination directory into place. Once a directory for a given medium is in place, it will not attempt to download the same medium again (conversely, if you delete the directory for a medium, it will be redownloaded).
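A sketch (destination path illustrative):

gopro backuplocal /mnt/nas/gopro-originals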

backuplocalall

The backuplocalall command is similar to the backuplocal command, except it fetches all known media instead of just originals.

processSQS

The processSQS command is automatically run by the backup command to catch results of asynchronous lambda function calls, but if the process is interrupted or you just want to make sure you've picked up everything, this command can be invoked separately without potentially issuing more copy requests.

clearmeta

The clearmeta command removes any local metadata storage for any medium whose metadata is backed up on S3. This can save a lot of space if you have a lot of video assets.


gopro's Issues

Is it possible to serve content from a local backup in case the GoPro cloud subscription runs out?

First of all, thanks a lot for this epic tool.
It might have been answered already, but I was not able to find it: I want to use the tool locally only, going forward.
My GoPro Plus subscription has run out, so I am just downloading all my data and planning to back it up to a local medium going forward.
What would be the best way to tackle this? I did not know the best place to ask, so I thought of creating an issue. Please feel free to close it if it does not fit here.
Thanks again!!

Error 422 during upload

Hello,

I have a problem. During my upload process, I get the following error:

StatusCodeException (Response {responseStatus = Status {statusCode = 422, statusMessage = "Unprocessable Entity"}

and directly under that, there is another one:

"{"_errors":[{"code":6105,"description":"cache value is nil for this request: {DerivativeID:2381467954777687236 UploadID:n3sa.1SZig0dRoUoyUFNhsNgD1UTCqrywDS3IvDAWzH4ZwWunImUIcvdY7leQa0XYWvJjyiUIjqmv4yLzqoOoXSD.gui7WsTPmr9xc1AKTTe7mXDe7HjUSdTbwydRVqy ItemNumber:1 CameraPosition:default TranscodeSource: FileSize:2732485278 PartSize:6291456 Page:1 PerPage:435 _:0}"}

Am I missing something?

Sync Issue

Received the following error, which prevented sync from continuing.

gopro: SQLite3 returned ErrorError while attempting to perform prepare "insert into files (media_id, section, label, type, item_number, file_size) values (?, ?, ?, ?, ?)": 5 values for 6 columns.

I was able to get it running by simply adding an additional arg (?)

Request: docker container

Thank you for the great contribution.

However, I really need this to be in a docker container to make use of it in my environment.

Could I please ask you put together an image? It would increase adoption, no doubt.

gopro sync fails

The gopro sync command fails with the error below, while gopro upload works fine.

I: 6028 new items
I: Storing batch of 100
gopro: HttpExceptionRequest Request {
  host                 = "images-02.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("User-Agent","github.com/dustin/gopro 0.1")]
  path                 = "/resize/450wwp/qweyJhbGciOiJIUzI1NiJ9.ehgggyJtZWRpdW1fasssWQiOiIxOTU1Mzc1MTc2ODIwNDU4NTM2Iiwib3duZXIiOiI1NDVlYzZiYS03YWJlLTQ0YTctOTZkNy1iMTFlYTQxN2Q0NTUiLCJpc19wdWJsaWMiOmZhbHNlLCJvIjoxLCJ0cmFucyI6bnVsbCwicmVnaW9uIjoidXMtd2VzdC0yIiwidGh1bWJuYWlsX3VwZGF0ZWRfZGF0ZSI6bnVsbH0.3k-ro9BJl5OK-Z1duI4xLT6qCMd6KbUWuwuBAMEDRm4"
  queryString          = ""
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
}
 (StatusCodeException (Response {responseStatus = Status {statusCode = 404, statusMessage = "Not Found"}, responseVersion = HTTP/1.1, responseHeaders = [("Content-Type","image/jpeg"),("Content-Length","272"),("Connection","keep-alive"),("Date","Wed, 22 Sep 2021 08:02:21 GMT"),("Server","nginx"),("Vary","Accept-Encoding, Origin"),("Access-Control-Allow-Credentials","true"),("Strict-Transport-Security","max-age=31536000; includeSubDomains"),("X-Cache","Error from cloudfront"),("Via","1.1 2f194b62c8c43859cbf5af8e53a8d2a7.cloudfront.net (CloudFront)"),("X-Amz-Cf-Pop","FRA2-C2"),("X-Amz-Cf-Id","Um2L9ajXeqJdT88GlVNQnGzyQQ7TfOdSpwqJT61ZHQKX3uKB-xuO0g==")], responseBody = (), responseCookieJar = CJ {expose = []}, responseClose' = ResponseClose}) "\255\216\255\224\NUL\DLEJFIF\NUL\SOH\SOH\NUL\NULH\NULH\NUL\NUL\255\219\NUL\132\NUL\SOH\SOH\SOH\SOH\SOH\SOH\STX\SOH\SOH\STX\ETX\STX\STX\STX\ETX\EOT\ETX\ETX\ETX\ETX\EOT\ACK\EOT\EOT\EOT\EOT\EOT\ACK\a\ACK\ACK\ACK\ACK\ACK\ACK\a\a\a\a\a\a\a\a\b\b\b\b\b\b\t\t\t\t\t\v\v\v\v\v\v\v\v\v\v\SOH\STX\STX\STX\ETX\ETX\ETX\ENQ\ETX\ETX\ENQ\v\b\ACK\b\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\255\192\NUL\DC1\b\NUL\n\NUL\n\ETX\SOH\"\NUL\STX\DC1\SOH\ETX\DC1\SOH\255\196\NULL\NUL\SOH\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\a\DLE\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\SOH\SOH\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\b\t\DC1\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\255\218\NUL\f\ETX\SOH\NUL\STX\DC1\ETX\DC1\NUL?\NUL\155\128v%[\255\217")
root@005ceca2fddf:/mnt/Scans# gopro sync
I: 6028 new items
I: Storing batch of 100

JSONError Unexpected MediumType Audio

While running gopro sync I received the following error.
JSONError "Error in $.type: Unexpected MediumType: \"Audio\""

I found that issue #14 was another Unexpected MediumType that was easily resolved.

I see that the error is thrown in a dependency package here: https://github.com/dustin/gopro-plus/blob/master/src/GoPro/Plus/Media.hs#L115, although I don't believe a distinct Audio MediumType should fall into the photo "bucket".

I recently made a GoPro edit on mobile using a local sound file, so I'm supposing this is what's being seen all of a sudden.

Stack install Error While building package postgresql-libpq

On commit 330b897 I get a stack install error:

Fedora 38 (6.5.8-200.fc38.x86_64)

postgresql-libpq             > configure
postgresql-libpq             > [1 of 3] Compiling Main             ( /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/Setup.hs, /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/Main.o )
postgresql-libpq             > [2 of 3] Compiling StackSetupShim   ( /home/relentless/.stack/setup-exe-src/setup-shim-Z6RU0evB.hs, /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/StackSetupShim.o )
postgresql-libpq             > [3 of 3] Linking /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/setup
postgresql-libpq             > Configuring postgresql-libpq-0.9.5.0...
postgresql-libpq             > Error: setup: The program 'pg_config' is required but it could not be found.
postgresql-libpq             >
crypton                          > copy/register
crypton                          > Installing library in /home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib/x86_64-linux-ghc-9.4.6/crypton-0.33-E0XHNcGjfq55myX2eFTwG6
crypton                          > Registering library for crypton-0.33..
cryptonite                       > copy/register
cryptonite                       > Installing library in /home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib/x86_64-linux-ghc-9.4.6/cryptonite-0.30-K750s6VpRDbLPZF227pMDg
cryptonite                       > Registering library for cryptonite-0.30..
Progress 93/252            

--  While building package postgresql-libpq-0.9.5.0 (scroll up to its section to see the error) using:
      /tmp/stack-9c92f5e8c1152d4e/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/setup --verbose=1 --builddir=.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0 configure --with-ghc=/home/relentless/.stack/programs/x86_64-linux/ghc-tinfo6-9.4.6/bin/ghc-9.4.6 --with-ghc-pkg=/home/relentless/.stack/programs/x86_64-linux/ghc-tinfo6-9.4.6/bin/ghc-pkg-9.4.6 --user --package-db=clear --package-db=global --package-db=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/pkgdb --libdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib --bindir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/bin --datadir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/share --libexecdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/libexec --sysconfdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/etc --docdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --htmldir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --haddockdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --dependency=Cabal=Cabal-3.8.1.0 --dependency=base=base-4.17.2.0 --dependency=bytestring=bytestring-0.11.5.1 --dependency=unix=unix-2.7.3 -f-use-pkg-config --exact-configuration --ghc-option=-fhide-source-paths
    Process exited with code: ExitFailure 1

Postgres dependency is missing

During the installation process, I discovered that the installation of Postgres was necessary. Although not a significant issue, I believe it would be beneficial to include this information in the readme.md file.

Specifically, the only requirement for installation is the libpq-dev package.

Installing on Mac M1

During stack install I get the same issue as in https://gitlab.haskell.org/ghc/ghc/-/issues/20592

I see that it is fixed in a newer GHC version, so I pass --resolver ghc-9.2.4 (I have also tried nightly).

Then I get multiple errors about dependencies:

Error: While constructing the build plan, the following exceptions were encountered:

In the dependencies for aeson-2.0.3.0:
[...]
In the dependencies for x509-validation-1.6.12:
    asn1-encoding must match >=0.9 && <0.10, but the stack configuration has no specified version  (latest matching
                  version is 0.9.6)
    asn1-types must match >=0.3 && <0.4, but the stack configuration has no specified version  (latest matching version
               is 0.3.4)
    x509 must match >=1.7.5, but the stack configuration has no specified version  (latest matching version is 1.7.7)
needed due to gopro-0.1.0.0 -> x509-validation-1.6.12

Some different approaches to resolving this:

  * Recommended action: try adding the following to your extra-deps in /Users/xxx/code/gopro/stack.yaml:

- RSA-2.4.1@sha256:b52a764965cd10756646cc39eadcbc566e131181a75f2a13c621697f4b06d76b,2467
- ...

So I do as recommended.

Finally, I get the following error:

gpmf > Building executable 'gpmf' for gpmf-0.1.1.1..
gpmf > [1 of 2] Compiling Main
gpmf >
gpmf > /private/var/folders/k7/s7p_419d4x91m_mbj6ggxtqw0000gn/T/stack-b3826990cd58aa9d/gpmf-0.1.1.1/app/Main.hs:14:24: error:
gpmf >     Not in scope: ‘BL.putStrLn’
gpmf >     Perhaps you meant one of these:
gpmf >       ‘BL.putStr’ (imported from Data.ByteString.Lazy),
gpmf >       ‘BS.putStr’ (imported from Data.ByteString)
gpmf >     Module ‘Data.ByteString.Lazy’ does not export ‘putStrLn’.
gpmf >    |
gpmf > 14 |   either print (mapM_ (BL.putStrLn . maybe "" showDEVC . uncurry mkDEVC)) . parseGPMF =<< BS.readFile fn
gpmf >    |                        ^^^^^^^^^^^
Progress 1/2

Is it possible to use this tool on macOS with an M1 processor?
I installed it on x86_64 Linux without problems, but I prefer macOS.

gopro: Prelude.read: no parse

First of all, thank you for creating this tool!
It saved me from having to manually download each video from the cloud, now that I'm about to cancel the subscription.

Just wanted to flag an issue: the gopro sync and gopro fetchall commands were failing with gopro: Prelude.read: no parse

I eventually figured out the list of all media and ran the refresh command on each of them; after that, I was able to download them.

A specific issue example:

gopro refresh xxxxx
I: Processing batch of 1
gopro: Prelude.read: no parse

Not sure exactly why it's failing, but my library of 65 items had 2 errors.

Perhaps if you added a printout of the API response to the verbose output, it would be easier to figure out, as adding -v currently doesn't do much.

I've got a month left on my subscription, so I'm happy to help you figure out why the error happens before all my content is deleted from there.

Regards
Matt

Be able to pull only the meta data

Hi Dustin,
I would like to use your library to develop a desktop app like Google Drive, but for GoPro cloud,
because I have far too many videos to download them all.

What I mean by that is I would like to be able to sync my videos without needing to download them.
I want to see all the videos from the GoPro cloud on my local machine based on the metadata only, and be able to download an existing video or upload new ones.

But unfortunately your tool does not let me download only the metadata.
Is there a way I could achieve my goal using your tool?

Thank you!

Windows build failure

Hi, sorry for this newbie question, but I'm not so acquainted with the terminal and this kind of setup. Are there any simple steps I could take to solve these dependency errors?

(screenshot of the build errors)

failed to parse field 'extra-deps' when installing

I get this:
jaw@wormnethub:~/gopro$ sudo stack install Could not parse '/home/jaw/gopro/stack.yaml': Aeson exception: Error in $['extra-deps'][1]: failed to parse field 'extra-deps': (Invalid package identifier: "crypton-0.33@sha256:5e92f29b9b7104d91fcdda1dec9400c9ad1f1791c231cc41ceebd783fb517dee,18202","crypton-0.33@sha256:5e92f29b9b7104d91fcdda1dec9400c9ad1f1791c231cc41ceebd783fb517dee,18202") See http://docs.haskellstack.org/en/stable/yaml_configuration/
What should I do? I'm on Ubuntu 18.04 LTS.

Sync fails with Unauthorized 401 message

The app worked fine, but after a few hours the sync command started failing. I'm getting an Unauthorized 401 message.
I thought a token generated during authentication might have expired, so I tried the reauth command, then auth to start over, but neither fixed it. I'm wondering if they updated the API or revoked the API key you are using.
Did you face the same issue before? Could it be a DDoS protection mechanism?
It feels like it is CloudFront that is blocking me, but I can't figure out why.

Integrate with GoProX

Hi,

Would love to integrate with GoProX workflow.

Primarily to upload processed media to GoPro Plus. What would be the most lightweight way to make this happen (CLI only)?

Very much appreciate what you are doing!

Error after web refresh

I was unable to load the media: bad body: Problem with the value at json[3].height:

null

Expecting an INT

When syncing the first time, the web UI showed some content, but after a while a refresh gives this message (I got a couple of hundred items from 3119).
Is there a corrupt image in the GoPro cloud? How can I find the one causing this problem, or fix this?

SQLite database keeps a record of a medium even after deleting it via the official GoPro Cloud web interface

I found a file I didn't want to keep in the official GoPro Cloud web interface, so I deleted it. Even after using 'gopro sync', I realised the database still keeps a record of it.

I tried using 'gopro reprocess', even though it is not made for deletions (just in case), and it didn't fix it.

Firstly, is it somehow marked as deleted? Or is it a bug? Or maybe the tool is simply meant to always be used with the custom web interface?

Secondly, do you think just dropping the record from the media table would work to fix it?

P.S. Thanks for publishing your amazing tool!!

Clearing error trying to upload after using createupload command

First of all, thanks for the effort to build this tool. I have a back catalog of originals I want to upload to the GoPro cloud, so I've been trying to get my head around all the commands on my Mac.

I tried using the createupload command to queue up some uploads, but now when I run upload there is an error.

I: Have 71 media items to upload in 71 parts with a total of 3180 chunks (19080 MB)
I: Uploading "KRdpO2eB7E3Zp" in 1 chunks (6 MB)
gopro: <file with spaces>.jpg: getFileStatus: does not exist (No such file or directory)

Even if I try to use the upload command and supply a different path of files it still fails with the same error.

Is there a way to clear the queue and try createupload again?

Lastly, is there a way to use either upload or createupload and choose to include subdirectories (recursively)? My footage is in folders like:

/Volumes/WD/2019/
 -> 2019-09-21/HERO4 Silver 1
 -> 2019-09-22/HERO4 Silver 1
 -> 2019-09-23/HERO4 Silver 1
 -> 2019-09-24/HERO4 Silver 1

It seems as though I have to enter the path name of the actual GoPro MP4 files.

I was hoping for something like:

gopro createupload /Volumes/WD/2019 -R
gopro upload

Web app with 'gopro serve' does not work

Hello! I started using your tool since I found it very useful, and I'm trying to make it work.

When I use 'gopro serve', the web server starts but going to 'localhost:8008' only brings up 'Something went wrong' and the console displays 'static/index.html: withBinaryFile: does not exist (No such file or directory)'.

Do you know if it is a setup issue or a bug?

gopro serve - web-server blocks when starting up

Hello, and first things first: thanks for the huge work. This tool could be very time-saving for a lot of people!

I'm working on a Mac M1 with Big Sur 11.0.1.

I've installed ffmpeg and followed the instructions provided here. All seemed to have worked fine.
I've also added the gopro user directory to my PATH, as suggested.

Unfortunately, when I launch the web UI with gopro serve, the terminal remains blocked with no other information (even with --verbose).

I: Starting web server

Syncing seems to work fine (even if not all files are read; maybe the ReelSteadyGO ones).
I would like to use the web GUI to batch-download date-filtered files.

How could I fix it? Am I missing something?

Thank you so much

Thanks Dustin for the amazing work! Can't understand why I added only the 10th star...

Could not find gpmf-0.1.1.1 on Hackage

Following the installation instructions today, I got the following error on stack install:

_@_:~/gopro$ stack install
Cabal file info not found for gpmf-0.1.1.1, updating
Selected mirror https://hackage.haskell.org/
Downloading timestamp
Waiting to acquire cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
Acquired cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
Released cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
No package index update available and cache up to date
Package index cache populated
Could not find gpmf-0.1.1.1 on Hackage
Perhaps you meant hmp3, html, time, stm, pqc, gd, exif, Hmpf, hpc, or Diff?

I installed stack from scratch, after having failed in the same way with the package-manager-installed stack.

GoPro Upload Issue: Excessive Requests, Directory Restructuring, and Non-existent File Errors

Description:

I have been using an application to upload a large number of files from various sources to the GoPro cloud. This application has been a godsend, helping me consolidate and upload files that were scattered across multiple clouds and drives, some of which I was unsure if I had already uploaded to the GoPro cloud. However, I have encountered several issues and have had to create scripts in Haskell to support my use case and extend the functionality of the application.

Background:

I have been trying to rectify a lack of system organization dating back to 2017. I have a large number of files and exports scattered across different clouds and drives, and I am trying to ensure that all these files are backed up to the GoPro cloud for peace of mind in case of HDD failure.

Enhancements Made:

  1. Developed a Haskell script to automate the GoPro upload process from a shell script on Mac. The script notifies the user upon completion and triggers an alert that requires user interaction to finish, ensuring a smooth workflow.
  2. Created a validation script to ensure upload paths do not contain spaces, which can be problematic when sourcing from Google Drive due to the unremovable space in "My Drive".
  3. Wrote a script to flatten a directory's nested folders, addressing the issue of previously unexplainable organization methods.
  4. Developed a script to partition a directory into subdirectories of 30 files each, as GoPro seems to prefer this number and it matches the maximum number of notifications shown on the status upload page when a duplicate is encountered.
  5. Extended GoPro upload functionality with a script that accepts nested directories and executes uploads on each directory sequentially.
  6. Implemented a sync script to log out to a specified directory.

Current Problem:

After restructuring directories to flatten all files and then partition them back sequentially by filename into chunks of 30, I encountered an issue. The GoPro upload command is failing with an error stating that the file it's trying to finish does not exist. This is presumably because the file was moved or deleted during the partitioning process. This issue arose when I attempted to upload around 250 files out of 4TB of footage.

Potential Solution:

I am considering running the GoPro cleanup command to stop expecting uploads from the desktop. However, I am concerned that this might also stop and close files that have finished the uploading phase and are still in processing. A safer alternative might be to inform the server to stop waiting for an upload that is no longer coming, while ensuring that the server continues to process and finish any files that are currently in a processing state. I am looking for confirmation or a workaround for this issue.

Individual/subset media download command

Introduce a command similar to backuplocal, but allow specifying the media to be downloaded.

Specification for downloads may be medium IDs or perhaps some kind of query (e.g., time relative or maybe even a small expression language).

Unsupported Parser

Would really like an alternative to the unworkable GoPro App, and I'm more than comfortable with a CLI, but I don't know any Haskell, so I'm stumped by this error.

Having followed the install instructions, authenticated, and run the first 'gopro sync', it worked its way through more than 200 of the 244 items, and now I get:

gopro -v sync
D: Reading auth token from DB
I: 0 new items
I: Fetching meta 0
D: Need meta: []
I: Updating ("o6Ra0w75aMqnV",GPMF)
gopro: unsupported parser: 'S'
CallStack (from HasCallStack):
error, called at src/GoPro/GPMF.hs:129:20 in gpmf-0.1.0.3-350fHnVuGmDJDL1tyyYPbe:GoPro.GPMF

If you're able to shed any light on this so I can complete the sync, that would be appreciated.

Thanks for sharing these tools

Consider the possibility of downloading only originals

Hi,

I'm using your fabulous program to create a local backup of my GoPro Plus data on a Synology NAS. Everything works perfectly so far. But as the program downloads every version of the GoPro Plus content, the downloaded data is quite large.

I'd prefer to download only the original files, ignoring the proxy files and thumbnails. Is this feasible?

Thanks for your work!
Michael

Reuploading restarts the whole process

I am using the software through Docker. As I have recently subscribed to Plus, I am trying to upload my existing library that I have stored locally. When I upload the files, it works fine until there is an error (typically a timeout). If I run the upload again, it restarts the whole process: it creates new upload IDs and starts over, including the files that were already uploaded successfully, instead of continuing the remaining unfinished uploads. I assume it could be due to me running through Docker.

Using the following command:
docker run --interactive --volume $PWD:/usr/lib/gopro --volume "path/to/the/media:/data" --rm dustin/gopro:master gopro upload /data -v

Could there be other files that should be kept persistent for it to retain the current upload status?

Unexpected MediumType: "MultiClipEdit" while performing first sync

I've installed it on a Raspberry Pi, and I'm trying to run it for the first time.

I've performed the auth steps, and now I'm trying to run the sync command for the first time; it's failing with the following error:

$ gopro sync -v
D: Reading auth token from DB
gopro: JSONError "Error in $.type: Unexpected MediumType: \"MultiClipEdit\""

The same error happens when trying to run fetchall.

It's a type of media that was created using the GoPro app and was available in the GoPro Media Library, filtering by Edits. I've already deleted it from there, but it still fails.

Please let me know if there are additional ways to debug the issue.

GoPro Max files are not supported

Hi,

I want to sync my GoPro Max files with this program, but I get the following error:

I: Ignoring some unknown files: ["GS010128.1.360"]
I: Have 0 media items to upload in 0 parts with a total of 0 chunks (0 MB)

Troubleshooting Request - metadata syncing to bucket but not media

First, thank you for this. I was looking for a solution to back up GoPro videos from their portal directly to S3.

I believe I've set up the Lambda, SQS queue, and bucket as required, with appropriate perms. When running a gopro backup, I appear to be pushing all metadata to the S3 bucket, but no videos.

The tool is looping with this status:
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying

My function logging indicates the download request is being made:
(screenshot of Lambda function logs)

The SQS queue is only showing empty receives.
(screenshot of the SQS console)

Wondering if this info is enough for a quick pointer on how to dig further. Any assistance appreciated - thank you for the tool!

Internal Server Error using Docker image

Steps to reproduce:

$ docker run -it --entrypoint /bin/sh dustin/gopro:master
$ gopro auth
Enter email: [email protected]
Enter password:
# gopro sync -v
D: Reading auth token from DB
gopro: HttpExceptionRequest Request {
  host                 = "api.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("Accept-Language","en-US,en;q=0.9"),("Origin","https://plus.gopro.com"),("Referer","https://plus.gopro.com/"),("User-Agent","github.com/dustin/gopro-plus 0.6.0.3")]
  path                 = "/media/search"
  queryString          = "?fields=captured_at,created_at,file_size,id,moments_count,ready_to_view,source_duration,type,token,width,height,camera_model&order_by=created_at&per_page=100&page=0"
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
  proxySecureMode      = ProxySecureWithConnect
}
 (StatusCodeException (Response {responseStatus = Status {statusCode = 500, statusMessage = "Internal Server Error"}, responseVersion = HTTP/1.1, responseHeaders = [("Content-Type","application/json; charset=UTF-8"),("Content-Length","46"),("Connection","keep-alive"),("Date","Tue, 09 Apr 2024 21:06:59 GMT"),("Server","nginx"),("X-Request-Id","eac39c6e21170d95324ce85cb5d23f31"),("X-Runtime","0.021540"),("Vary","Accept-Encoding, Origin"),("Access-Control-Allow-Origin","https://plus.gopro.com"),("Access-Control-Allow-Credentials","true"),("Strict-Transport-Security","max-age=31536000; includeSubDomains"),("X-Cache","Error from cloudfront"),("Via","1.1 08c5e904e2f0226b2d9c1417f32b12f2.cloudfront.net (CloudFront)"),("X-Amz-Cf-Pop","ZRH50-C1"),("X-Amz-Cf-Id","u_3ctzxlejLhyrAsBAbk0FZpMYmGHtIHj1c9AbSwtgN7OO5pOQyEIg==")], responseBody = (), responseCookieJar = CJ {expose = []}, responseClose' = ResponseClose, responseOriginalRequest = Request {
  host                 = "api.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("Accept-Language","en-US,en;q=0.9"),("Origin","https://plus.gopro.com"),("Referer","https://plus.gopro.com/"),("User-Agent","github.com/dustin/gopro-plus 0.6.0.3")]
  path                 = "/media/search"
  queryString          = "?fields=captured_at,created_at,file_size,id,moments_count,ready_to_view,source_duration,type,token,width,height,camera_model&order_by=created_at&per_page=100&page=0"
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
  proxySecureMode      = ProxySecureWithConnect
}
}) "{\"status\":500,\"error\":\"Internal Server Error\"}")
