dropbox-v2-api's People

Contributors

adamplocieniak, adasq, alan-null, cdhowie, dependabot[bot], mattiasrunge, nieapolitycznie, reinhardholl, simonegosetto


dropbox-v2-api's Issues

Error: Cannot create a string longer than 0x1fffffe8 characters

I'm trying to download a 737 MB file and getting this:

buffer.js:799
    return this.utf8Slice(start, end);
                ^

Error: Cannot create a string longer than 0x1fffffe8 characters
    at Buffer.toString (buffer.js:799:17)
    at Request.<anonymous> (/home/dev/backup-manager/node_modules/dropbox-v2-api/node_modules/request/request.js:1135:39)
    at Request.emit (events.js:327:22)
    at IncomingMessage.<anonymous> (/home/dev/backup-manager/node_modules/dropbox-v2-api/node_modules/request/request.js:1083:12)
    at Object.onceWrapper (events.js:421:28)
    at IncomingMessage.emit (events.js:327:22)
    at endReadableNT (internal/streams/readable.js:1327:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  code: 'ERR_STRING_TOO_LONG'
}

I tried increasing --max-old-space-size, but it doesn't help.

No way of canceling a download

Hi, thanks for the library. I was wondering if there is a way to cancel a download once it has started?

I am working with this here:
dropboxInstance({
  resource: 'files/download',
  parameters: {
    path: filePath,
  },
}, (err, result) => {
  if (err) return console.log(err);
  console.log('file downloaded!', result);
});

I also managed to get download progress by piping everything. But I am unable to cancel the download once started.
Thanks for your help.

Move DROPBOX_TOKEN to a configuration

An idea for an improvement: instead of reading DROPBOX_TOKEN from an environment variable,

process.env.DROPBOX_TOKEN

it would be better to read it from a configuration file:

const config = require('./../config.json');
const token = config.DROPBOX_TOKEN;

global config breaks concurrent use

Hi,

thanks for this wrapper package around the Dropbox v2 HTTP endpoints.
I really like your efforts to support streams (and I like the simplicity of this package compared to wrapping all API calls in separate methods). I like this more than the official JavaScript SDK from Dropbox, and it feels like it stays closer to the actual HTTP endpoints.

I have a question though, before I start using this package.

When you use dropbox.authenticate(config), this config is stored globally. I'm afraid that using this package in a system that authenticates several users simultaneously, might break things.

Say the system authenticates userA using his generated (and stored) accessToken, makes a few calls against the Dropbox API, and then authenticates userB using another accessToken. All subsequent calls for userA will then use userB's accessToken...

I don't want to call authenticate before every call I make, certainly not when I'm chaining calls (using results of a previous request to start a new request).

I think your package would benefit from a change to localize the config to an instance of your dropbox api-wrapper, so you can start an instance for any user.

What are your thoughts on this?

Tidy async/await solution for avoiding callback hell

For anyone using modern browsers/node, this should help to make the examples usable.

The secret lies in util.promisify!

const util = require('util')
const dbAccessToken = 'YOURTOKEN'
const dropbox = require('dropbox-v2-api').authenticate({ token: dbAccessToken })
const dbSync = util.promisify(dropbox)

async function getAcc () {
  const params = { resource: 'users/get_current_account' }
  try {
    return (await dbSync(params))
  } catch (err) {
    console.error(err)
  }
}

async function test () {
  console.log(await getAcc())
}

test()

Hope this helps someone else avoid a good couple of hours struggling with why the returns were returning undefined!

How does this compare to the official SDK?

I am just getting into handling my Dropbox with some TypeScript code. I found your library and dropbox-sdk-js. As I did not find a comparison between the two, I'd like to ask:

Why should I use your library over the official one?

Thank you for your time. 🙏

Cannot find /dist/api.json

When trying to import the Dropbox API (const dropboxV2Api = require('dropbox-v2-api');)

I get this error:

Error: Could not init action on container (POST http://0.0.0.0:32785/init): responded with error: Initialization has failed due to: Error: ENOENT: no such file or directory, open '/dist/api.json'
    at Object.openSync (fs.js:439:3)
    at Object.readFileSync (fs.js:344:35)
    ...

It appears the module cannot resolve api.json, even though the file does appear to be in the node_modules/.../dist folder.

Can't use the library on the front end (React app), which is also using a Node environment

TypeError: fs.readFileSync is not a function
loadResourcesDescriptionList
C:/Users/EG/Desktop/NodeJs/FYP/MeshDrive/meshdrive-frontend/node_modules/dropbox-v2-api/src/dropbox-api.js:213
210 | function noop() {}
211 |
212 | function loadResourcesDescriptionList() {

213 | return JSON.parse(fs.readFileSync(RESOURCES_DESCRIPTION_PATH));
214 | }
215 |
216 | function createDefaultRequestOptObject(resourceDescription){

Please add a simple Upload Session

Hello,

I am the developer of the ioBroker.backitup adapter. We use your API in the adapter.

Our project has almost 60,000 installations, and unfortunately users complain about the limited ability to upload files larger than 150 MB.

It would be great if you could integrate a simple upload session.
Currently it is very cumbersome, as far as I can see.

An example can be found in the OneDrive API.

There is an uploadSimple and an uploadSession.
With the upload session I only have to specify the file size, and the rest is done by the onedrive-api package.

https://github.com/dkatavic/onedrive-api/blob/master/lib/items/uploadSession.js
https://github.com/dkatavic/onedrive-api#itemsuploadsession

It would be really great if you could integrate this here in the API as well.

Large file download out of memory

I was testing the library and ran into this error after trying to download a 10 GB file. The file actually downloads correctly (even though it buffers the asset in memory, so I'm not sure how it would handle a 100 GB file), but this looks like something that should be addressed.

Running on Node 10, Ubuntu 18, 64GB RAM

<--- Last few GCs --->

[25986:0x3d60660]   260762 ms: Mark-sweep 110.2 (117.4) -> 110.2 (117.4) MB, 51.8 / 0.0 ms  (average mu = 0.966, current mu = 0.000) last resort GC in old space requested
[25986:0x3d60660]   260803 ms: Mark-sweep 110.2 (117.4) -> 110.2 (117.4) MB, 40.5 / 0.0 ms  (average mu = 0.936, current mu = 0.000) last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x284b725be1d]
Security context: 0x1c5fd0ae9929 <JSObject>
    1: stringSlice(aka stringSlice) [0x27dc1a31c401] [buffer.js:591] [bytecode=0x27dc1a351e89 offset=11](this=0x03d61b5826f1 <undefined>,buf=0x36dd711c6ab9 <Uint8Array map = 0x1416a7c516c9>,encoding=0x03d61b5826f1 <undefined>,start=0,end=1052822574)
    2: toString [0x27863789d531] [buffer.js:668] [bytecode=0x27dc1a351979 offset=145](this=0x36dd711c6ab9 <Uint...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: 0x8daaa0 node::Abort() [node]
 2: 0x8daaec  [node]
 3: 0xad73ce v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xad7604 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xec4c32  [node]
 6: 0xed444f v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [node]
 7: 0xea3ffb v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [node]
 8: 0xea4902 v8::internal::Factory::NewStringFromUtf8(v8::internal::Vector<char const>, v8::internal::PretenureFlag) [node]
 9: 0xae5419 v8::String::NewFromUtf8(v8::Isolate*, char const*, v8::NewStringType, int) [node]
10: 0x99e5a8 node::StringBytes::Encode(v8::Isolate*, char const*, unsigned long, node::encoding, v8::Local<v8::Value>*) [node]
11: 0x8f6688  [node]
12: 0xb5faef  [node]
13: 0xb60659 v8::internal::Builtin_HandleApiCall(int, v8::internal::Object**, v8::internal::Isolate*) [node]
14: 0x284b725be1d
Aborted (core dumped)

upload session start/append/finish

I'm working on using upload session start, append and finish for files over 150 MB. I'm getting this error: sessionStart error: {}.
I have a file on my Node server I'm trying to upload. I believe the error has to be in the variables firstUploadChunkStream and secondUploadChunkStream; I'm new to Node and streams and might not be understanding what is needed. See my code below, thank you.

const axios = require('axios');
const {dropboxAccessToken} = require('../../config/dev');
const {dropboxKey} = require('../../config/dev');
const {dropboxSecret} = require('../../config/dev');
const fs = require('fs');
const multer = require('multer');
const upload = multer({dest: 'temp_files_to_upload'});

const util = require('util');
const dropbox = require('dropbox-v2-api').authenticate({ token: dropboxAccessToken });

module.exports = app => {

app.post('/api/testupload/:file/:size', upload.single('myFile'), (req, res) => {
    console.log('begin upload process...');
    const deleteTempFile = (response) => {
        if(response) {
            console.log('file has been successfully uploaded to dropbox...');
            console.log('deleting temp file from /temp_files_to_upload...');
            (async () => {
                try {
                    await fs.promises.unlink(`temp_files_to_upload/${req.file.filename}`);
                }   
                catch(error) {
                    console.log(error);
                }
            })();
        }
    }


    const getResponseFromBigFile = () => {
        const CHUNK_LENGTH = 100;
        
        const firstUploadChunkStream = () => fs.createReadStream(`temp_files_to_upload/${req.file.filename}`, {'1': CHUNK_LENGTH});
        const secondUploadChunkStream = () => fs.createReadStream(`temp_files_to_upload/${req.file.filename}`, {'2': CHUNK_LENGTH});

        sessionStart((sessionId) => {
            sessionAppend(sessionId, () => {
                sessionFinish(sessionId);
            });
        });

        function sessionStart(cb) {
            console.log('session start fired...');
            dropbox({
                resource: 'files/upload_session/start',
                parameters: {
                    close: false
                },
                readStream: firstUploadChunkStream()
            }, (err, result, response) => {
                if (err) { return console.log('sessionStart error: ', err) }
                console.log('sessionStart result:', result);
                cb(result.session_id);
            });
        }


        function sessionAppend(sessionId, cb) {
            console.log('session append fired...');
            dropbox({
                resource: 'files/upload_session/append',
                parameters: {
                    cursor: {
                        session_id: sessionId,
                        offset: CHUNK_LENGTH
                    },
                    close: false,
                },
                readStream: secondUploadChunkStream()
            }, (err, result, response) => {
                if(err){ return console.log('sessionAppend error: ', err) }
                console.log('sessionAppend result:', result);
                cb();
            });
        }

        function sessionFinish(sessionId) {
            console.log('session finish fired...');
            dropbox({
                resource: 'files/upload_session/finish',
                parameters: {
                    cursor: {
                        session_id: sessionId,
                        offset: CHUNK_LENGTH * 2
                    },
                    commit: {
                        path: `/media/${req.file.originalname}`,
                        mode: 'add',
                        autorename: true,
                        mute: false
                    }
                }
            }, (err, result, response) => {
                if (err) { return console.log('sessionFinish error: ', err) }
                console.log('sessionFinish result:', result);
                res.send(result);
            });
        }


    }


    try {
        const filePath = `temp_files_to_upload/${req.file.filename}`;

        //checks if file is uploaded to temp_files_to_upload folder
        fs.access(filePath, fs.constants.F_OK, async (err) => {
            if(err) {
                console.log('file has not been uploaded.');
            }
            //file exists, do dropbox call
            else {
                console.log('file has been uploaded to node server...');
                getResponseFromBigFile();
            }
        });
        
    }
    catch(error) {
      res.send(error);
    }

});

}

Sometimes not getting a reply

Sometimes I am not getting any response at all:

stdout: '>>> err {}\n>>> paramResult undefined\n>>> response undefined\n'

dropbox({
  resource  : 'files/upload',
  parameters: {
    path: options.path,
  },
  readStream: fs.createReadStream(options.input),
}, (err, paramResult, response) => {
  console.log('>>> err', err)
  console.log('>>> paramResult', paramResult)
  console.log('>>> response', response)
})

What could this be? Thanks in advance!

How to connect existing stream and react to events

I am trying to connect a file upload using Fastify and fastify-multipart and am struggling with how to connect the two. Basically I have a ReadableStream with an on() method for 'data', 'end', etc. So my 'data' handler might look like:

let fileSize = 0

// Obtain session ID here...
...

// Now handle the 'data' events
uploadedFileStream.on('data', async (chunk) => {
    fileSize += chunk.length
    await appendSession(emitter, fileSize, sessionId)
})

The problem is this results in a MaxListenersExceededWarning: Possible EventEmitter memory leak detected warning. Any idea how to correctly join this together without creating a new emitter for each chunk?

Thank you

It's not really an issue

So I managed to upload a file via Node.js.

But I don't know how to use the result to get a share link, so I can automatically send it somewhere.

Could you please help me?

const dropboxV2Api = require('dropbox-v2-api');
      // create session ref:
      const dropbox = dropboxV2Api.authenticate({
        token: 'XXX'
      });
     // use session ref to call API, i.e.:

dropbox({
  resource: 'files/upload',
  parameters: {
      path: '/dropbox/path/to/myvideo.mp4'
  },
  readStream: fs.createReadStream('./myvideo.mp4')
}, (err, result, response) => {
  if (err) { return console.log(err); }
  console.log(result);
})
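Once the upload callback fires, the returned metadata's path_lower can be fed into a second call against the 'sharing/create_shared_link_with_settings' endpoint (named per the Dropbox HTTP docs) to obtain a URL. A hedged sketch of that follow-up call, reusing the same session function:

```javascript
// Sketch: turn an uploaded file's path into a shared link.
// `dropbox` is assumed to be the authenticated session function above.
function createShareLink(dropbox, dropboxPath) {
  return new Promise((resolve, reject) => {
    dropbox({
      resource: 'sharing/create_shared_link_with_settings',
      parameters: { path: dropboxPath },
    }, (err, result) => (err ? reject(err) : resolve(result && result.url)));
  });
}

// Usage (inside the upload callback, assuming result.path_lower exists):
//   const url = await createShareLink(dropbox, result.path_lower);
//   console.log('share this:', url);
```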

Catching "missing_scope"

Err { error_summary: 'missing_scope/...',
  error: {
    '.tag': 'missing_scope',
    required_scope: 'files.content.write'
  },
  code: 401 }
Res null

Where would I have to specify a scope, and how? The Dropbox API docs say it is an optional parameter on authentication.

TypeScript support

Thanks for the module, it has helped quite a lot.
I am currently converting to TS, and there is neither native TS support nor am I able to install type definitions with
npm i --save-dev @types/dropbox-v2-api
Is there, or will there be, any TS support / type definition available?

Integration test fail (Docs change detection)

Steps to Reproduce the Problem

  1. clone repo
  2. run npm test

Expected Behavior

all tests green

Actual Behavior

  1) Docs change detection  contains equal content:
     Uncaught Error: expected '33826f2fa23e0006fde3e46d4553f864' to equal '409e85a994071597445387a6156859ae'
      at Assertion.assert (C:\repo\dropbox-v2-api\node_modules\expect.js\index.js:96:13)
      at Assertion.be.Assertion.equal (C:\repo\dropbox-v2-api\node_modules\expect.js\index.js:216:10)
      at Assertion.(anonymous function) [as be] (C:\repo\dropbox-v2-api\node_modules\expect.js\index.js:69:24)
      at generate (C:\repo\dropbox-v2-api\test\docs\docs-change-detection.js:20:57)
      at parseBody (C:\repo\dropbox-v2-api\src\generate-api-description.js:187:5)
      at Request._callback (C:\repo\dropbox-v2-api\src\generate-api-description.js:166:3)
      at Request.self.callback (C:\repo\dropbox-v2-api\node_modules\request\request.js:198:22)
      at Request.<anonymous> (C:\repo\dropbox-v2-api\node_modules\request\request.js:1035:10)
      at IncomingMessage.<anonymous> (C:\repo\dropbox-v2-api\node_modules\request\request.js:962:12)
      at endReadableNT (_stream_readable.js:1059:12)
      at _combinedTickCallback (internal/process/next_tick.js:138:11)
      at process._tickCallback (internal/process/next_tick.js:180:9)

Specifications

Env: Windows 10

Response body buffered in memory even when streaming

When passed a callback, the request module will buffer the entire response body in memory even when the request object is streamed elsewhere. This pseudo-code demonstrates the problem:

request(options, callback).pipe(someStream)

In this case, the response body is piped to someStream, but, because a callback is supplied to request(), the response body is also buffered in memory and delivered to the callback function. This defeats the entire purpose of streaming in the first place, and causes a much more serious issue: if a file larger than 1,073,741,799 bytes is downloaded, an uncatchable error is thrown from the request module, taking down the entire application!

Error: Cannot create a string longer than 0x3fffffe7 characters

This is the offending line in this module:

return request(requestOpts, callback).pipe(createTransformStream());

The callback must not be passed to this function.

No way to access response headers

Why would you need access to response headers? Because Dropbox sometimes puts information there. For example, if you get a 429 RateLimitError, the Retry-After header contains the number of seconds to wait before trying again.

Your app is making too many requests for the given user or team and is being rate limited. Your app should wait for the number of seconds specified in the "Retry-After" response header before trying again.

via https://www.dropbox.com/developers/documentation/http/documentation

Can you pass the response as a third argument to callbacks? I'll attempt a PR in a bit.
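With the response available as a third callback argument, a rate-limit handler could read the header the Dropbox docs mention. The header parsing below is pure and self-contained; the 'retry-after' name comes from the documentation quoted above, and the response shape is the usual Node lowercased-headers object:

```javascript
// Sketch: derive a retry delay from a 429 response's Retry-After header.
function retryDelayMs(response, fallbackMs = 1000) {
  const raw = response && response.headers && response.headers['retry-after'];
  const seconds = Number.parseInt(raw, 10);
  return Number.isFinite(seconds) ? seconds * 1000 : fallbackMs;
}

console.log(retryDelayMs({ headers: { 'retry-after': '3' } })); // 3000
console.log(retryDelayMs({ headers: {} }));                     // 1000
```

A caller would check err.code === 429, wait retryDelayMs(response), and reissue the same request.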

Uploading data

Is there a way to upload a string of data rather than a file?

Err

Error: Error in call to API function "files/upload": HTTP header "Dropbox-API-Arg": path: 'pack.zip' did not match pattern '(/(.|[\\r\\n])*)|(ns:[0-9]+(/.*)?)|(id:.*)'

Code:

dropbox({
  resource: 'files/upload',
  parameters: {
    path: 'pack.zip'
  },
  readStream: fs.createReadStream('pack.zip')
}, (err, result, response) => {
  // upload completed
  if (err) {
    return console.log(err);
  }
  console.log(result);
});
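The pattern in the error message only accepts paths of the form '/...', 'id:...' or 'ns:<number>...', so a bare 'pack.zip' matches none of them; the Dropbox path needs a leading slash. A small guard that normalizes it:

```javascript
// Guard: Dropbox paths must start with '/', 'id:' or 'ns:<number>'.
// A bare filename like 'pack.zip' gets a '/' prefixed.
function toDropboxPath(p) {
  return /^(\/|id:|ns:[0-9]+)/.test(p) ? p : '/' + p;
}

console.log(toDropboxPath('pack.zip'));  // /pack.zip
console.log(toDropboxPath('/pack.zip')); // /pack.zip
```

Note the fs.createReadStream('pack.zip') argument is a local filesystem path and is unaffected; only the parameters.path value sent in Dropbox-API-Arg must match the pattern.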

Dependency modules

Hi, I have an issue with your module. It depends on "request": "2.67.0", and that version of request pulls in the node-uuid and tough-cookie modules, which cause problems for me: I can't compile my Electron app.

npm WARN deprecated [email protected]: ReDoS vulnerability parsing Set-Cookie https://nodesecurity.io/advisories/130
npm WARN deprecated [email protected]: Use uuid module instead

Please update your request dependency to the latest version, where all these problems are resolved.
Thanks.

Getting a corrupted zip error using default settings.

Error: Corrupted zip : can't find end of central directory

This is the function I am using to download a file. It works most of the time, but sometimes I get that error. Am I doing something wrong here? Thanks in advance.

function getExcel(path, localFileName) {
  return new Promise((resolve, reject) => {
    dropboxV2({
      resource: 'files/download',
      parameters: {
        path: path
      }
    }, (err, result, response) => {
      if (err) {
        console.log(err);
        reject(err);
      } else {
        console.log(`Downloaded ${localFileName}`);
        resolve();
      }
    }).pipe(fs.createWriteStream(localFileName));
  }).catch((err) => {
    console.log(err);
  });
}

Special character handling

On line 43 there needs to be character replacement for umlauts and other special characters:

requestOpts.headers[DB_HEADER_API_ARGS] = isObject(userParameters) ? JSON.stringify(userParameters): '';

from: dropbox/dropbox-sdk-js@585c0f2

requestOpts.headers[DB_HEADER_API_ARGS] = isObject(userParameters) ? JSON.stringify(userParameters).replace(/[\u007f-\uffff]/g, (c) => '\\u' + ('000' + c.charCodeAt(0).toString(16)).slice(-4)): '';
