
Comments (4)

vote539 commented on May 21, 2024

It might be worthwhile fiddling with the chunkSize parameter. The server writes the buffers directly to disk as soon as it receives them, so memory shouldn't be a problem. Bandwidth is most likely going to be the bottleneck when handling multiple simultaneous uploads. It would also be useful to do stress-testing on Socket.IO directly to see how much data it can handle being passed through Web Sockets at once.
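
For context, this is the library's documented server-side pattern; a minimal sketch (the upload path is a placeholder) showing how incoming chunks are flushed to disk as they arrive:

```js
var SocketIOFileUpload = require("socketio-file-upload");
var socketio = require("socket.io");

var server = require("http").createServer().listen(3000);
var io = socketio(server);

io.on("connection", function (socket) {
    var uploader = new SocketIOFileUpload();
    uploader.dir = "/srv/uploads"; // buffers are written straight to this directory
    uploader.listen(socket);

    uploader.on("saved", function (event) {
        console.log("Saved:", event.file.pathName);
    });
    uploader.on("error", function (event) {
        console.log("Upload error:", event.memo);
    });
});
```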


ewanwalk commented on May 21, 2024

This seemed to be a bug in Socket.IO v1.3.7 and is resolved as of 1.4.5. However, the server still cannot seem to handle more than one or two large files at a time.


ewanwalk commented on May 21, 2024

It actually seems that the CPU is the bottleneck. I'm trying an approach where I use sticky sessions to cluster Socket.IO and upload to the workers, essentially giving me more throughput overall.
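
A rough sketch of that clustering approach, assuming the sticky-session npm package (the exact setup isn't named in the thread). Sticky routing pins each client to a single worker, so every Socket.IO handshake and its upload chunks land on the same process:

```js
var http = require("http");
var sticky = require("sticky-session");
var socketio = require("socket.io");
var SocketIOFileUpload = require("socketio-file-upload");

var server = http.createServer();

if (!sticky.listen(server, 3000)) {
    // Master process: sticky.listen() has forked the workers.
    server.once("listening", function () {
        console.log("Master listening on port 3000");
    });
} else {
    // Worker process: each worker runs its own Socket.IO instance,
    // spreading upload CPU load across cores.
    var io = socketio(server);
    io.on("connection", function (socket) {
        var uploader = new SocketIOFileUpload();
        uploader.dir = "/srv/uploads";
        uploader.listen(socket);
    });
}
```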

The problem I face is that I cannot set a limit on upload throughput (it currently maxes out at about 30 MB/s across my network). This holds up for roughly 2-3 simultaneous files before the server can no longer keep up and errors while reading a file's size. I suspect that limiting the rate would greatly reduce overall CPU usage, since less data would be sent at once.

I will note that small files are not an issue.

On another note, adding a built-in queue would possibly be optimal, e.g. maxParallelUploads = 2.
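
maxParallelUploads does not exist in the library; it is the commenter's proposal. A hypothetical client-side queue that emulates it by wrapping submitFiles() could look like this (the element ID and the limit are placeholders):

```js
var socket = io.connect();
var siofu = new SocketIOFileUpload(socket);

var maxParallelUploads = 2; // proposed limit
var active = 0;
var pending = [];

function drain() {
    // submitFiles() accepts an array-like of File objects.
    while (active < maxParallelUploads && pending.length > 0) {
        active++;
        siofu.submitFiles([pending.shift()]);
    }
}

// Free a slot whether the transfer completed or errored out.
function release() {
    active--;
    drain();
}
siofu.addEventListener("complete", release);
siofu.addEventListener("error", release);

document.getElementById("file_input").addEventListener("change", function (ev) {
    for (var i = 0; i < ev.target.files.length; i++) {
        pending.push(ev.target.files[i]);
    }
    drain();
});
```

Capping parallelism this way also bounds aggregate throughput, which should help with the CPU saturation described above.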


raulrene commented on May 21, 2024

> It might be worthwhile fiddling with the chunkSize parameter. The server writes the buffers directly to disk as soon as it receives them, so memory shouldn't be a problem. Bandwidth is most likely going to be the bottleneck when handling multiple simultaneous uploads. It would also be useful to do stress-testing on Socket.IO directly to see how much data it can handle being passed through Web Sockets at once.

I actually managed to get it working using this suggestion. I was using a 64 KB chunk size and reduced it way down to 5 KB, and now I'm no longer getting the transport-closed error. It used to happen on large files, especially when more than one large file was uploaded at once.
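
For reference, chunkSize is a property on the client-side instance, in bytes (the default is on the order of 100 KB); the reduction described above would look like this:

```js
var socket = io.connect();
var siofu = new SocketIOFileUpload(socket);

// Smaller chunks mean more per-chunk overhead but smaller socket
// writes; dropping from 64 KB to 5 KB avoided the transport-close error.
siofu.chunkSize = 1024 * 5;

siofu.listenOnInput(document.getElementById("file_input"));
```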

