
aws-s3-multipart-upload's Introduction

AWS-S3-multipart-upload

Multipart Upload allows faster, more flexible uploads into Amazon S3. It lets you upload a single object as a set of parts; after all parts of the object are uploaded, Amazon S3 presents the data as a single object.

Multipart uploading is a three-step process: you initiate the upload, you upload the object parts, and after you have uploaded all the parts, you complete the multipart upload.
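The arithmetic behind step two (slicing the file into numbered parts) can be sketched with a small helper. The function name is illustrative, not part of this repo; the 5 MiB floor reflects S3's documented minimum part size for every part except the last:

```javascript
// Split a file of `fileSize` bytes into S3 upload parts of `partSize` bytes.
// S3 requires every part except the last to be at least 5 MiB.
const MIN_PART_SIZE = 5 * 1024 * 1024;

function getPartRanges(fileSize, partSize) {
  if (partSize < MIN_PART_SIZE) {
    throw new Error('part size must be at least 5 MiB');
  }
  const ranges = [];
  for (let start = 0; start < fileSize; start += partSize) {
    ranges.push({
      partNumber: ranges.length + 1,             // S3 part numbers are 1-based
      start,
      end: Math.min(start + partSize, fileSize)  // exclusive, like Blob.slice()
    });
  }
  return ranges;
}
```

In the browser, each `{start, end}` pair maps directly onto `file.slice(start, end)` when uploading that part.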

Prerequisite

NODE - 10.x
NPM - 6.x

Install Nodemon to launch a server: npm install -g nodemon

Once cloned, go to the project folder:

cd frontend/
npm install
npm run start

cd backend/
npm install
npm run start

If you are interested in a code walkthrough, you can go through the YouTube video where I explain how this works: https://www.youtube.com/watch?v=42Fq6aJvX0w

aws-s3-multipart-upload's People

Contributors

abhishekbajpai, dependabot[bot]


aws-s3-multipart-upload's Issues

not getting etag header

Hi Abhishek,

I am not getting the headers. I implemented it the same way you did, but the response headers do not contain the ETag, so I cannot read it.
I am getting an error like: cannot read property "header" of null.
Could you please help me?

      let uploadPartsArray = []
      resolvedArray.forEach((resolvedPromise, index) => {
        uploadPartsArray.push({
          ETag: resolvedPromise.headers.etag,
          PartNumber: index + 1
        })
      })
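A common cause of `resolvedPromise.headers.etag` coming back undefined in the browser is that S3 does not expose the `ETag` response header to cross-origin JavaScript unless the bucket's CORS configuration explicitly lists it in `ExposeHeaders`. A bucket CORS rule along these lines (the origin is a placeholder) makes the header readable from `axios` responses:

```json
[
  {
    "AllowedOrigins": ["https://your-app.example.com"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedHeaders": ["*"],
    "ExposeHeaders": ["ETag"]
  }
]
```

Without `ExposeHeaders`, the PUT succeeds but the response's headers object simply lacks the ETag, which then surfaces as errors like the one above.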

Getting an error: not able to upload a 500 MB video

Hi, I need urgent help. I am trying to upload a video of size 500 MB, but all the requests to the S3 server are failing. I don't know why, because the code is the same as yours. Kindly help.

Can't upload file chunks with presigned URL as you suggested

Hi, I need some help. I watched your video tutorial and I'm trying to imitate it for my project.
I'm using AWS Lambda as the backend and have modified my code slightly, as follows:

HTML

<input  type="file"  id="multipartInput">
<button  id="multipartInputBtn">send file</button> 

JavaScript

document.getElementById('multipartInputBtn').addEventListener('click', async () => {
	const multipartInput_fileInput = document.getElementById('multipartInput');
	const file = multipartInput_fileInput.files[0];
	const fileName = file.name;
	const fileSize = file.size;
	const url = `https://uniquestring.execute-api.ap-south-1.amazonaws.com/dev`;

	try {
		let res = await axios.post(`${url}/getUploadId`, { fileName: fileName });     // << here it shows the error
		const uploadId = res.data.uploadId;
		console.log(res);
		console.log('Inside uploadMultipartFile');
		const chunkSize = 100 * 1024 * 1024; // 100MiB
		const chunkCount = Math.floor(fileSize / chunkSize) + 1;
		console.log(`chunkCount: ${chunkCount}`);

		let multiUploadArray = [];

		for (let uploadCount = 1; uploadCount < chunkCount + 1; uploadCount++) {
			let start = (uploadCount - 1) * chunkSize;
			let end = uploadCount * chunkSize;
			let fileBlob = uploadCount < chunkCount ? file.slice(start, end) : file.slice(start);

			let getSignedUrlRes = await axios.post(`${url}/getUploadPart`, {
				fileName: fileName,
				partNumber: uploadCount,
				uploadId: uploadId
			});
			let preSignedUrl = getSignedUrlRes.data.preSignedUrl;
			console.log(`preSignedUrl ${uploadCount} : ${preSignedUrl}`);
			console.log(fileBlob);
			// Start sending files to S3 part by part

			let uploadChunk = await axios.put(preSignedUrl, fileBlob)   // << here it shows the error ("presignedUrl" was a typo for "preSignedUrl")
			console.log(uploadChunk)
			

		}
		
	} catch (err) {
		console.log(err, err.stack);
	}
});

I'm successfully able to get pre-signed URLs for all the parts, but I'm not able to upload to them. Could you please tell me what I should do here to improve the process?

(Screenshot: S3 operations sending files, Google Chrome, 10-03-2021)
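One thing the loop above never does is finish the multipart upload: each PUT to a presigned URL returns an `ETag` header, and those must be collected (that is what the declared but unused `multiUploadArray` is for) and sent to a CompleteMultipartUpload call on the backend. A sketch of the collection step, assuming the response shape axios returns (the function name is illustrative):

```javascript
// Build the Parts list CompleteMultipartUpload expects, from the
// axios responses returned by each presigned-URL PUT (in part order).
function buildPartsList(putResponses) {
  return putResponses.map((res, index) => {
    const etag = res.headers.etag;
    if (!etag) {
      // Usually means the bucket's CORS config does not expose the ETag header.
      throw new Error(`missing ETag for part ${index + 1}`);
    }
    return { ETag: etag, PartNumber: index + 1 };
  });
}
```

The resulting array is what the backend passes as `MultipartUpload: { Parts }` when completing the upload; until that call is made, S3 never assembles the parts into an object.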

Not able to upload files larger than 200 MB; getting error: Request failed with status code 400

I am using the same code, but for files greater than 200 MB I am getting this error:
Request failed with status code 400

data: 'RequestTimeout: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.'
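`RequestTimeout` means S3 accepted the connection but no data was read or written within its idle window, which can happen when a large file is sliced into many parts whose uploads queue up behind each other. One common mitigation is to upload parts with a small fixed concurrency instead of all at once or strictly one by one. A generic concurrency limiter (illustrative, not this repo's code) could look like:

```javascript
// Run async `tasks` (functions returning promises) with at most
// `limit` of them in flight at once; resolves to results in task order.
async function runWithConcurrency(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;              // claim the next unclaimed task index
      results[i] = await tasks[i]();
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

Each task here would wrap one `axios.put(preSignedUrl, fileBlob)`; a limit of 3–5 keeps parts moving without starving any single socket.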

How to set up the project?

Thanks for making this available. Can you provide some documentation on setting up the project? I looked at your video but could not find the setup steps.
