
aws-sdk-js's Introduction

AWS SDK for JavaScript


Version 2.x Upcoming End-of-Support

We announced the upcoming end-of-support for AWS SDK for JavaScript v2. We recommend that you migrate to AWS SDK for JavaScript v3. For dates, additional details, and information on how to migrate, please refer to the linked announcement.

The AWS SDK for JavaScript v3 is the latest and recommended version, which has been GA since December 2020. Here is why and how you should use AWS SDK for JavaScript v3. You can try our experimental migration scripts in aws-sdk-js-codemod to migrate your application from v2 to v3.

To get help with your migration, please follow our general guidelines to open an issue and choose guidance. To give feedback on and report issues in the v3 repo, please refer to Giving feedback and contributing.

Watch this README and the AWS Developer Tools Blog for updates and announcements regarding the maintenance plans and timelines.

A maintenance mode message may be emitted by this package on startup. To suppress this message, use an environment variable:

AWS_SDK_JS_SUPPRESS_MAINTENANCE_MODE_MESSAGE=1 node my_program.js

or a JavaScript setting as follows:

var SDK = require('aws-sdk');
require('aws-sdk/lib/maintenance_mode_message').suppress = true;


Getting Started

How To Install

In the Browser

To use the SDK in the browser, simply add the following script tag to your HTML pages:

<script src="https://sdk.amazonaws.com/js/aws-sdk-2.1603.0.min.js"></script>

You can also build a custom browser SDK with your specified set of AWS services. This lets you reduce the SDK's size, specify different API versions of services, or use AWS services that don't currently support CORS if you are working in an environment that does not enforce CORS. To get started:

http://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/building-sdk-for-browsers.html

The AWS SDK is also compatible with browserify.

For browser-based web, mobile and hybrid apps, you can use AWS Amplify Library which extends the AWS SDK and provides an easier and declarative interface.

In Node.js

The preferred way to install the AWS SDK for Node.js is to use the npm package manager for Node.js. Simply type the following into a terminal window:

npm install aws-sdk

In React Native

To use the SDK in a React Native project, first install the SDK using npm:

npm install aws-sdk

Then within your application, you can reference the React Native-compatible version of the SDK with the following:

var AWS = require('aws-sdk/dist/aws-sdk-react-native');

Alternatively, you can use AWS Amplify Library which extends AWS SDK and provides React Native UI components and CLI support to work with AWS services.

Using Bower

You can also use Bower to install the SDK by typing the following into a terminal window:

bower install aws-sdk-js

Usage with TypeScript

The AWS SDK for JavaScript bundles TypeScript definition files for use in TypeScript projects and to support tools that can read .d.ts files. Our goal is to keep these TypeScript definition files updated with each release for any public API.

Pre-requisites

Before you can begin using these TypeScript definitions with your project, make sure your project meets the following requirements:

  • Use the latest version of TypeScript. We recommend 4.x or later.

  • Include the TypeScript definitions for Node.js. You can use npm to install them by typing the following into a terminal window:

    npm install --save-dev @types/node
  • If you are targeting ES5 or older ECMA standards, your tsconfig.json must include 'es5' and 'es2015.promise' under compilerOptions.lib. See tsconfig.json for an example.

In the Browser

To use the TypeScript definition files with the global AWS object in a front-end project, add the following line to the top of your JavaScript file:

/// <reference types="aws-sdk" />

This will provide support for the global AWS object.

In Node.js

To use the TypeScript definition files within a Node.js project, simply import aws-sdk as you normally would.

In a TypeScript file:

// import entire SDK
import AWS from 'aws-sdk';
// import AWS object without services
import AWS from 'aws-sdk/global';
// import individual service
import S3 from 'aws-sdk/clients/s3';

NOTE: You need to add "esModuleInterop": true to the compilerOptions of your tsconfig.json. If that is not possible, use import * as AWS from 'aws-sdk' instead.
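A minimal tsconfig.json consistent with the notes above might look like the following. The exact option set is illustrative; adjust target, lib, and types to your project (the "dom" entry, for example, only applies to browser builds):

```json
{
  "compilerOptions": {
    "target": "es5",
    "lib": ["es5", "es2015.promise", "dom"],
    "esModuleInterop": true,
    "types": ["node"]
  }
}
```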

In a JavaScript file:

// import entire SDK
var AWS = require('aws-sdk');
// import AWS object without services
var AWS = require('aws-sdk/global');
// import individual service
var S3 = require('aws-sdk/clients/s3');

With React

To create React applications with AWS SDK, you can use AWS Amplify Library which provides React components and CLI support to work with AWS services.

With Angular

Due to the SDK's reliance on Node.js typings, you may encounter compilation issues when using the typings provided by the SDK in an Angular project created using the Angular CLI.

To resolve these issues, either add "types": ["node"] to the project's tsconfig.app.json file, or remove the "types" field entirely.

AWS Amplify Library provides Angular components and CLI support to work with AWS services.

Known Limitations

There are a few known limitations with the bundled TypeScript definitions at this time:

  • Service client typings reflect the latest apiVersion, regardless of which apiVersion is specified when creating a client.
  • Service-bound parameters use the any type.

Getting Help

The best way to interact with our team is through GitHub. You can open an issue and choose from one of our templates for bug reports, feature requests or guidance. You may also find help on community resources such as Stack Overflow with the tag #aws-sdk-js. If you have a support plan with AWS Support, you can also create a new support case.

Please check our resources before opening an issue:

Please see SERVICES.md for a list of supported services.

Maintenance and support for SDK major versions

For information about maintenance and support for SDK major versions and their underlying dependencies, see the following in the AWS SDKs and Tools Shared Configuration and Credentials Reference Guide:

Contributing

We welcome community contributions and pull requests. See CONTRIBUTING.md for information on how to set up a development environment and submit code.

License

This SDK is distributed under the Apache License, Version 2.0; see LICENSE.txt and NOTICE.txt for more information.

aws-sdk-js's People

Contributors

adityamanohar, ajredniwja, alexforsyth, allanzhengyp, awssteveha, bkw, carljparker, chrisradek, comcalvi, dependabot[bot], eddy-aws, guymguym, islam-taha, jeskew, jpb, jstewmon, kellertk, kuhe, liujoycec, lsegal, maghis, mdurrant, mtdowling, rclark, rlovelett, siddsriv, srchase, trevorrowe, trivikr, workeitel


aws-sdk-js's Issues

Add Buffer and Stream support for S3

Would be nice to be able to pass a buffer or stream in addition to a base64-encoded string, since these are the typical interfaces for dealing with binary data in Node.js.

Naming issues

Hi,

in all Amazon documentation, the credentials are AWSAccessKeyID and AWSSecretKey. I think the Node API should use the same names, not accessKeyId and secretAccessKey.

Patrick

SimpleDB select - Cannot read property 'n' of undefined

I can successfully call listDomains, however this pruned select program ...

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./awscredentials.json');
console.log(AWS.VERSION);

var db = new AWS.SimpleDB();
var params = {SelectExpress: "SELECT * FROM ken"};
db.client.select(params, function(error,data) {
  if (error) {
    console.log(error);
  } else {
    console.log(data);
  }
});

yields ...

v0.9.5-pre.6
{ [TypeError: Cannot read property 'n' of undefined] statusCode: undefined, retryable: false }

copyObject not working when replacing metadata

When trying to use s3.client.copyObject with MetadataDirective: 'REPLACE' I don't see standard metadata headers such as CacheControl or ContentType being included in the request. Additionally the request fails with:

<Error><Code>NotImplemented</Code><Message>A header you provided implies functionality that is not implemented</Message><Header>Transfer-Encoding</Header><RequestId>E45E0F55BA68E46D</RequestId><HostId>...</HostId></Error>

Indicating that the PUT request is including a header I didn't specify causing it to be rejected with a status code of 501. Seems like a variation of: https://forums.aws.amazon.com/message.jspa?messageID=191792

Dynamodb: "Supplied AttributeValue is empty" error despite sending a seemingly correct request

The following is the params structure that I'm submitting to the updateItem call for DynamoDB. Am I making a formatting error in my request, or is something amiss with the client library?

{
  "TableName": "my_table_name",
  "Key": {
    "HashKeyElement": {
      "u:c": {
        "S": "5:15"
      }
    }
  },
  "AttributeUpdates": {
    "4": {
      "Value": {
        "S": "a string of text"
      },
      "Action": "ADD"
    }
  }
}
{
  "code": "ValidationException",
  "message": "Supplied AttributeValue is empty, must contain exactly one of the supported datatypes",
  "statusCode": 400,
  "retryable": false
}

I also submitted my question here: https://forums.aws.amazon.com/thread.jspa?threadID=114724&tstart=0

Support for CloudFormation and CloudFront

I'm starting some work on a node.js application that will need to integrate with AWS. All the services are supported except CloudFormation and CloudFront. Any idea when those would be supported?

Add timeout setting per API call

It should be possible to add timeout settings per API call. Please also document how to best handle timeouts globally for all AWS SDK API calls.
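In SDK v2 this eventually landed as the httpOptions configuration key (timeout and connectTimeout, in milliseconds), which can be set globally or per client. A per-client sketch, with the service call left as a comment since it needs credentials; the specific values are illustrative:

```javascript
// httpOptions.timeout is the socket inactivity timeout and
// httpOptions.connectTimeout bounds the initial TCP connect.
const clientOptions = {
  region: 'us-east-1',
  httpOptions: {
    connectTimeout: 1000, // fail fast if the endpoint is unreachable
    timeout: 5000         // abort a stalled response after 5s
  }
};

// const s3 = new AWS.S3(clientOptions);
// s3.getObject({ Bucket: 'my-bucket', Key: 'big-file' }, callback);

console.log(clientOptions.httpOptions.timeout); // 5000
```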

describeInstances with filter

I cannot get a valid return from describeInstances when I try to apply a Filter to it.

For example

var request = ec2.describeInstances();

request.params = {
    Filters: [{Name: "image-id", Values: ["ami-xxxxx"]}]
};

request.
    on('success', function(response) {
        console.log("Response", response.data);
    }).
    on('error', function(response) {
        console.log("err", response)
    }).
    on('complete', function() {
        console.log("Always!");
    }).send();

Output:

{ [UnknownParameter: The parameter Item is not recognized]
  message: 'The parameter Item is not recognized',
  code: 'UnknownParameter',
  name: 'UnknownParameter',
  statusCode: 400,
  retryable: false }

This was after upgrading to v0.9.2-pre.3

Error for missing credentials or region is not obvious

When a request is made and the configuration does not include credentials or region information, the request can fail signing with an extremely uninformative error message. The signing method should check that credentials/region data is set and issue a fail event if any of the values are missing.

Getting SerializationException

This is my code:

    AWS.config.update({
        region: "us-west-1",
        accessKeyId: module.config.aws.accessKeyId,
        secretAccessKey: module.config.aws.secretAccessKey,
        s3ForcePathStyle: true
    });

    var dynamo = new AWS.DynamoDB.Client();

    dynamo.getItem({
        TableName: "table-name",
        Key: {
            HashKeyElement: "test-item"
        }
    }, function(err, data) {

When I run this I get:

{ code: 'SerializationException',
  message: 'Expected null',
  statusCode: 400,
  retryable: false }

What am I doing wrong?

Is there working examples somewhere on how to use the dynamo db API?

Expose original error object in promises (or callbacks)

The current error structure in the promises implementation only supports a very simple message, so in situations where an error is thrown from the Node.js stack, for example, the error is converted to a string and all stack information is lost:

callbacks.onError({ code: 'NetworkingError', message: e.toString() });

It would be nice if the original error object could be added to the error structure so that it can be inspected, etc.

Edit: if node.js-style callbacks are going to be used (fingers crossed), then there shouldn't be any need for this error structure at all - just use/expose JS errors/exceptions.

SQS: listQueues only lists one queue

Here's some sample code (same issue if QueueNamePrefix is specified)

var AWS = require('aws-sdk');
AWS.config.loadFromPath("./credentials.json");
var sqs = new AWS.SQS();

sqs.client.listQueues({ } , function(error, data) {
console.log(JSON.stringify(error));
console.log(JSON.stringify(data));
});

Signature error SimpleDB Select

When trying to do a SimpleDB Select query I'm getting a "SignatureDoesNotMatch" error.


var svc = new AWS.SimpleDB();

var params = {
    SelectExpression : "SELECT * FROM mysimpledbdomain" 
};

svc.client.select(params,function(err,data){    
    if(err)
    {       
        console.log(err);
    }else{      
        console.log(data);      
    }
});

Console output

{ [SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Co
nsult the service documentation for details.]
  message: 'The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the serv
ice documentation for details.',
  code: 'SignatureDoesNotMatch',
  name: 'SignatureDoesNotMatch',
  statusCode: 403,
  retryable: false }

NotImplemented - A header you provided implies functionality that is not implemented

I am consistently receiving a NotImplemented error when trying to write a simple text file to S3 using the SDK.

var AWS = require( 'aws-sdk' );

AWS.config.update( { 
  accessKeyId : 'ACCESS_KEY_ID', 
  secretAccessKey : 'SECRET_ACCESS_KEY' 
});

var s3 = new AWS.S3({ endpoint : 'https://s3.amazonaws.com' });

s3.client.putObject({
  ACL : 'private',
  Body : 'some string',
  Bucket : 'my-bucket',
  Key : 'some/path/here/stuff.txt',
  ServerSideEncryption : 'AES256'
}, function ( error, response ) {
 // error listed below
});

results in the following error:

{
  "code":"NotImplemented",
  "message":"A header you provided implies functionality that is not implemented",
  "statusCode":501,
  "retryable":true
}

which in itself seems weird, considering it is NotImplemented but marked retryable.

Doesn't support CJK characters

reason:

The uriEscape function in util.js uses escape, which cannot handle non-ASCII characters; it should use encodeURIComponent instead.
http://stackoverflow.com/questions/75980/best-practice-escape-or-encodeuri-encodeuricomponent
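The difference is easy to demonstrate: escape emits non-standard %uXXXX sequences for non-ASCII code points, while encodeURIComponent percent-encodes the UTF-8 bytes, which is what S3 expects. The uriEscapePath helper below is hypothetical, just sketching the shape of the fix:

```javascript
// encodeURIComponent percent-encodes the UTF-8 bytes of each character
// (note: it also escapes '/', so a path must be split on '/' first).
console.log(encodeURIComponent('西安')); // %E8%A5%BF%E5%AE%89

// The deprecated escape() emits %uXXXX, which is not valid URI encoding.
console.log(escape('西安')); // %u897F%u5B89

// Hypothetical helper along the lines of the fix:
function uriEscapePath(path) {
  return path.split('/').map(encodeURIComponent).join('/');
}
console.log(uriEscapePath('2011_09_18_西安/1.jpg'));
```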

repro s3 structure:

Bucket:
photo

Objects:
2011_09_18_西安/1.jpg
2011_09_18_西安/2.jpg

repro code:

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');
AWS.config.update({region: 'us-east-1'});
var s3 = new AWS.S3();
s3.client.listObjects({
  'Bucket': 'photo',
  'Delimiter': '/',
  'Prefix': '2011_09_18_西安/'
}).done(function(response){
  console.log(response.data);
}).send();

S3.client.deleteObject throwing circular JSON structure error

I'm running into a problem with deleteObject and deleteObjects, both of which appear to be creating request objects that cannot be serialized by JSON.stringify.

This is a Meteor project. putObject works great and I'm not seeing any other problems with the SDK.

The following code...

    console.log
      Bucket: params.aws.s3.bucketName
      Key: asset.s3.key
    s3.client.deleteObject
      Bucket: params.aws.s3.bucketName
      Key: asset.s3.key
    , callback

... throws this error

{ Bucket: 'assets.christopheresplin.com',
  Key: 'meteor/blog/assets/1356939715328_File0037.jpg' }
TypeError: Converting circular structure to JSON
    at Object.stringify (native)
    at _.extend.send (app/packages/livedata/livedata_server.js:131:29)
    at _.extend.protocol_handlers.method (app/packages/livedata/livedata_server.js:324:12)
    at processNext (app/packages/livedata/livedata_server.js:188:43)
Exited with code: 1

examples for s3 not working

http://docs.amazonwebservices.com/nodejs/latest/dg/nodejs-dg-examples.html

As with the other DynamoDB example, resp just needs to be removed here:

var s3 = new AWS.S3();
s3.client.listBuckets(function(err, data) {
  for (var index in resp.data.Buckets) {
    var bucket = resp.data.Buckets[index];
    console.log("Bucket: ", bucket.Name, ' : ', bucket.CreationDate);
  }
});

I had to hunt through this repo to see that putObject takes (params, callback).

var s3 = new AWS.S3();
s3.client.createBucket({Bucket: 'myBucket'}, function(err, data) {
  var data = {Bucket: 'myBucket', Key: 'myKey', Body: 'Hello!'};
  s3.client.putObject(data).done(function(resp) {
    console.log("Successfully uploaded data to myBucket/myKey");
  });
});

so it should be

var s3 = new AWS.S3();
s3.client.createBucket({Bucket: 'myBucket'}, function(err, data) {
  var data = {Bucket: 'myBucket', Key: 'myKey', Body: 'Hello!'};
  s3.client.putObject(data, function(err,data) {
    if(err) return console.log(err);
    console.log("Successfully uploaded data to myBucket/myKey");
  });
});

How to retrieve the Metadata from getObject callback data?

I am trying to upload/download an audio chunk file to/from S3 using the AWS Node.js SDK. I have tried the base64 approach and it works fine, but I am not able to get the Metadata back which I bundled as part of the upload params.

Below is the code snippet for upload along with meta info:

var myMetaInfo = "AdditionalInfo", dataToUpload = {Bucket: bucketName, Key:storageFolderFullPath, 
    Body: myAudioFile.toString('base64'), Metadata: {metaInfo: myMetaInfo}};
s3.client.putObject(dataToUpload, function(err, data) {
    if (!err) {
        console.log("Successfully uploaded the file to ::" + dataToUpload.Bucket);            
    } else {
        console.log(" **** ERROR while uploading ::"+err);            
    }        
}); 

And this is the snippet for downloading the file. Metadata is not part of the callback data. I tried printing the callback data to the console and noticed that only the following params are available: LastModified, ContentType, ContentLength, ETag, Body, RequestId.

var dataToDownload = {Bucket: bucketName, Key: storageFolderFullPath}, originalFile, myMetaInfo;
s3.client.getObject(dataToDownload, function(err, data) {
    if (!err) {            
        originalFile = new Buffer(data.Body, 'base64');
        myMetaInfo = data.Metadata.metaInfo; // Metadata is coming as undefined
        console.log(" Meta info:: " + myMetaInfo);
        fs.writeFile(fileStoragePath, originalFile, function(err) {
            if (!err) {
                console.log(" File written!! ");
            } else {
                console.log(" Error while writing the file !!" + err);
            }
        });
    } else {
        console.log(" **** ERROR while downloading ::"+err);            
    }
});

Is this a bug or something wrong with my implementation?
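One likely cause (an assumption, not confirmed in this thread): HTTP header names are case-insensitive, and the SDK exposes x-amz-meta-* keys lowercased, so metadata uploaded as metaInfo typically comes back under metainfo. A defensive, case-insensitive lookup (hypothetical helper, plain JS):

```javascript
// getMeta does a case-insensitive lookup in a getObject Metadata map.
function getMeta(metadata, key) {
  if (!metadata) return undefined;
  const wanted = key.toLowerCase();
  const found = Object.keys(metadata).find(k => k.toLowerCase() === wanted);
  return found === undefined ? undefined : metadata[found];
}

// Simulated callback data: metadata keys arrive lowercased on the wire.
const data = { Metadata: { metainfo: 'AdditionalInfo' } };
console.log(getMeta(data.Metadata, 'metaInfo')); // AdditionalInfo
```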

SQS: deleteMessageBatch doesn't work

sqs.client.receiveMessage({
QueueUrl: "https://sqs.us-east-1.amazonaws.com/863688628359/Build-Output",
MaxNumberOfMessages: 10
}, function (err, data) {
console.log('receive message: ' + JSON.stringify(err));
console.log('receive message: ' + JSON.stringify(data));

sqs.client.deleteMessageBatch({
    QueueUrl: "https://sqs.us-east-1.amazonaws.com/863688628359/Build-Output",
    DeleteMessageBatchRequestEntry: [ { Id: "foo", ReceiptHandle: data.Messages[0].ReceiptHandle } ]
}, function (err, data) {
    console.log('deleteMessageBatch ' + JSON.stringify(err));
    console.log('deleteMessageBatch ' + JSON.stringify(data));
});

Two bugs:

  1. The doc makes no mention of DeleteMessageBatchRequestEntry
  2. It doesn't work. All I get is {"retryable":false} as the error.

AWS.util.arrayEach uses for...in which loops through Array.prototype extensions

This causes issues for any extensions to Array.prototype; a plain for loop should be used to iterate the array instead.

has no method 'toLowerCase'
at SigVS3.AWS.SigVS3.inherit.canonicalizedAmzHeaders (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/sigvs3.js:108:23)
at SigVS3.arrayEach (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/util.js:153:30)
at SigVS3.canonicalizedAmzHeaders (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/sigvs3.js:107:24)
at SigVS3.stringToSign (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/sigvs3.js:85:24)
at SigVS3.addAuthorization (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/sigvs3.js:67:65)
at HttpRequest.sign (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/client.js:85:14)
at RequestHandler.makeRequest (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/http.js:166:17)
at AWSRequest.send (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/promise.js:110:34)
at S3Client.makeRequest (/Users/corysmith/Apps/beaucoo-server/node_modules/aws-sdk/lib/client.js:47:15)
at S3Client.AWS.util.update.defineMethods.AWS.util.each.svc.(anonymous function) as copyObject
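The pitfall is easy to reproduce without the SDK: for...in walks all enumerable properties, including anything added to Array.prototype, while an index-based loop only visits the elements. The header names below are just sample data:

```javascript
// Simulate a library/app that extends Array.prototype (the situation
// that broke AWS.util.arrayEach).
Array.prototype.custom = function () {};

const headers = ['x-amz-date', 'x-amz-acl'];

const forInKeys = [];
for (const k in headers) forInKeys.push(k); // picks up 'custom' too
console.log(forInKeys); // [ '0', '1', 'custom' ]

const forLoopValues = [];
for (let i = 0; i < headers.length; i++) forLoopValues.push(headers[i]);
console.log(forLoopValues); // [ 'x-amz-date', 'x-amz-acl' ]

delete Array.prototype.custom; // clean up the shared prototype
```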

NetworkingError: getaddrinfo ENOTFOUND

Hi!
I try to putObject to my s3 bucket

my accessKeyId & secretAccessKey are right

but I keep getting this error

{ [NetworkingError: getaddrinfo ENOTFOUND]
code: 'NetworkingError',
errno: 'ENOTFOUND',
syscall: 'getaddrinfo',
retryable: true,
name: 'NetworkingError',
statusCode: undefined }

and I have no idea how to solve it

here's my code, simply try to putObject

 async.waterfall([
    ...
    //other functions

    function readFile(sourceFile, callback){
        fs.readFile(sourceFile, function (err, data) {
            if (err) return callback(err);
            callback(null, data);
        });
    },

    function uploadToS3(data, callback){
        var s3 = new AWS.S3();
        s3.client.putObject({
                Bucket: 'my-bucket',
                Key: 'thumb/' + img_name,
                Body: data
            }, function (err, data) {
                if(err) return callback(err);
                console.log('Successfully uploaded file.');
                callback();
            })
    }

thanks!

runInstances returns just one instance on callback

from:
https://forums.aws.amazon.com/thread.jspa?threadID=112023&tstart=0

Hi,

This code correctly creates a few EC2 instances, but returns only one instance (in the variable "data"), although I can see there are more instances in the "response" variable and in my AWS console.

AWS = require('aws-sdk')
util = require('util')

AWS.config.loadFromPath 'aws_credentials.json'

ec2Params =
  ImageId: "ami-xxxxx"
  MinCount: "2"
  MaxCount: "2"
  InstanceType: "t1.micro"
  InstanceInitiatedShutdownBehavior: "terminate"

svc = new AWS.EC2()
svc.client.runInstances ec2Params, (err, data) ->
  if err
    console.log "runInstances - ERROR:", err
    return

  console.log util.inspect(data, false, 6, false)

.done (response) ->

  console.log util.inspect(response, false, 6, true)

Is that a bug?

headObject does not pass custom metadata to callback

I have an object stored on S3 that includes Cache-Control metadata. However when requesting the object via s3.client.headObject that metadata is not passed to the callback. While debugging I see in http.js that the raw httpResponse.headers includes the metadata but this is not relayed to the callback.

ec2.client.describeInstances with filters errors

I am trying to execute the following:

ec2.client.describeInstances({ Filters: [ { Name: "vpc-id", Value: [ vpcId ] } ] }, function(error, result) {
  console.log( error );
  console.log( result );
});

It always gives this error:

{ [TypeError: Cannot read property 'n' of undefined] statusCode: undefined, retryable: false }

Am I doing something wrong or is it a bug?

Problems downloading a binary file using getObject

Quoting from https://forums.aws.amazon.com/thread.jspa?messageID=416336&#416336

I'm having a problem downloading a file using the Node.js aws-sdk. The act of getting the file is successful, and the callback returns the correct size of the file. But when saved, the content appears to be invalid. I tried using buffers to save it in utf8, binary, base64, and still nothing.

var svc = new AWS.S3();
var file = "some_binary_file";
svc.client.getObject({Bucket:"myBucket", Key:file}, function(err, data)
{
    if(err == null)
    {
        var buff = new Buffer(data.Body, "binary"); //i've tried everything.
        var fd = fs.openSync("some_local_binary_file", "w");
        fs.writeSync(fd, buff, 0, buff.length,0);
    }

});

StartTime parameter off by 1 month in AWS.EC2.describeSnapshots

All EBS snapshot are listed as starting 1 month after they actually happened. So an EBS snapshot with a "Started" value of "2013-01-06 14:00 GMT" will be listed as "StartTime: Wed Feb 06 2013 14:00:27 GMT+0000 (GMT)". This is probably because JavaScript months are 0-based, not 1-based as the sdk code expects.

I don't know if this behavior is common to the entire sdk or just the snapshots methods.
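The 0-based month convention is easy to trip over: a parser that feeds a 1-based month field straight into the Date constructor lands one month late, exactly the symptom described. A small demonstration with a hypothetical parseYmdUtc helper:

```javascript
// JavaScript Date months are 0-based: 0 = January, 11 = December.
const jan = new Date(Date.UTC(2013, 0, 6, 14, 0, 27));
console.log(jan.getUTCMonth()); // 0 → January

// A buggy parser passing a 1-based month straight through:
const offByOne = new Date(Date.UTC(2013, 1, 6, 14, 0, 27));
console.log(offByOne.getUTCMonth()); // 1 → February

// The fix when parsing "2013-01-06": subtract 1 from the month field.
function parseYmdUtc(ymd) {
  const [y, m, d] = ymd.split('-').map(Number);
  return new Date(Date.UTC(y, m - 1, d));
}
console.log(parseYmdUtc('2013-01-06').getUTCMonth()); // 0 → January
```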

S3.putBucketWebsite() doesn't support RoutingRules

S3.getBucketWebsite() returns any existing routing rules (although this is undocumented); however, if I pass routing rules to S3.putBucketWebsite(), they aren't passed along in the HTTP request, so any existing rules are wiped out.

DynamoDB error

When using DynamoDB, I frequently get my callback called with both the error and the data set. Is this normal?

The error is always some CRC32 check error. I initially thought this was a problem on the server, but it has recurred for the past 3 days and I am beginning to suspect the AWS API.

error:{"message":"CRC32 integrity check failed","code":"CRC32CheckFailed","retryable":true,"name":"CRC32CheckFailed","statusCode":200}
data:{"Table":{"CreationDateTime":1357793634.803,"ItemCount":0,"KeySchema":{"HashKeyElement":{"AttributeName":"token","AttributeType":"S"}},"ProvisionedThroughput":{"NumberOfDecreasesToday":0,"ReadCapacityUnits":5,"WriteCapacityUnits":5},"TableName":"tokens","TableSizeBytes":0,"TableStatus":"ACTIVE"}}

Proxy support

Allow the API to be used with HTTP and Socks Proxies.

putItem returns SerializationException when string contains accented UTF8 character

The following code should work according to the documentation found here, but returns this response:

{ code: 'SerializationException',
  message: null,
  statusCode: 400,
  retryable: false }

The code that generated the error was a simple case of attempting to put an item containing the string "é". Replacing "é" with "e" in the following code results in { ConsumedCapacityUnits: 1 }:

var AWS = require('aws-sdk');
var credentials = require([removed]);

AWS.config.update({
        accessKeyId:credentials.aws.primary.id,
        secretAccessKey:credentials.aws.primary.key
});

AWS.config.update({
        region: 'us-east-1'
});

var db = new AWS.DynamoDB();

db.client.putItem({
        TableName: "tableName",
        Item: {
                "id":{"N":"1234"},
                "name":{"S":"é"}
        }
}, function(err, data){
        if(err) return console.log(err);
        console.log(data);
});

Couldn't install on Windows

When I try to install the SDK on Windows and run "npm install aws-sdk -g", I get the following error:

npm http GET https://registry.npmjs.org/aws-sdk
npm http 304 https://registry.npmjs.org/aws-sdk
npm http GET https://registry.npmjs.org/xml2js
npm http GET https://registry.npmjs.org/xmlbuilder
npm http GET https://registry.npmjs.org/libxmljs
npm http 304 https://registry.npmjs.org/xml2js
npm http 304 https://registry.npmjs.org/libxmljs
npm http 304 https://registry.npmjs.org/xmlbuilder
npm http GET https://registry.npmjs.org/sax
npm http GET https://registry.npmjs.org/bindings/1.0.0
npm http 304 https://registry.npmjs.org/sax
npm http 304 https://registry.npmjs.org/bindings/1.0.0

> [email protected] install C:\Users\vlamic\AppData\Roaming\npm\node_modules\aws-sd
k\node_modules\libxmljs
> node-gyp rebuild


C:\Users\vlamic\AppData\Roaming\npm\node_modules\aws-sdk\node_modules\libxmljs>n
node "C:\Program Files\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild
'xml2-config' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\gyp", line 18, in <module>
    sys.exit(gyp.main(sys.argv[1:]))
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\__init__.py", line 511, in main
    return gyp_main(args)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\__init__.py", line 494, in gyp_main
    options.circular_check)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\__init__.py", line 133, in Load
    depth, generator_input_info, check, circular_check)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 2378, in Load
    depth, check)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 384, in LoadTargetBuildFile
    build_file_data, PHASE_EARLY, variables, build_file_path)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 1053, in ProcessVariablesAndConditionsInDict
    build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 1068, in ProcessVariablesAndConditionsInList
    ProcessVariablesAndConditionsInDict(item, phase, variables, build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 1027, in ProcessVariablesAndConditionsInDict
    ProcessConditionsInDict(the_dict, phase, variables, build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 904, in ProcessConditionsInDict
    variables, build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 1053, in ProcessVariablesAndConditionsInDict
    build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 1072, in ProcessVariablesAndConditionsInList
    expanded = ExpandVariables(item, phase, variables, build_file)
  File "C:\Users\vlamic\.node-gyp\0.8.16\tools\gyp\pylib\gyp\input.py", line 714, in ExpandVariables
    (contents, p.returncode))
Exception: Call to 'xml2-config --cflags' returned exit status 1. while trying to load binding.gyp
gyp ERR! configure error
gyp ERR! stack Error: `gyp` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onCpExit (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\configure.js:395:16)
gyp ERR! stack     at ChildProcess.EventEmitter.emit (events.js:99:17)
gyp ERR! stack     at Process._handle.onexit (child_process.js:678:10)
gyp ERR! System Windows_NT 6.1.7601
gyp ERR! command "node" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild"
gyp ERR! cwd C:\Users\vlamic\AppData\Roaming\npm\node_modules\aws-sdk\node_modules\libxmljs
gyp ERR! node -v v0.8.16
gyp ERR! node-gyp -v v0.7.3
gyp ERR! not ok
npm WARN optional dep failed, continuing [email protected]
[email protected] C:\Users\vlamic\AppData\Roaming\npm\node_modules\aws-sdk
├── [email protected]
└── [email protected] ([email protected])

S3 API - Add/Expose Content-MD5 in the modeling

The Content-MD5 header is not exposed in the AWS-SDK API for S3. Please expose this functionality. Thanks.

Referenced forum discussion:
https://forums.aws.amazon.com/thread.jspa?messageID=416658&#416658

Original question from forum:
Hello AWS community,

What is the best way to verify that my large file being pushed to S3 via AWS-SDK in Node.js is not corrupted on the network in transit? I've implemented logic calling the various commands (abort, create, and complete MultipartUpload, as well as UploadPart), but none of these seem to have an MD5 or hash verification input parameter.

Thanks,
Jeff

example on dynamo not working

Trying to get the example running for dynamo:
    db = new AWS.DynamoDB()
    # db = new AWS.DynamoDB()
    db.client.listTables (err, data) ->
      console.log resp.data.TableNames

I'm getting a "TypeError: Not a string or buffer".

Do you have a working example of a Dynamo Call you can post?
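One problem in the snippet above: the callback receives `(err, data)` directly, but the body logs `resp.data.TableNames`, where `resp` is undefined. A sketch of the expected callback shape, using a stub in place of the real call (`stubListTables` is hypothetical and stands in for `db.client.listTables`, which needs live credentials):

```javascript
// Stub that simulates a successful ListTables response with the
// same (err, data) callback shape the SDK uses.
function stubListTables(callback) {
  callback(null, { TableNames: ['users', 'sessions'] });
}

stubListTables(function (err, data) {
  if (err) throw err;
  console.log(data.TableNames); // use `data`, not `resp.data`
});
```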

createTag error: UnknownParameter: The parameter Item is not recognized

So here's my code:

    svc.client.createTags({
        Resources: [InstanceId],
        Tags: [{
            Key: "Name",
            Value: "sometext"
        }]
    }, function(err, data) {
        console.log("---------> In Callback");
        if (err) {
            console.log("Tagging Instance Error: " + err);
        } else {
            console.log(data);
        }
    });

Here's the error if you don't use the callback function and just call send with some event listeners on the request object.

{ [UnknownParameter: The parameter Item is not recognized]
  message: 'The parameter Item is not recognized',
  code: 'UnknownParameter',
  name: 'UnknownParameter',
  statusCode: 400,
  retryable: false }

I've tried changing just about every element to verify the parameters.

TypeError using createTags

Hi,
I'm using node 0.8.7 with your latest aws-sdk from npm (~0.9.1-pre.2),
and I'm getting the following error while trying to tag an instance (I'm able to launch it a few lines earlier).

DEBUG: TypeError: Cannot read property 't' of undefined
    at QueryParamSerializer.serializeMember (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:178:14)
    at Object.AWS.QueryParamSerializer.inherit.serializeList (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:173:12)
    at Object.arrayEach (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/util.js:151:30)
    at QueryParamSerializer.serializeList (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:169:14)
    at QueryParamSerializer.serializeMember (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:181:12)
    at Object.AWS.QueryParamSerializer.inherit.serializeStructure (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:146:12)
    at Object.each (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/util.js:142:32)
    at QueryParamSerializer.serializeStructure (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:143:14)
    at QueryParamSerializer.serialize (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:138:10)
    at EC2Client.buildRequest (/Users/simone/Repos/panda-deploy/node_modules/aws-sdk/lib/query_client.js:46:13)

Here is my code:
I'm using this reference http://docs.amazonwebservices.com/AWSJavaScriptSDK/latest/AWS/EC2/Client.html#createTags-property

    client.createTags({
        Resources: [instance_id],
        Tags: [{
            Key: 'name',
            Value: 'PIPPO'
        }, {
            Key: 'asd',
            Value: 'ALDO'
        }]
    }, function (err) {
        if (err) {
            throw err;
        }
    .....

I can't figure out what's wrong...
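For context: EC2 Query APIs flatten a list of structures into indexed parameters, and the stack trace above shows the query serializer failing while walking the `Tags` list. A hand-rolled sketch of that flattening (the parameter names below are illustrative, not the SDK's internals):

```javascript
// Flatten [{Key, Value}, ...] into indexed query parameters,
// the shape an EC2 Query request is serialized into.
function serializeTags(tags) {
  var params = {};
  tags.forEach(function (tag, index) {
    var prefix = 'Tag.' + (index + 1) + '.';
    params[prefix + 'Key'] = tag.Key;
    params[prefix + 'Value'] = tag.Value;
  });
  return params;
}
```

The reported `Cannot read property 't' of undefined` suggests the serializer's shape model for a list member is missing at the time of serialization, not that the input parameters are wrong.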

s3.client.deleteObjects generates invalid request

    var AWS = require('aws-sdk');

    AWS.config.loadFromPath("./credentials.json");

    var s3 = new AWS.S3();
    s3.client.deleteObjects({
        Bucket: 'mymug',
        Delete: {
            Objects: [
                { Key: '0bbb8b9b-cd75-4151-bc40-7a18d90d71eb' }
            ]
        }
    }, function(err, data) {
        console.log(JSON.stringify(err));
        console.log(JSON.stringify(data));
    });

The code above generates the following error:
{"message":"Missing required header for this request: Content-MD5","code":"InvalidRequest","name":"InvalidRequest","statusCode":400,"retryable":false}
null

Support for in-lined and eventemitter style callbacks

While the promise- and event-based results are nice, it would be nicer still if the implementation used the standard interfaces in Node...

namely...

someRequest.on('done', function(...){})

and

.someRequest(options,function(err, ...) {
  ...
});

There are plenty of utilities and libraries in place that depend on a function(err, ...) callback as the last parameter, such as async. The callback is expected to take an error as its first parameter, with the remaining parameters passed on as results.

With utilities like async.waterfall, this makes client request chaining possible with single-point error handling and less deeply nested structures.

Also, if you are going to have event-handling functions, the EventEmitter structure is preferred. I know that jQuery/jQueryUI-style callbacks are nice and work in that environment, but the Node.js community has been moving toward the alternatives suggested above. Implementing all three options could work: the callback could internally be bound as an emitted event, as could the function assignments for event handlers.

Otherwise, it's great to see the effort in releasing these tools, and I hope to see more from this community.

Malformed Range request header produced by SDK

GET and HEAD requests for Objects are creating a malformed Range header, which looks like:

    undefined: bytes=1-10

It appears to be due to missing values for the following properties:

    AWS.S3.Client.prototype.api.operations.getObject.i.m.Range.n
    AWS.S3.Client.prototype.api.operations.headObject.i.m.Range.n
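A minimal reproduction of the symptom: when the header name in the operation model is undefined, plain string concatenation stringifies it, producing the literal `undefined:` prefix seen above.

```javascript
// The model's header-name field for the Range member (missing in
// the shipped API, hence undefined here).
var headerName;
var headerValue = 'bytes=1-10';

// String concatenation turns undefined into the text "undefined".
var headerLine = headerName + ': ' + headerValue;
```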

request.complete() never called if error callback throws

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./credentials.json');

var s3 = new AWS.S3();
var request = s3.client.createBucket({Bucket: 'myBucket'});

request.
  on('success', function(response) {
    console.log("Success!");
  }).
  on('error', function(response) {
    console.log("Error!");
    throw new Error('exception');
  }).
  on('complete', function(response) {
    console.log("Always!");
  }).
  send();

In the code above, the error handler gets called. The complete handler is not executed. If you remove the throw, the complete handler is called.

Is this intentional?
