
angular-http-batcher's Introduction

The Angular 2+ version of this project can be found here https://github.com/jonsamwell/ngx-http-batcher

Angular Http Batcher - enabling transparent HTTP batch requests with AngularJS

The biggest performance boost you will get with modern single-page apps comes from reducing the number of HTTP requests you send. This module has been designed to batch HTTP requests to the same endpoint following the HTTP 1.1 batch spec, and since the 1.11.0 update it can serialise to any number of batch formats (the Facebook batch protocol is planned). All you need to do is configure the batch endpoint with the library and the rest is taken care of!

Getting Started

Install the module via bower or download the latest distribution from github.

bower install angular-http-batcher --save-dev 

Include the JavaScript file in your HTML.

<script src="bower_components/angular-http-batcher/dist/angular-http-batch.min.js"></script>

Add the module as one of your application's dependencies.

angular.module('myApp', ['jcs.angular-http-batch']);

This module aims to be as transparent as possible. I didn't want to add specific methods to send batch requests manually (although this feature is in the pipeline) as I think this should happen transparently for the developer, so you are not tying your application to a specific implementation. So in order for the library to recognise batchable HTTP requests you need to register an endpoint that can accept an HTTP 1.1 batch request.

angular.module('myApp', ['jcs.angular-http-batch'])
   .config([
      'httpBatchConfigProvider',
          function (httpBatchConfigProvider) {
             httpBatchConfigProvider.setAllowedBatchEndpoint(
                     // root endpoint url
                     'http://api.myapp.com',
                     
                     // endpoint batch address
                     'http://api.myapp.com/batch',
                     
                     // optional configuration parameters
                     {
                     	maxBatchedRequestPerCall: 20
                     });
         }
]);

The root endpoint url is simply the base address of your api, and the endpoint batch address is the url of the method that can accept the batch request (usually just /batch or /$batch). You are able to pass some optional configuration parameters to this call in the third argument (see below).

The setAllowedBatchEndpoint call accepts some options as its third parameter, which are explained below.

{
	maxBatchedRequestPerCall: 10,
	minimumBatchSize: 2,
	batchRequestCollectionDelay: 100,
	ignoredVerbs: ['head'],
    sendCookies: false,
    enabled: true,
    adapter: 'httpBatchAdapter' // defaults to this value; a Node.js multifetch format is also supported
}

#### adapter

The key of the adapter to use to serialise/deserialise batch requests. Defaults to the HTTP 1.1 adapter, 'httpBatchAdapter'.

Current adapters are:

  1. 'httpBatchAdapter': supports the HTTP 1.1 spec and is used by .NET (Web API) and Java servers.
  2. 'nodeJsMultiFetchAdapter': supports batching GET requests to a node server that uses the multifetch library.

Coming soon:

  1. 'facebookAdapter': will support the facebook batching protocol.

Please request adapters that are not present.

Adapters convert http requests into a single batch request and parse the batch response. They consist of two methods defined below.

The adapter parameter can also be an object with the two functions below if you need more control over the way requests and responses are handled.

   /**
    * Builds the single batch request from the given batch of pending requests.
    * Returns a standard angular httpConfig object that will be used to invoke the $http service.
    * See:
    * https://developers.google.com/storage/docs/json_api/v1/how-tos/batch
    * http://blogs.msdn.com/b/webdev/archive/2013/11/01/introducing-batch-support-in-web-api-and-web-api-odata.aspx
    *
    * @param requests - the collection of pending http requests to build into a single http batch request.
    * @param config - the http batch config.
    * @returns {object} - a http config object.
    */
   function buildRequestFn(requests, config) {
     var httpConfig = {
         method: 'POST',
         url: config.batchEndpointUrl,
         cache: false,
         headers: config.batchRequestHeaders || {}
       };

     // do processing...

     return httpConfig;
   }

   /**
    * Parses the raw response into an array of HttpBatchResponseData objects. It is this
    * method's job to parse the response and match each part up with its original request object.
    * @param requests - the original pending requests that made up the batch.
    * @param rawResponse
    * @param config
    * @returns {Array.HttpBatchResponseData[]}
    */
   function parseResponseFn(requests, rawResponse, config) {
     var batchResponses = []; // array of HttpBatchResponseData

     //do processing..

     return batchResponses;
   }
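Put together, a custom adapter object could look like the sketch below (illustrative only: the newline-delimited body format and the index-based response matching are made-up assumptions, not a real batch protocol):

```javascript
// Sketch of a custom adapter (illustrative only): buildRequest turns the
// pending requests into a single POST, parseResponse matches response
// parts back to their requests by position.
var myAdapter = {
  buildRequest: function (requests, config) {
    return {
      method: 'POST',
      url: config.batchEndpointUrl,
      cache: false,
      headers: config.batchRequestHeaders || {},
      // Hypothetical body format: one "<METHOD> <URL>" line per request.
      data: requests.map(function (r) {
        return r.method + ' ' + r.url;
      }).join('\n')
    };
  },
  parseResponse: function (requests, rawResponse, config) {
    // Assumes the server replies with one part per request, in order.
    return rawResponse.data.map(function (part, i) {
      return { request: requests[i], statusCode: part.status, data: part.body };
    });
  }
};
```

The object would then be passed as the adapter option of setAllowedBatchEndpoint in place of an adapter key string.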

#### maxBatchedRequestPerCall

The maximum number of single HTTP requests that are allowed to be sent in one HTTP batch request. If this limit is reached the call will be split up into multiple batch requests. This option defaults to 10 requests per batch, but it is probably worth playing around with this number to find the optimal trade-off between total request size and response speed.

#### minimumBatchSize

The smallest number of individual calls allowed in a batch request. This has a default value of 2, as the overhead of sending a single HTTP request wrapped up in a batch request would outweigh the efficiency gain on the server. Therefore if only one request is in the batch, that request is allowed to continue down the normal $http pipeline.

#### ignoredVerbs

This is a string array of the HTTP verbs that are not allowed to form part of a batch request. By default HEAD requests will not be batched. If, for instance, you did not want to batch HEAD and DELETE calls you would pass in this array as an option: ['head', 'delete'].

#### enabled

True by default. If this is set to false the batcher will ignore all requests and they will be sent as normal single HTTP requests.

#### canBatchRequest

An optional function which determines if a request can be batched - if present this overrides the default mechanism used by the library. It takes the url and HTTP method of a pending request and returns true if the request can be batched, otherwise false.

For example:

    function(url, method) {
      return url.indexOf('api') > -1 && method.toLowerCase() === 'get';
    }

#### batchRequestHeaders

An optional object of header keys and values that will be added to the batch request's headers before sending to the server. For instance, Java servlet <= 3.1 parses multipart requests looking for the Content-Disposition header, expecting all multipart requests to include form data:

{ batchRequestHeaders: {'Content-disposition': 'form-data'} }

See notes on running this with java servlet <= 3.1

#### batchPartRequestHeaders

An optional object of header keys and values that will be added to the headers of each request part of a batch before sending to the server. For instance, Java servlet <= 3.1 parses multipart requests looking for the Content-Disposition header, expecting all multipart requests to include form data:

{ batchPartRequestHeaders: {'Content-disposition': 'form-data'} }

See notes on running this with java servlet <= 3.1

#### uniqueRequestName

An optional parameter to set a unique parameter name on the Content-Disposition header. This requires batchPartRequestHeaders to include a Content-Disposition header. Sample configuration:

  {
    ...
    batchPartRequestHeaders: {'Content-Disposition': 'form-data' },
    uniqueRequestName: "batchRequest"
    ...
  }

Some backend servers may require that each part be named in this manner. If the configuration above is used, then each part will have a header like this: Content-Disposition: form-data; name=batchRequest0

If a Content-Disposition header is not added in the batchPartRequestHeaders then this parameter is silently ignored.
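The per-part header described above can be sketched in isolation (illustrative only, not the library's internal code; the names below are made up for the example):

```javascript
// Illustrative: deriving per-part Content-Disposition values from the
// uniqueRequestName option ('batchRequest' here) and the part index,
// yielding name=batchRequest0, name=batchRequest1, ...
var uniqueRequestName = 'batchRequest';
var baseDisposition = 'form-data';
var partHeaders = [0, 1, 2].map(function (i) {
  return 'Content-Disposition: ' + baseDisposition + '; name=' + uniqueRequestName + i;
});
// partHeaders[0] === 'Content-Disposition: form-data; name=batchRequest0'
```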

#### sendCookies

False by default to reduce request size. If this is set to true, cookies available on the document.cookie property will be set in each segment of a batch request. Note that only non-HTTPOnly cookies will be sent, as HTTPOnly cookies cannot be accessed by JavaScript due to security limitations.

Note that if you are sending CORS requests you will have to enable withCredentials on $http to allow cookies to be sent on the XHR request.

    angular.module('myApp').config(['$httpProvider', function($httpProvider) {
        $httpProvider.defaults.withCredentials = true;
    }]);

Also ensure the server responds to the OPTIONS call with the below header:

Access-Control-Allow-Credentials: true

// As an attribute on the controller
[EnableCors("*", "*", "*", SupportsCredentials=true)]

or

// Complex scenario on the config
config.EnableCors();
var defaultPolicyProvider = new EnableCorsAttribute("*", "*", "*");
defaultPolicyProvider.SupportsCredentials = true; //important if you are sending cookies
AttributeBasedPolicyProviderFactory policyProviderFactory = new AttributeBasedPolicyProviderFactory();
policyProviderFactory.DefaultPolicyProvider = defaultPolicyProvider;
config.SetCorsPolicyProviderFactory(policyProviderFactory);

config.Routes.MapHttpRoute(
    name: "BatchApi",
    routeTemplate: "api/batch",
    defaults: null,
    constraints: null,
    handler: new CorsMessageHandler(config) { InnerHandler = new DefaultHttpBatchHandler(GlobalConfiguration.DefaultServer) });

#### batchRequestCollectionDelay

This is undoubtedly the most important option, as this module tries to be as transparent as possible to the user.

It is the default time in milliseconds the http batcher waits to collect all requests to a domain after the first batchable http call has been received. It defaults to 100ms. Therefore if you send an HTTP GET call that can be batched, the batcher will receive this call and wait a further 100ms before sending it, in order to collect other calls to the same domain and add them to the current batch request. If no other calls are collected, the initial HTTP call is allowed to continue as normal and will not be batched, unless the config property minimumBatchSize is set to one.
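The collection-window behaviour can be illustrated with a small, synchronous sketch (this is not the library's implementation; createCollector and its callbacks are hypothetical names, and the real timer fires automatically after batchRequestCollectionDelay ms):

```javascript
// Simplified, synchronous sketch of the collection window (hypothetical
// names; the real library starts a timer on the first batchable request).
function createCollector(options, sendBatch, sendSingle) {
  var pending = [];
  return {
    add: function (request) { pending.push(request); },
    // Called when the collection window closes; in the real library this
    // happens batchRequestCollectionDelay ms after the first request.
    closeWindow: function () {
      var collected = pending.splice(0, pending.length);
      if (collected.length >= (options.minimumBatchSize || 2)) {
        sendBatch(collected);          // enough requests: send as one batch
      } else {
        collected.forEach(sendSingle); // too few: fall back to normal $http
      }
    }
  };
}
```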

Immediately flushing all pending requests

In some instances you might want to immediately send all pending requests regardless of whether the request quota or timeout limit has been reached. To do this, simply call the flush method on the httpBatcher service, optionally passing in the url of the batch endpoint you want to flush (if no parameter is passed in, all pending requests to all endpoints are flushed).

angular.module('myApp', ['jcs.angular-http-batch'])
   .run([
      'httpBatcher',
          function (httpBatcher) {
             httpBatcher.flush();
         }
]);

Configuring .Net Web API 2 for Batch Requests

This is really simple, as the Web API team have done a really good job here. To enable batch request handling you just add a new route to your application and the rest is done for you! It's so easy I don't see any reason for you not to do it! See this link for a more detailed setup guide. Just add the below code to your Web API configuration class and you are good to go!

configuration.Routes.MapHttpBatchRoute(
        routeName:"batch",
        routeTemplate:"api/batch",
        batchHandler:new DefaultHttpBatchHandler(server));

Configuring for Java Servlet <= 3.1

Java Servlet <= 3.1 parses multipart requests looking for the Content-Disposition header, expecting all multipart requests to include form data. It also expects a content disposition header per request part in the batch.

Therefore you will need to set the library up to do this. Add the below to your config object when initialising the batch endpoint.

{
    batchRequestHeaders: {'Content-disposition': 'form-data'},
    batchPartRequestHeaders: {'Content-disposition': 'form-data'}
}

angular-http-batcher's People

Contributors

jonsamwell, magarcia, riaann, tamlyn, tiwariarvin, tomyam1-personal


angular-http-batcher's Issues

HTTP requests using methods specified as ignoredVerbs are still batched

UPDATE: Sorry, I just realized my own stupidity in that I'm re-writing the canBatchRequest method and thus have to handle the ignoredVerbs myself. Sorry for the inconvenience.

Dear Jon,

I’m using the latest version (1.12), and have an issue regarding the ignoredVerbs, which doesn’t seem to work as it should. Even though I have specified multiple ignoredVerbs (see code-snippet below), all requests using the corresponding HTTP-methods are still included in batches.

My configuration is as follows:

httpBatchConfigProvider
    .setAllowedBatchEndpoint(globalConfig.apiRoot, batchPath, {
        maxBatchedRequestPerCall: 25,
        minimumBatchSize: 2,
        batchRequestCollectionDelay: 100,
        ignoredVerbs: ["post", "put", "patch", "delete"],
        sendCookies: false,
        enabled: batchEnabled,
        canBatchRequest: function (url, method) {
            //......
        }
    });

Your input/help would be greatly appreciated.

Thank you in advance.

Kind regards,

Adam

Not able to call Web API 2 Batch Endpoint

I am trying to use the http batcher sample code to call the batch endpoint on ASP.NET Web API 2, but I am not able to make the batch calls. I am able to call the same batch endpoint with .NET client code without any issues.
Following is the fiddler trace when calling through http batcher,

Request:

OPTIONS http://localhost:84/api/batch HTTP/1.1
Host: devx.microsoft-tst.com:84
Connection: keep-alive
Access-Control-Request-Method: POST
Origin: http://localhost:2019
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36
Access-Control-Request-Headers: accept, content-type
Accept: */*
Referer: http://localhost:2019/Home/Index
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8

Response:

HTTP/1.1 400 Bad Request
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json; charset=utf-8
Expires: -1
Server: Microsoft-IIS/10.0
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Tue, 05 Jul 2016 14:44:43 GMT
Content-Length: 68

{"Message":"The batch request must have a \"Content-Type\" header."}

Following is the fiddler trace for .Net client,

POST http://localhost:84/api/batch HTTP/1.1
Content-Type: multipart/mixed; boundary="batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c"
Host: localhost:84
Content-Length: 857
Expect: 100-continue
Connection: Keep-Alive

--batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c
Content-Type: application/http; msgtype=request

GET /api/Customers HTTP/1.1
Host: localhost:84

--batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c
Content-Type: application/http; msgtype=request

POST /api/Customers HTTP/1.1
Host: localhost:84
Content-Type: application/json; charset=utf-8

{"Id":10,"Name":"Name 10"}
--batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c
Content-Type: application/http; msgtype=request

PUT /api/Customers/100 HTTP/1.1
Host: localhost:84
Content-Type: application/json; charset=utf-8

{"Id":100,"Name":"Peter"}
--batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c
Content-Type: application/http; msgtype=request

DELETE /api/Customers/100 HTTP/1.1
Host: localhost:84

--batch_75e852a8-ae17-4cf7-a601-1e3dbfd47f5c--

Response

HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Length: 795
Content-Type: multipart/mixed; boundary="1c3e08bd-7545-4b3a-bd0d-b291a436a7fb"
Expires: -1
Server: Microsoft-IIS/10.0
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Tue, 05 Jul 2016 12:10:19 GMT

--1c3e08bd-7545-4b3a-bd0d-b291a436a7fb
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

[{"Id":100,"Name":"Venkatesh"}]
--1c3e08bd-7545-4b3a-bd0d-b291a436a7fb
Content-Type: application/http; msgtype=response

HTTP/1.1 201 Created
Location: http://localhost:84/api/Customers
Content-Type: application/json; charset=utf-8

{"Id":100,"Name":"Venkatesh"}
--1c3e08bd-7545-4b3a-bd0d-b291a436a7fb
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{"Id":100,"Name":"Venkatesh"}
--1c3e08bd-7545-4b3a-bd0d-b291a436a7fb
Content-Type: application/http; msgtype=response

HTTP/1.1 204 No Content

--1c3e08bd-7545-4b3a-bd0d-b291a436a7fb--

Client Cache Support?

I noticed that the auto batcher does not seem to support client cached calls. Is there a plan to implement this in the future?
I was just noodling it over and am not sure what the best method would be.
I don't know how you would tell that a previous request has cache headers set. I've Googled it very quickly and haven't found much.
Any thoughts?

Totally a low priority. I'm currently working on a sprint item for our new website to investigate the caching of ajax requests and noticed they don't work with auto batcher. I'd rather use auto batcher than have them cached, so not a super huge deal, but I would like to have accurate information to get back to the team. So, let me know what your thoughts are.

Batcher not sending cookies with each request.

We are loving the http batcher but have found that it breaks our login system as it is not passing the cookies along with each header in the POST call to /api/batch.
The main headers for the overall POST have the cookies, but not the "wrapped" ones.
Example:
Here are the actual POST requests headers:

Accept
application/json, text/plain, */*
Accept-Encoding
gzip, deflate
Accept-Language
en-us
Content-Length
968
Content-Type
multipart/mixed; charset=UTF-8; boundary=1429288092518
Cookie
EktGUID=cc34ebb5-bde2-4f7c-b53d-586da408456d; EkAnalytics=0; sellang=1033; __utma=267318028.446158527
.1429210407.1429210407.1429222065.2; __utmz=267318028.1429210407.1.1.utmcsr=(direct)|utmccn=(direct)
|utmcmd=(none); _pk_id.2.6041=2cdce7d65e53ff7b.1429210407.2.1429222128.1429210600.; resolution=1920;
BC_BANDWIDTH=1429225576812X2592; ecm=user_id=10040&AuthenticationToken=dbfa7c741fb941f29df99767657582e0
&site_id=/,439052712&userfullname=blakbarn%40selinc.com&displayname=blakbarn%40selinc.com&username=blakbarn
%40selinc.com&new_site=/&unique_id=439052712&editoroptions=contentdesigner&site_preview=0&langvalue=
&isMembershipUser=1&last_login_date=4/17/2015 9:15:21 AM&dm=.ad.selinc.com&DefaultLanguage=1033&NavLanguage
=1033&SiteLanguage=1033&DefaultCurrency=840&SiteCurrency=840&UserCulture=1033&LastValidLanguageID=1033
&ekTimeZone=Pacific Standard Time; currentuser=%7B%22id%22%3A10040%2C%22authenticated%22%3Atrue%2C%22userName
%22%3A%22blakbarn%40selinc.com%22%2C%22authToken%22%3A%22dbfa7c741fb941f29df99767657582e0%22%2C%22groups
%22%3A%5B888888%2C98184%2C98046%2C98037%2C97812%2C97784%2C97713%2C664%2C396%2C384%2C138%5D%2C%22persistent
%22%3Afalse%7D; .ASPXAUTH=F2DC9511D56C1045A6320CEF7DAC67462B8FD161747D8F4A41FDBF023E5F758557FE67F068
C8B1C6139152A2E351FDC01B8517C56BB80DD72EA06452B52FD923B535F9844139C8C656E75FD138DD52E611E10D435F94A9514D8A1B4706FF733FDFBE41AC2C8B45E81220F497311963AA3AC7D6B7EEB645398B26970C7DF7C5BAC6A419D6
; SelincSession=[email protected]
Host
davesargpc.ad.selinc.com
Preview-Mode
false
Referer
https://davesargpc.ad.selinc.com/utils/elmah
User-Agent
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:37.0) Gecko/20100101 Firefox/37.0

And here is the POST data itself, notice no cookies:
--1429288092518
Content-Type: application/http; msgtype=request

GET /api/elmah/hosts/ HTTP/1.1
Host: davesargpc.ad.selinc.com
Accept: application/json, text/plain, */*
Accept-Language: en-us
Preview-Mode: false

--1429288092518
Content-Type: application/http; msgtype=request

GET /api/elmah/types/ HTTP/1.1
Host: davesargpc.ad.selinc.com
Accept: application/json, text/plain, */*
Accept-Language: en-us
Preview-Mode: false

--1429288092518
Content-Type: application/http; msgtype=request

GET /api/elmah/statuscodes/ HTTP/1.1
Host: davesargpc.ad.selinc.com
Accept: application/json, text/plain, */*
Accept-Language: en-us
Preview-Mode: false

--1429288092518
Content-Type: application/http; msgtype=request

GET /api/elmah/?filterCols=&filters=&page=1&sortCol=&sortDirection=asc HTTP/1.1
Host: davesargpc.ad.selinc.com
Accept: application/json, text/plain, */*
Accept-Language: en-us
Preview-Mode: false

--1429288092518--

Is this a bug or a config issue? We are using Web API 2

BatchRequestManager sendFn - $digest error.

I keep getting a digest error when this function runs. I added a timeout to make sure the digest is completed prior to running the callback function. Would you consider adding this timeout to the batcher?

function sendFn() {
  var self = this,
    adapter = self.getAdapter(),
    httpBatchConfig = adapter.buildRequest(self.requests, self.config);

  self.sendCallback();
  self.$injector.get('$http')(httpBatchConfig).then(function (response) {
    var batchResponses = adapter.parseResponse(self.requests, response, self.config);

    //timeout I inserted. 
    self.$timeout(function() {
      angular.forEach(batchResponses, function (batchResponse) {
        batchResponse.request.callback(
          batchResponse.statusCode,
          batchResponse.data,
          convertHeadersToString(batchResponse.headers),
          batchResponse.statusText);
      });
    },0,true);
  }, function (err) {
    angular.forEach(self.requests, function (request) {
      request.callback(err.statusCode, err.data, err.headers, err.statusText);
    });
  });
}

Incorrect call to angular.toJson on request bodies

When batching requests that have request bodies, such as "POST" requests, the standard angular request transforms convert objects into json strings. When angular-http-batcher calls angular.toJson again, we end up with a single json string, instead of a json encoding of the object in the batched request.

So, instead of {"name":"John"} as the body of the request in the batch, you get "{\"name\":\"John\"}" - which causes the parameter binding in Web API to break.

The solution we've found is to remove the angular.toJson call in your HttpBatcher. I'm happy to do a pull request if you prefer.
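The double encoding is easy to reproduce in isolation (plain JSON.stringify standing in for angular.toJson, which delegates to it):

```javascript
// Angular's default request transform has already serialised the body,
// so a second serialisation wraps the JSON in an extra string literal.
var body = { name: 'John' };
var onceEncoded = JSON.stringify(body);         // '{"name":"John"}' - what the part body should be
var twiceEncoded = JSON.stringify(onceEncoded); // '"{\"name\":\"John\"}"' - what Web API actually receives
```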

SyntaxError: Unexpected end of input

Hi,

Thank you for the great module!

I think I've found a possible bug:

SyntaxError: Unexpected end of input
at Object.parse (native)
at Object.fromJson (http://localhost:8080/bower_components/angular/angular.js:1066:14)
at angular.module.factory.convertDataToCorrectType (http://localhost:8080/bower_components/angular-http-batcher/dist/angular-http-batch.js:234:40)
at Object.angular.module.factory.process (http://localhost:8080/bower_components/angular-http-batcher/dist/angular-http-batch.js:297:43)
at http://localhost:8080/bower_components/angular-http-batcher/dist/angular-http-batch.js:377:48
at processQueue (http://localhost:8080/bower_components/angular/angular.js:13175:27)
at http://localhost:8080/bower_components/angular/angular.js:13191:27
at Scope.$get.Scope.$eval (http://localhost:8080/bower_components/angular/angular.js:14388:28)
at Scope.$get.Scope.$digest (http://localhost:8080/bower_components/angular/angular.js:14204:31)
at Scope.$digest (http://localhost:8080/bower_components/ng-stats/dist/ng-stats.js:48:17)

Possible fix:

replace the regex

regex = new RegExp('--.*--', 'i');

with

regex = new RegExp('--[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}--', 'i');

at line 289 of angular-http-batch.js.

Web Api response

--31fcc127-a593-4e1d-86f3-57e45375848f
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: application/json; charset=utf-8

{"inlineCount":35}
--31fcc127-a593-4e1d-86f3-57e45375848f
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: application/json; charset=utf-8

{"results":[{"BusinessDescription":"Some text here\r--------------------------------------------------------------------------------------------------------------------------\r"}],"inlineCount":35}
--31fcc127-a593-4e1d-86f3-57e45375848f--
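The difference between the two patterns can be seen against the response above: the loose '--.*--' also matches the run of dashes inside the BusinessDescription value, while the GUID-specific pattern only matches a real multipart terminator (a quick sketch):

```javascript
// Loose pattern from the library vs the proposed GUID-specific pattern.
var loosePattern = new RegExp('--.*--', 'i');
var guidPattern = new RegExp('--[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}--', 'i');

// A body line containing a long dashed run, like the response above.
var bodyLine = 'Some text here\r--------------------\r';
// A genuine multipart terminator boundary.
var terminator = '--31fcc127-a593-4e1d-86f3-57e45375848f--';

loosePattern.test(bodyLine);  // true - falsely treated as the end of the batch
guidPattern.test(bodyLine);   // false
guidPattern.test(terminator); // true
```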

$http.get leading to "TypeError: Cannot read property 'callback' of undefined"

Apologies in advance if I'm doing something stupid...

Been using http-batch for a while with no issues. Most of my calls to the server go through $resource and that seems to be working fine.

I need to make a call using $http as part of a validator, and for some reason I get:

angular.js:11782 TypeError: Cannot read property 'callback' of undefined
    at angular-http-batch.js:585

My code is as simple as it gets...:

$http.get('http://myUrl...').then(function (data) { ... });

So not sure what's causing the issue. I also had the issue in 1.6.0 before upgrading to latest this morning.

I also tried disabling httpBatcher for this call, but setting httpBatcher.enabled = false; before making the call had no effect.

Usage with NodeJS

There are many batch request handlers for NodeJS (e.g. multifetch, batch-endpoint, express-batch) but they batch requests by specifying them in the query string and combining the JSON responses in the body. I can't seem to find any that support multipart/mixed responses. Would you consider adding support for this alternate batching format? Or perhaps provide callbacks to allow customising the batching and unwrapping strategy?

Syntax Error parsing content

Great little utility module - thanks for sharing it!

I am running into some issues handling the content returned.

For example, when an array is returned, it ends up trying to convert "[" to a type and dies with syntax error.

the exception occurs in convertDataToCorrectType, called from line 276:
result.data = convertDataToCorrectType(result.contentType, responsePart);

here is a sample of the returned content in this case;

--79effe20-dcca-43e8-922b-34f7df66446e
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

[
  {
    "caption": "A",
    "items": [
      {
        "a": "b",
        "c": "d"
      },
      {
        "a": "b",
        "c": "d"
      }
    ]
  },
  {
    "caption": "B",
    "items": [
      {
        "a": "b",
        "c": "d"
      },
      {
        "a": "b",
        "c": "d"
      }
    ]
  }
]

It looks like it is expected that all the data is on 1 line but I am running this in debug mode.
I tried to work around this by altering your code a bit

e.g.

                        } else if (result.data === undefined && parsedSpaceBetweenHeadersAndMessage) {
                            var joinedData = responseParts.slice(i, responseParts.length).join('');
                            result.data = convertDataToCorrectType(result.contentType, joinedData );
                            break;
                        }

this works for most cases but the last part will include the multi-part separator so convertDataToCorrectType will fail. I can keep working around the formatting and handle that also but this is starting to get hacky so I thought I'd check to see if there is a better way to handle this or if anyone else has encountered this issue.

Handling server errors

If a server returns a 500 response (and so doesn't provide a response in the expected batched format), it doesn't seem to be getting handled.

This prevents us from doing simple error handling using $resource().$promise.catch()

Am I missing something? (p.s. love this component :))

Update npm package

The npm package version is a bit behind the latest version on github, which has a few bug fixes that I would like to consume. Can you bump the npm version?

setAllowedBatchEndpoint ignores enabled

When sending enabled: false to setAllowedBatchEndpoint, the requests are still being sent via a batched request.

var batchOptions = {
  ignoredVerbs: ['head', 'put', 'post', 'delete'],
  enabled: false
};
batchConfigProvider.setAllowedBatchEndpoint('/api', '/batch', batchOptions);

// batch requests are enabled...

Several different rootUrls for the same app?

Hi,

Sorry if my question is too noobish, but I'd like to know if it is possible to specify, for the same app, several different endpoints, and if yes, how do I tell which one I'd like to use, as it seems the config through the "httpBatchConfigProvider" is whole-app wide.

Should I specify it by angular.js modules or something ?

Thanks a lot for help!!

Flushing Requests

How does the module handle or support the need to spontaneously send the batch?

For example, on unload we would like to send the most recent batch regardless of quota or time being hit.

OData adapter needed

Hi,
I hope I don't seem too aggressive but I dare to kinda repost my message here, instead as a comment in the issue #2 . This really would be a great addition as a feature, to have an OData adapter


I'm gonna need an OData adapter which indeed nests the requests, as we can see here:
Official OData doc, point "2.2 Batch Request Body".

You see it nests a "changeset_<hash>" into a "batch_<hash>". Example:

--batch_36522ad7-fc75-4b56-8c71-56071383e77b
Content-Type: multipart/mixed; boundary=changeset_77162fcd-b8da-41ac-a9f8-9357efbbd621
Content-Length: ###

--changeset_77162fcd-b8da-41ac-a9f8-9357efbbd621
Content-Type: application/http
Content-Transfer-Encoding: binary

POST /service/Customers HTTP/1.1
Host: host

Content-Type: application/atom+xml;type=entry
Content-Length: ###

.... more changesets ....

It probably isn't too hard to write such an adapter, but I would like to know if something has already been done for this use case with OData @georgeolson @jonsamwell?

Thanks a lot for your help

Adapter for batch-request

As far as I have been able to find, batch-request is one of the most full-featured batch request libraries available for Node.js.

The format for the request looks like

{
    "myRequest1": {
        "method": "GET",
        "uri": "http://api.mysite.com/users/1/first_name"
    },
    "myRequest2": {
        "method": "GET",
        "dependency": "myRequest1",
        "uri": "http://api.mysite.com/users/1"
    },
    "myRequest3": {
        "method": "GET",
        "uri": "http://api.mysite.com/users/1/company"
    },
}

While the response may look something like

{
    "myRequest1": {
        "statusCode": 200,
        "body": "Victor",
        "headers": {...}
    },
    "myRequest2": {
        "statusCode": 200,
        "body": "[email protected]",
        "headers": {...}
    },
    "myRequest3": {
        "statusCode": 200,
        "body": "SocialRadar",
        "headers": {...}
    },
}

I might work on a solution for this if I get some time.

relative urls don't work

all of our api endpoints are relative, e.g.
var svc = $resource('/api/customers');

the batcher does not like this and ends up mutating calls to:
http://pi/customers
host: pi

I fixed this in getUrlInfo - we can just use $location.host() and $location.protocol(), and the url coming in is already the relative url.

                    getUrlInfo = function (url) {
                        return {
                            protocol: $location.protocol(),
                            host: $location.host(),
                            relativeUrl: url // the incoming url is already relative
                        };
                    };

and now this works with this configuration:

httpBatchConfigProvider.setAllowedBatchEndpoint('/api', '/api/batch');

It would probably be better to make this a configuration option and/or inspect the url more closely in getUrlInfo() to see if it is already relative, in which case use the $location protocol/host.
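Inspecting the URL first, as suggested, could look like the following sketch (the helper name is mine; the real fix would live inside getUrlInfo and fall back to $location for relative URLs):

```javascript
// Illustrative helper: a URL is "already relative" when it has neither a
// scheme ("http://", "https://", ...) nor a protocol-relative "//" prefix.
function isRelativeUrl(url) {
  return !(/^[a-z][a-z0-9+.-]*:\/\//i.test(url) || url.indexOf('//') === 0);
}
```

getUrlInfo could then use $location.protocol() / $location.host() only when this returns true, and parse the URL as before otherwise.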

Batch request (http multipart) response structure

Hi

We want to start using your framework.
Our server side is Java (Tomcat), so we need to write the processing of the multipart request ourselves and assemble the aggregated response to be delivered back to the client.
I couldn't find in your documentation what the structure of the response should be.
I searched and found these two links describing what a multipart response should look like, but when comparing them to your code that processes the batch response, there are some inconsistencies:
https://www.w3.org/Protocols/rfc2616/rfc2616-sec19.html
https://tools.ietf.org/id/draft-snell-http-batch-00.html#http-batch

After investigating the processResponse method, I came to the structure below:
--[boundary]
content-type: application/ (must be first among all headers, otherwise headers will be omitted)
[headers]..[headers]
[status (in the form of version code text)]
[empty line to separate the metadata from the actual response data]
[the actual response data, which continues until the next --boundary appears]
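A concrete response body following that structure might look like the sketch below (the boundary value and JSON payload are made up for the example):

```javascript
// Illustrative multipart batch response following the structure analysed above.
// The boundary value and payload are assumptions, not taken from the library.
var boundary = 'batchresponse_123';
var responseBody = [
  '--' + boundary,
  'Content-Type: application/http', // must be the first header
  'HTTP/1.1 200 OK',                // status in the form "version code text"
  '',                               // blank line: metadata ends here
  '{"name":"Victor"}',              // response data, up to the next boundary
  '--' + boundary + '--'            // closing boundary
].join('\r\n');

// Parts between boundary markers map back to requests purely by position.
var parts = responseBody.split('--' + boundary).filter(function (p) {
  return p.trim() && p.trim() !== '--';
});
```

Splitting on the boundary yields one part per response, in the same order as the batched requests.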

In addition, it seems there is an assumption that the order of the responses is the same as the order of the requests (so the correlation between response and request comes from their matching order).

Finally, I saw that the parsing of the Date header, if one exists, omits its time and timezone fragments (you treat it as a regular header and hardcode taking only the first part of it).

My questions are:
Am I correct in my analysis? Did I miss something or get confused by anything? Are there any other limitations I need to keep in mind in order to produce a properly structured response?
Do you think you could add a complete, formal structure of the response to the documentation?

Exception when using with cache:true

I found that the plugin throws an exception when used with a cache: true definition like so:
'getAll': { method: "GET", cache: true, isArray: true },
It appears Angular is unable to parse the headers when the cache is enabled;
please check it out.

NPM module

Hi,

This is a very useful module. Do you have any plans to release it as an npm module? My team and I build our code using CommonJS syntax, and currently have to shim this into our project.

Thanks!

Adapter for grape-batch request

We are using grape-batch to handle batch requests on the server.

A POST HTTP request is expected with a body like:

{
  "requests": [
    {
      "method": "GET",
      "path": "/api/v1/users"
    },
    {
      "method": "POST",
      "path": "/api/v1/login",
      "body": { "token": "nrg55xwrd45" }
    }
  ]
}
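Serialising a list of pending requests into that shape is straightforward; a minimal sketch (the function name and input shape are mine, not part of the library or of grape-batch):

```javascript
// Illustrative: convert a list of pending requests into a grape-batch payload.
// Each input entry is assumed to carry method, path and an optional body.
function toGrapeBatchPayload(pendingRequests) {
  return {
    requests: pendingRequests.map(function (req) {
      var entry = { method: req.method, path: req.path };
      if (req.body !== undefined) { entry.body = req.body; } // only POST-like requests carry a body
      return entry;
    })
  };
}
```

The resulting object can be JSON-encoded and sent as the body of the single POST to the batch endpoint.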

Absolute URL in config does not work

I blew about three hours trying to figure out why the batcher wasn't working after first installing it in our code base. I followed the initial setup to a tee and confirmed the Web API was working as expected. I ended up creating a whole separate Angular app to test it and still couldn't get it to work until I switched the absolute URLs shown in the demo to relative ones. Once I did that, it began to work.
Does not work:

    angular.module('app').config(['httpBatchConfigProvider',
        function (httpBatchConfigProvider) {
            httpBatchConfigProvider.setAllowedBatchEndpoint(
                // root endpoint url
                'https://url.whatever.com/api',
                // endpoint batch address
                'https://url.whatever.com/api/batch',

                // optional configuration parameters
                {
                    maxBatchedRequestPerCall: 30
                }
            );
        }
    ]);

Works:

    angular.module('app').config(['httpBatchConfigProvider',
        function (httpBatchConfigProvider) {
            httpBatchConfigProvider.setAllowedBatchEndpoint(
                // root endpoint url
                '/api',
                // endpoint batch address
                '/api/batch',

                // optional configuration parameters
                {
                    maxBatchedRequestPerCall: 30
                }
            );
        }
    ]);

I'm totally OK with the relative paths; it's really what I wanted anyway, but I was trying to replicate the exact setup shown in the instructions to get it up and running.
I don't know whether this is a bug in the code base, but changing the instructions would be very helpful.

Thanks!

Easy way to disable batcher for development and testing

I've run into a case where the batcher is breaking our login-checking system. As part of troubleshooting this I wanted to shut off the batcher, but I can't find an easy way of doing so (short of commenting out the main js file, the config js file, and the line where we inject it into the app, which is kind of painful).
Is there a way to do this in the config? I tried setting maxBatchedRequestPerCall to 1, but that didn't do it.
If not, can we get a config option that would completely turn off batching for testing?
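As far as I can tell the library documents no off switch, but since batching only applies to registered endpoints, one workaround is to skip the registration entirely in development. A sketch, assuming an `enableBatching` flag you define per environment yourself:

```javascript
// Illustrative workaround: only register the batch endpoint when a flag is set.
// With no endpoints registered, every request goes through $http unbatched.
// The flag and the '/api' URLs are assumptions for the example.
function configureBatching(httpBatchConfigProvider, enableBatching) {
  if (!enableBatching) { return false; } // nothing registered => nothing batched
  httpBatchConfigProvider.setAllowedBatchEndpoint('/api', '/api/batch', {
    maxBatchedRequestPerCall: 30
  });
  return true;
}
```

In the app this would be called from the .config block with httpBatchConfigProvider injected, passing the environment flag.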
