kwhitley / apicache
Simple API-caching middleware for Express/Node.
License: MIT License
It is hard to follow the changes without any milestones or release notes.
Also, you should not tag internal versions, never release them, and then jump directly to, for example, 0.7.0.
With the memory store, this code works:
const apicache = require('apicache');
const cache = apicache.middleware;
router.get('/someroute', cache('1 hour'), (req, res) => {
req.apicacheGroup = 'cachekey'; // Set cache
// ...
});
router.post('/someroute/', (req, res) => {
apicache.clear('cachekey'); // Clear cache
// ...
});
But if I set the redisClient option, the apicache.clear('cachekey')
function does not clear the cache as expected:
const redis = require('redis').createClient();
apicache.options({redisClient: redis}); // This breaks
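A workaround until clear() is fixed for the redis store might be to track cache keys yourself and delete them directly on the client. This is a sketch, not apicache's API: makeGroupClearer and its methods are hypothetical names, and redisClient stands for any object with a del(key) method.

```javascript
// Minimal sketch: track cache keys per group ourselves and delete them
// directly on the redis client, bypassing apicache.clear().
function makeGroupClearer(redisClient) {
  const groups = {}; // groupName -> array of cache keys

  return {
    // call this wherever you would set req.apicacheGroup
    track(groupName, cacheKey) {
      (groups[groupName] = groups[groupName] || []).push(cacheKey);
    },
    // delete every tracked key in the group from redis
    clear(groupName) {
      const keys = groups[groupName] || [];
      keys.forEach((key) => redisClient.del(key));
      delete groups[groupName];
      return keys.length;
    },
  };
}
```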
Huge request to you guys out there, now that this actually has enough of a user-base to help support it:
I would LOVE someone to write a test suite for this (using whatever flavor they prefer, although I've always preferred mocha
myself).
...
For those that don't know, I wrote this library years ago to support my own blog engine that sported an (obviously) very simple JSON API. Since then I've been torn a million ways, both in my role as a developer (which took me overseas to Kuwait for the last two years), as well as my passion (and future career) in wilderness fine art photography. I appreciate each and every issue, PR, etc that everyone has brought up, and offer my sincerest apologies for not being able to respond in anything even close to a timely manner. I'm trying to update everything right now, merge PRs, update docs, and ultimately tag a new release. If anyone would like to take on a more active role as contributor to this library, updating it and expanding features (while preserving the simplicity around which it was first designed), let's chat!
Cheers,
Kevin
http://krwhitley.com
https://www.facebook.com/kevin.r.whitley
I was thinking it may be a good idea to add support for binary and other types, rather than only JSON.
Redis claims to have good support for binary types (see http://redis.io/topics/data-types), which would make encoding/decoding unnecessary.
We would also need a mechanism to store the MIME type (edit: the Content-Type header).
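To sketch what storing the Content-Type alongside a binary body could look like, here is one possible shape, assuming a string-only store so buffers are base64-encoded; packEntry and unpackEntry are hypothetical helper names, not apicache functions.

```javascript
// Sketch: store the body together with its Content-Type so binary
// payloads survive a round trip through a string-only store.
function packEntry(body, contentType) {
  const isBuffer = Buffer.isBuffer(body);
  return JSON.stringify({
    contentType: contentType,
    encoding: isBuffer ? 'base64' : 'utf8',
    data: isBuffer ? body.toString('base64') : body,
  });
}

function unpackEntry(stored) {
  const entry = JSON.parse(stored);
  const body = entry.encoding === 'base64'
    ? Buffer.from(entry.data, 'base64')
    : entry.data;
  return { body: body, contentType: entry.contentType };
}
```

With a redis store backed by binary-safe values, the base64 step could be dropped and only the Content-Type would need to travel alongside the payload.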
globalOptions.redisClient.del(); fails with:
ReplyError: ERR wrong number of arguments for 'del' command
at parseError (/home/fiouch/diplomski-fvoska/node_modules/redis-parser/lib/parser.js:181:12)
at parseType (/home/fiouch/diplomski-fvoska/node_modules/redis-parser/lib/parser.js:269:14)
The Redis DEL command requires at least one key argument, per the documentation.
Not sure if this has changed over time, as I have only recently started using redis.
I temporarily fixed it on my end by replacing globalOptions.redisClient.del(); with globalOptions.redisClient.flushall();, but this flushes the whole redis instance, not only apicache items. DEL should be given all keys used by apicache (maybe taken from index.all?) as parameters.
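A sketch of what a redis-safe clear could look like, assuming index.all holds the list of cached URLs as suggested above; clearApicacheKeys is a hypothetical helper, and the no-argument DEL is avoided since redis rejects it.

```javascript
// Sketch: instead of flushall(), delete only the keys apicache knows
// about. `index.all` is assumed to be the array of cached URLs.
function clearApicacheKeys(redisClient, index) {
  const keys = (index && index.all) || [];
  if (keys.length === 0) return 0; // DEL with no args is an error in redis
  redisClient.del.apply(redisClient, keys); // DEL key1 key2 ...
  return keys.length;
}
```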
In my testing I am finding that the index object and the cache groups (apicacheGroup) are both only populated in debug mode.
Is this intentional? It seems to me that the console.log should be behind the 'if debug mode', but not the other code in that section. Perhaps the two 'if' statements are the wrong way around?
Either that or I am doing something wrong...
Thanks in advance for any feedback.
Hi, I have the same routes for different companies and users. We get the company ID (and in other cases the user ID) from a security token supplied at login.
We then filter by this company ID, which arrives in a header on the request.
How could I add that (a MongoDB ID) to the cache key? Something like 52911feb8f819b571000000b.
Is there something like:
import apicache from 'apicache'
let cache = apicache.middleware
app.use(cache('5 minutes'))
// routes are automatically added to index, but may be further added
// to groups for quick deleting of collections
app.get('/api/articles', (req, res) => {
req.apicacheKey = '{5720f4980882091000b66c40}/api/articles'
req.apicacheGroup = req.params.collection
res.json({ success: true })
})
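One way this might work with apicache's appendKey option (an array of req property names): copy the company ID from the token onto req in an earlier middleware, then point appendKey at it. The header name x-company-id and the property req.companyId are assumptions for illustration.

```javascript
// Hypothetical names: 'x-company-id' header and req.companyId.
// appendKey: ['companyId'] would tell apicache to append req.companyId
// to the cache key, so each company gets its own cache entry.
function companyIdMiddleware(req, res, next) {
  // in the real app this would come from the decoded security token
  req.companyId = req.headers['x-company-id'];
  next();
}

// apicache.options({ appendKey: ['companyId'] });
// app.use(companyIdMiddleware);
// app.get('/api/articles', cache('5 minutes'), handler);
```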
Hi, every once in a while my redis server is unresponsive, which generates the error below. Since it's not caught and handled, it stops express. Can this be caught in apicache?
events.js:160
throw er; // Unhandled 'error' event
^
Error: Redis connection to someredisserver.com failed - read ECONNRESET
at exports._errnoException (util.js:1026:11)
at TCP.onread (net.js:569:26)
When installing apicache as a dependency via npm:
npm WARN deprecated [email protected]: lodash@<2.0.0 is no longer maintained. Upgrade to lodash@^3.0.0
Should this section be updated? In my experience, this caches all routes defined after it.
import apicache from 'apicache'
let cache = apicache.middleware
- app.use(cache('5 minutes'))
// routes are automatically added to index, but may be further added
// to groups for quick deleting of collections
app.get('/api/:collection/:item?', (req, res) => {
req.apicacheGroup = req.params.collection
res.json({ success: true })
})
// add route to display cache index
app.get('/api/cache/index', (req, res) => {
res.json(apicache.getIndex())
})
// add route to manually clear target/group
app.get('/api/cache/clear/:target?', (req, res) => {
res.json(apicache.clear(req.params.target))
})
+ app.use(cache('5 minutes'))
The source of issue appears to be that setTimeout breaks on any millisecond value that doesn't fit in a 32-bit int. See: ptarjan/node-cache#35
If there's no easy workaround (I suspect not), it'd be good to mention this limitation in the docs since it looks like the apicache code is designed to handle longer time periods like months/years.
As an aside, it would be nice to have a "cache indefinitely until cleared" option, which was my original use case for setting such a long time in the first place. memory-cache supports it by omitting the time argument.
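A possible workaround sketch: chain timeouts so durations beyond the 32-bit limit still fire at the right time. longTimeout is a hypothetical helper, not part of apicache.

```javascript
// setTimeout overflows above 2^31 - 1 ms (~24.8 days) and fires
// immediately. Sketch: chain shorter timeouts until the full duration
// has elapsed, so 'month'/'year' durations still expire correctly.
const MAX_TIMEOUT_MS = Math.pow(2, 31) - 1;

function longTimeout(fn, ms) {
  if (ms <= MAX_TIMEOUT_MS) return setTimeout(fn, ms);
  // re-arm with the remainder once the maximum chunk has passed
  return setTimeout(() => longTimeout(fn, ms - MAX_TIMEOUT_MS), MAX_TIMEOUT_MS);
}
```

Note the returned handle only covers the current chunk, so cancellation of a chained timeout would need a small wrapper as well.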
I'm not exactly sure what is causing the issue, but on occasion I get the following error.
TypeError: Cannot convert undefined or null to object
at Function.assign ()
at sendCachedResponse (C:\Users\Matthew\Desktop\CDMMS\node_modules\apicache\src\apicache.js:129:12)
at cache (C:\Users\Matthew\Desktop\CDMMS\node_modules\apicache\src\apicache.js:281:18)
at Layer.handle [as handle_request] (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\layer.js:95:5)
at next (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\route.js:131:13)
at Route.dispatch (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\route.js:112:3)
at Layer.handle [as handle_request] (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\layer.js:95:5)
at C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:277:22
at Function.process_params (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:330:12)
at next (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:271:10)
at Function.handle (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:176:3)
at router (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:46:12)
at Layer.handle [as handle_request] (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\layer.js:95:5)
at trim_prefix (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:312:13)
at C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:280:7
at Function.process_params (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:330:12)
at next (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:271:10)
at Function.handle (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:176:3)
at router (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:46:12)
at Layer.handle [as handle_request] (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\layer.js:95:5)
at trim_prefix (C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:312:13)
at C:\Users\Matthew\Desktop\CDMMS\node_modules\express\lib\router\index.js:280:7
Editing sendCachedResponse to include a check for null seems to resolve the issue
function sendCachedResponse(response, cacheObject) {
  if (response._headers != null) {
    Object.assign(response._headers, cacheObject.headers, {
      'apicache-store': globalOptions.redisClient ? 'redis' : 'memory',
      'apicache-version': pkg.version
    })
  }
  // ...rest of the function unchanged
}
Hi,
So, I am trying to serve different sub-domains from the same express server. The subdomains are language and country based (en-sa, ar-sa, ...) and I am parsing the context into a jade file before sending it to the client.
The issue is that, since the caching is route based, the homepage route '/' serves the same HTML for all sub-domains.
How do you think I can handle caching different homepages for different sub-domains?
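One possible approach, assuming the appendKey option accepts req property names: copy the host onto req in an earlier middleware so each sub-domain gets its own cache entry for '/'. hostKeyMiddleware and req.cacheHost are illustrative names.

```javascript
// Sketch: fold the sub-domain into the cache key so '/' on en-sa and
// ar-sa cache separately. Copy the host onto req, then point apicache
// at it with appendKey.
function hostKeyMiddleware(req, res, next) {
  req.cacheHost = req.headers.host || ''; // e.g. 'en-sa.example.com'
  next();
}

// apicache.options({ appendKey: ['cacheHost'] });
// app.use(hostKeyMiddleware);
```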
As it stands, the apicache middleware breaks Express's automatic handling of ETags when the client supplies an If-None-Match header. I'm presuming it's something to do with overriding res.end, but either way this is a useful feature that can reduce data sent to the client, which is especially valuable in bandwidth-constrained scenarios like mobile apps. It's a shame to have to choose between reducing traffic to the backend or payloads to the client. Here's a failing test case:
it('respects if-none-match header', function() {
var app = mockAPI.create('10 seconds')
return request(app)
.get('/api/movies')
.expect(200)
.then(function(res) {
return res.headers['etag']
})
.then(function(etag) {
return request(app)
.get('/api/movies')
.set('if-none-match', etag)
.expect(304)
.expect('etag', etag)
})
})
Hi,
These are the messages I am getting in the console. The calls are roughly 1 second apart.
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: path "/firstApi?param=value/appendKey=value" not found in cache
[api-cache]: adding cache entry for "/firstApi?param=value/appendKey=value" @ 5000 milliseconds
[api-cache]: adding cache entry for "/firstApi?param=value/appendKey=value" @ 5000 milliseconds
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: returning redis cached version of "/firstApi?param=value/appendKey=value"
[api-cache]: returning redis cached version of "/firstApi?param=anotherValue/appendKey=anotherValue"
[api-cache]: path "/firstApi?param=anotherValue/appendKey=anotherValue" not found in cache
[api-cache]: adding cache entry for "/firstApi?param=anotherValue/appendKey=anotherValue" @ 5000 milliseconds
[api-cache]: adding cache entry for "/firstApi?param=anotherValue/appendKey=anotherValue" @ 5000 milliseconds
What do you think the issue might be?
Trying to use appendKey in my app with appendKey: ['user', '_id'], but req.user is not set unless the user is logged in. This causes the middleware to throw TypeError: Cannot read property '_id' of undefined at Layer.cache (apicache.js:296:32).
I'd expect the default behavior to be: append nothing if the key is not available.
https://github.com/kwhitley/apicache/blob/master/src/apicache.js#L292-L299
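Until the middleware guards against missing keys, a defensive middleware mounted before the cache can guarantee req.user exists. ensureUser and the 'anonymous' default are assumptions for illustration, not apicache behavior.

```javascript
// Workaround sketch: guarantee req.user exists before apicache walks
// appendKey: ['user', '_id'], so anonymous requests share one cache
// entry instead of throwing.
function ensureUser(req, res, next) {
  if (!req.user) req.user = { _id: 'anonymous' }; // hypothetical default
  next();
}

// app.use(ensureUser); // mount before cache('...') middleware
```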
Hi,
why is a cached URL still in the index after it has expired?
Example:
router.get('/', cache('2 seconds'), function(req, res)
Now, I do 4 request, and getIndex returns:
"all": [
"/agents",
"/agents"
],
"groups": {
"agents": [
"/agents",
"/agents"
]
}
Why are there 2 entries ("/agents") instead of just 1? Are all the entries shown in getIndex held in memory? It seems like expired cached requests are not deleted from memory.
Thanks.
In particular, for testing sake and the ability to tell at a glance (from the client) if you're seeing a cached version or not - I'm going to embed (and namespace) some apicache response headers:
Thinking of something along the lines of:
{
apicache: 'v0.1.0', // on cached responses only
apicache-duration: '1 week' // because who reads in milliseconds?
}
Using code:
app.get('/api/collection/:id?', cache('5 minutes'), (req, res) => {
console.log("hit")
res.json({ foo: 'bar' })
})
Apicache overwrites the res.end and res.write methods in makeResponseCacheable in apicache.js, but these methods are called with an empty string as content when the response is 304, yet "hit" is still displayed. Is there any way to work around this?
Apicache is failing to cache the 'content-encoding' header. As a result, when you retrieve something from cache that is gzipped, the browser (or whoever is viewing the document from cache) does not realize that the content should be decompressed before display.
Hi,
Having a bit of an issue when trying to implement. My sample code is as follows:
var express = require('express');
var apicache = require('apicache');
apicache.options({ debug: true });
var cache = apicache.middleware();
var router = express.Router();
router.get('/test', cache('1 hour'), function (req, res) {
//code in here
});
When the app loads, I get the error 'Cannot read property 'x-apicache-bypass' of undefined'.
Any idea on what I might be missing?
Hi, how do you set the TTL for a record when using redis as the cache store?
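As far as I can tell, the duration string passed to the middleware is the record's TTL regardless of store. A sketch of how such a human-readable duration maps to milliseconds, similar in spirit to (but not a copy of) apicache's internal parsing; toMilliseconds and its unit table are assumptions.

```javascript
// Sketch: map a human-readable duration like '30 seconds' to a TTL
// in milliseconds, the value a store would use for expiry.
const UNITS = { second: 1000, minute: 60000, hour: 3600000, day: 86400000 };

function toMilliseconds(str) {
  const match = /^(\d+)\s*(second|minute|hour|day)s?$/.exec(str.trim());
  if (!match) throw new Error('unparseable duration: ' + str);
  return Number(match[1]) * UNITS[match[2]];
}
```

So cache('30 seconds') would give each redis record a 30-second TTL under this reading.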
Hi all, I need some advice.
I'm trying to cache like this, but nothing happens.
const express = require('express');
const router = express.Router();
const apicache = require('apicache');
const backend = require('../integrations/backend');
let cache = apicache.middleware;
router.get('/events', cache('5 minutes'), function(req, res, next) {
backend.getEvents('',(result, error) => {
if (error) {
return;
}
res.setHeader('Content-Type', 'application/json');
res.send(result);
});
});
But if I get the info from the backend in a synchronous manner, the cache works.
Instead of "x-apicache-bypass": true, why not honor cache-control: no-cache and allow the request to set the cache length?
Great breakdown of the different options: http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
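A sketch of what honoring the client's Cache-Control request header could look like via the middleware's toggle argument (returning false bypasses the cache); respectCacheControl is a hypothetical name, not an apicache feature.

```javascript
// Sketch: bypass the cache when the client sends
// 'Cache-Control: no-cache', instead of the custom
// x-apicache-bypass header.
function respectCacheControl(req, res) {
  const cc = req.headers['cache-control'] || '';
  return !cc.includes('no-cache'); // false => bypass the cache
}

// usage: app.use(cache('5 minutes', respectCacheControl))
```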
Right now, if you request the same URL concurrently, both requests will bypass the cache and drop into the express route. I think it would be preferable if only the request seen first by express drops into the route, and the 2nd request will await the response from the route, and return that.
I'm thinking that it should be possible to create a shared promise per URL, that once resolved, returns JSON to the clients. This way, 1000 concurrent requests would look like 1 request to the underlying app. I'm thinking that this is the more preferable than current behaviour.
This becomes a significant problem if you have a relatively short cache duration with a lot of concurrent requests, since every cache-duration seconds, your app will halt to a crawl before new cache keys get inserted.
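The shared-promise idea above can be sketched as per-key coalescing, independent of apicache's internals; coalesce and the inflight map are illustrative names.

```javascript
// Sketch of per-key request coalescing: concurrent callers for the
// same key share one in-flight promise, so N concurrent misses cost
// one call to the underlying fetcher.
const inflight = new Map();

function coalesce(key, fetcher) {
  if (inflight.has(key)) return inflight.get(key); // join existing request
  const p = Promise.resolve()
    .then(fetcher)
    .finally(() => inflight.delete(key)); // allow a fresh fetch next time
  inflight.set(key, p);
  return p;
}
```

Once resolved, the result would be written to the cache as usual; only the window between miss and response is deduplicated here.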
I am happy with the apicache module, especially the part where you can write human-readable cache expirations.
I am also having trouble with some couchdb/pouchdb backends which return 201 when POSTed. I can set
statusCodes: {
exclude: [201]
}
but somehow the same URL returns 200 and future POSTs are ignored.
My suggestion is to add ignore_hosts functionality.
Thanks
Edit: ignoring methods such as POST would also be great.
@kwhitley
Running apicache's .clear() with no parameters gives 'this.delete is not a function' at \apicache\lib\memory-cache.js:54:26.
This would allow each segment of your app to have a unique instance (complete with rules, index, etc) rather than sharing one...
Also hugely helpful with creating unit tests.
Thoughts?
Obviously unless I can figure out a clever way to do this, this would count as a major breaking change and be slated for the future major release version.
Rather than manually wiring up individual routes for exposing cache, I'd like to include an apicache.control
middleware for easy injection into a project.
Example:
import express from 'express';
import apicache from 'apicache';
const app = express();
// basic example
app.use('/api/cache', apicache.control);
// basic example with authentication middleware to prevent unauthorized cache purges, etc
app.use('/api/cache', authMiddleware, apicache.control);
/* PROPOSED ROUTES
../cache/index --> shows index
../cache/clear --> clears all cache
../cache/clear/{group} --> clears cache group (if grouping used)
../cache/expires/{time} --> ??? set default timeout on the fly
*/
We have an Express Node API which includes different controllers, wired up through Routes.Index.js.
When we implement apicache, it seems to return the same response even if we change the request parameters.
E.g.:
Login API: apicache implemented.
The first time we request with credentials Username: aFirst and Password: aFirst, we get data from the API.
The second time we request with credentials Username: aSecond and Password: aSecond, we still get the first response.
It should refresh the cache when the input parameters change for GET and POST APIs.
Thanks.
I've tried following your instructions and I think I've done it right yet nothing is getting cached. Am I doing something wrong or is this a bug? Below is the code in which I am trying to use your cache. You'll see one line with req.apicacheGroup
commented out. I'd like to use this but disabled it in trying to debug stuff. The response given is a page full of JSON (not just a single key: value). Thanks for any insight you can provide.
var apicache = require('apicache').options({ debug: true }).middleware;
var converter = require('csvtojson').core.Converter;
var express = require('express');
var request = require('request');
var router = express.Router();
/*
* GET home page.
*/
router.get('/', function(req, res) {
res.render('index');
});
/*
* Routes per school
*/
// Pull in the file(s) from each school
var westga = require('./schools/westga');
// westga's routes
router.get('/westga/bulletin/:term', apicache('60 minutes'), function(req, res) {
//req.apicacheGroup = req.params.term
var term = req.params.term;
if ( westga.validTerms.indexOf(term) >= 0 ) {
res.writeHead(200, {"Content-Type": "application/json"});
switch (term) {
case 'fall':
var url = westga.bulletin + westga.fall;
break;
case 'spring':
var url = westga.bulletin + westga.spring;
break;
case 'summer':
var url = westga.bulletin + westga.summer;
break;
default:
console.log('Something went wrong...');
}
// Don't save everything to memory. This facilitates large CSV's
var csvConverterForWestgaBulletin = new converter({constructResult:false});
console.log(url);
request.get(url).pipe(csvConverterForWestgaBulletin).pipe(res);
} else {
res.status(404);
res.render('404.jade', {title: '404: File Not Found'});
console.log('invalid term provided');
}
});
router.get('/westga/catalog', apicache('1 month'), function(req, res) {
var url = westga.catalog + westga.currentCatalog;
// Don't save everything to memory. This facilitates large CSV's
var csvConverterForWestgaCatalog = new converter({constructResult:false});
console.log(url);
res.writeHead(200, {"Content-Type": "application/json"});
request.get(url).pipe(csvConverterForWestgaCatalog).pipe(res);
});
module.exports = router;
We are trying out apicache and it works perfectly with the redis client. But once we plugged in the SSO node module, the read operation fails abruptly. I can see [api-cache]: returning redis cached version of "/api/userdetails",
but something goes wrong after that.
If I remove the SSO implementation, everything works fine.
This is the first time I'm trying out apicache with this app, which otherwise works fine with the SSO module.
Please share your thoughts on what might be going wrong.
[api-cache]: returning redis cached version of "/api/userdetails"
Thu, 11 Aug 2016 16:26:17 GMT uncaughtException The header content contains invalid characters
TypeError: The header content contains invalid characters
at ServerResponse.OutgoingMessage.setHeader (_http_outgoing.js:348:11)
at ServerResponse.header (C:\Projects\hub\wayne\hub-webapp\node_modules\express\lib\response.js:719:10)
at ServerResponse.header (C:\Projects\hub\wayne\hub-webapp\node_modules\express\lib\response.js:722:12)
at C:\Projects\hub\wayne\hub-webapp\node_modules\apicache\lib\apicache.js:170:21
at tryCatcher (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\util.js:16:23)
at Promise.successAdapter [as _fulfillmentHandler0] (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\nodeify.js:23:30)
at Promise._settlePromise (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\promise.js:558:21)
at Promise._settlePromise0 (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\promise.js:606:10)
at Promise._settlePromises (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\promise.js:685:18)
at Async._drainQueue (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\async.js:138:16)
at Async._drainQueues (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\async.js:148:10)
at Immediate.Async.drainQueues [as _onImmediate] (C:\Projects\hub\wayne\hub-webapp\node_modules\bluebird\js\release\async.js:17:14)
at tryOnImmediate (timers.js:543:15)
at processImmediate [as _immediateCallback] (timers.js:523:5)
Hello all
We are starting to use apicache in dev and really like it.
I am wondering if anyone has done some memory-usage analysis in a production environment? I would be glad to see your results if you have.
Thanks for sharing!
Hello,
I think a server serving multiple virtual hosts should not cache requests from other virtual hosts.
Each host should have a separate cache.
What do you think ?
When the app restarts, the cache index is not rebuilt. That is a problem when using the redis cache, because without the index I cannot clear the cache and have to wait for items to expire in redis. Maybe the index should be stored in redis too? Also, if you scale to many application instances, the index should be shared between them.
res.send is an express method that calls res.write and res.end, which are node http methods. You should monkey-patch res.write so that your lib works without express.
You might also look into supporting chunked encoding; check out this implementation:
http://www.senchalabs.org/connect/compress.html
I'm using apicache to cache an expensive request to an upstream server that sometimes times out. However, when the upstream request does time out, I want to return an error response and not cache it, so that the next request retries the upstream operation.
It's not clear how to tell apicache from my route handler, "don't cache the request this time." I've tried adding apicache.clear(req.url);
in my handler when I return my error response, but that doesn't appear to work.
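If your apicache version supports the statusCodes option, including only 200 should keep error responses out of the cache entirely, so a failed upstream call is retried on the next request. A predicate of this shape could also gate whether a completed response gets written to the cache; cacheOnlySuccess is a hypothetical name.

```javascript
// Option-based sketch: only 200 responses enter the cache.
// apicache.options({ statusCodes: { include: [200] } });

// Equivalent predicate: decide cacheability from the final status
// code (e.g. the upstream-timeout error would not be cached).
function cacheOnlySuccess(res) {
  return res.statusCode >= 200 && res.statusCode < 300;
}
```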
Currently the fixed 0.0.5 version of memory-cache is used.
Sorry, I don't have time at all to make a PR.
src/apicache.js, line 166
var group = index.groupss[groupName]
should be
var group = index.groups[groupName]
That's a nice project btw, thanks for making it.
With Restify 4.3 I've encountered an issue with headers being dropped when a response is being returned from the cache. I've localised the issue to here:
function sendCachedResponse(response, cacheObject) {
Object.assign(response._headers || {}, cacheObject.headers || {}, {
'apicache-store': globalOptions.redisClient ? 'redis' : 'memory',
'apicache-version': pkg.version
})
// unstringify buffers
var data = cacheObject.data
if (data && data.type === 'Buffer') {
data = new Buffer(data.data)
}
response.writeHead(cacheObject.status || 200, response._headers)
return response.end(data, cacheObject.encoding)
}
When response._headers is null, the code falls back to {}, but because the result of Object.assign is not saved in a variable, it gets dropped silently. The reason we don't see this in Express is that it always adds the X-Powered-By: 'Express' header, so res._headers is never empty (although anyone who disables this feature will see the problem).
The solution is very simple and I'd be more than happy to raise a PR for this:
function sendCachedResponse(response, cacheObject) {
// Should the res object be mutated here rather than cloned as I've done?
const headers = Object.assign({}, response._headers || {}, cacheObject.headers || {}, {
'apicache-store': globalOptions.redisClient ? 'redis' : 'memory',
'apicache-version': pkg.version
})
// ...
response.writeHead(cacheObject.status || 200, headers)
// ...
}
Encoding issues with gzip on subsequent calls
Hi,
I have a Stream which is piped into the response which emits Buffers with a single String at the end.
I have added an else if block to handle chunks of type Buffer. Could you please add it to the package and release a new version?
Cheers
function accumulateContent(res, content) {
if (content) {
if (typeof(content) == 'string') {
res._apicache.content = (res._apicache.content || '') + content;
} else if (Buffer.isBuffer(content)) {
var oldContent = res._apicache.content
if (!oldContent) {
oldContent = Buffer.alloc(0);
}
res._apicache.content = Buffer.concat([oldContent, content], oldContent.length + content.length);
} else {
res._apicache.content = content
// res._apicache.cacheable = false;
}
}
}
There should be a method along the lines of apicache.gettimestamp(index).
test with non JSON/text formats
Hi! I have a question; maybe I didn't understand something.
const onlyStatus200 = (req, res) => res.statusCode === 200
const cacheSuccesses = cache('5 minutes', onlyStatus200)
app.get('/api/missing', cacheSuccesses, (req, res) => {
res.status(404).json({ results: 'will not be cached' })
})
this.middleware = function cache(strDuration, middlewareToggle) {
var duration = instance.getDuration(strDuration)
return function cache(req, res, next) {
function bypass() {
debug('bypass detected, skipping cache.')
return next()
}
// initial bypass chances
if (!globalOptions.enabled) return bypass()
if (req.headers['x-apicache-bypass'] || req.headers['x-apicache-force-fetch']) return bypass()
if (typeof middlewareToggle === 'function') {
if (!middlewareToggle(req, res)) return bypass()
} else if (middlewareToggle !== undefined && !middlewareToggle) {
return bypass()
}
So the onlyStatus200 toggle is called before any operations happen on the response; the status code is still the initial 200 at that point, so effectively responses of all statuses get cached.
The statusCodes: { include: [200] } option works as it should.
Thanks.
I find the two headers have the same effect. What's the purpose of these two different headers?
And at the same time, the x-apicache-force-fetch header makes the server block, because it does not call next().
Hi all.
I heard that there was a future plan to implement redis as the cache store. Is anyone working on this at the moment?
I am running a clustered express api and the cache needs to be common across all processes.
I will make this change and provide a PR if nobody else is going to soon.
I'll provide an optional parameter which is a redis client instance, so the redis client will be maintained outside of the module. Is this okay? Otherwise I could include the redis client inside apicache and simply pass options instead.
Let me know which way is preferred
Thanks
Nathan
Hey @kwhitley I was trying to use this library, but what I see it sets "Cache-Control" to chosen cache duration. I was intending to spare the server heavy database tasks and the cache is controlled by the server, meaning it could be cleared anytime. Setting "Cache-Control" with fixed max-age prevents some clients from hitting my endpoints, even when the cache was cleared by the server. Any workarounds? I believe Cache-Control shouldn't be set in this library or it should be an option.
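One possible workaround, not an apicache feature: intercept res.setHeader before the cache middleware runs, so the outgoing Cache-Control is always no-cache. The server-side cache still spares the database while clients revalidate on every request. stripMaxAge is a hypothetical name.

```javascript
// Workaround sketch: force Cache-Control to 'no-cache' on every
// response by wrapping setHeader, overriding the fixed max-age that
// the cache middleware would otherwise send.
function stripMaxAge(req, res, next) {
  const setHeader = res.setHeader.bind(res);
  res.setHeader = function (name, value) {
    if (String(name).toLowerCase() === 'cache-control') {
      value = 'no-cache';
    }
    return setHeader(name, value);
  };
  next();
}

// mount BEFORE the cache middleware:
// app.use(stripMaxAge); app.use(cache('5 minutes'));
```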