linagirl / leaky-bucket
A fast and efficient leaky bucket implementation
Home Page: https://www.npmjs.com/package/leaky-bucket
License: MIT License
I'm trying to use this package in a client-side build. I'm unable to do so because .mjs files are not supported (out of the box, at least). Renaming the files to .js works, and the import can be found, but the module also uses logd, which is Node.js-only since it uses fs.
I made a PR that uses rollup to compile both .js and .mjs builds and replaces logd with debug. Let me know what you think!
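A minimal rollup config along those lines might look as follows. This is only a sketch of the approach, not the actual contents of the PR; the input path and the use of @rollup/plugin-alias to swap logd for debug are my assumptions.

```javascript
// rollup.config.mjs — hypothetical sketch, not the actual PR
import alias from '@rollup/plugin-alias';

export default {
    input: 'src/LeakyBucket.mjs', // assumed entry point
    plugins: [
        // replace the node-only `logd` logger with the browser-friendly `debug`
        alias({ entries: [{ find: 'logd', replacement: 'debug' }] }),
    ],
    output: [
        { file: 'dist/leaky-bucket.cjs.js', format: 'cjs' }, // CommonJS build
        { file: 'dist/leaky-bucket.esm.mjs', format: 'es' },  // ES module build
    ],
};
```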
There's a typo in your docs ("bucket" is misspelled).
Hi,
I need to use leaky-bucket in an Express app to rate-limit an API, and I don't know how to do it.
Please help me.
Thanks!
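A self-contained sketch of rate-limiting middleware in the shape an Express app would use. MiniBucket here is a simplified stand-in with the same throttle(cost) signature the package exposes (as seen in the other issues on this page); in a real app you would `import LeakyBucket from 'leaky-bucket'` instead and keep the middleware unchanged.

```javascript
// MiniBucket: simplified stand-in for the package's LeakyBucket.
// throttle(cost) resolves immediately while under capacity, and
// otherwise waits until enough has leaked out of the bucket.
class MiniBucket {
    constructor({ capacity, interval }) {
        this.capacity = capacity;                      // max cost per interval
        this.leakPerMs = capacity / (interval * 1000); // leak rate per millisecond
        this.level = 0;                                // current fill level
        this.last = Date.now();
    }

    async throttle(cost = 1) {
        // leak: lower the level according to elapsed time
        const now = Date.now();
        this.level = Math.max(0, this.level - (now - this.last) * this.leakPerMs);
        this.last = now;

        this.level += cost;
        const excess = this.level - this.capacity;
        if (excess > 0) {
            // wait until the excess has leaked out
            await new Promise((resolve) => setTimeout(resolve, excess / this.leakPerMs));
        }
    }
}

// Express-style middleware: throttle every incoming request at cost 1.
const bucket = new MiniBucket({ capacity: 60, interval: 60 }); // 60 requests / 60 s

function rateLimit(req, res, next) {
    bucket.throttle(1)
        .then(() => next())
        .catch(() => res.status(429).send('Too Many Requests'));
}

// In a real app: app.use(rateLimit); app.get('/api/...', handler);
```

With the real package, requests that would have to wait longer than the configured timeout are rejected, which is where the 429 response comes from.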
On a Mac:
bash-3.2$ node -v
v8.9.4
bash-3.2$ npm -v
5.6.0
bash-3.2$ npm install leaky-bucket
npm WARN [email protected] No repository field.
npm WARN [email protected] No license field.
var LeakyBucket = require('leaky-bucket');
TypeError: path must be a string or Buffer
at Object.fs.statSync (fs.js:948:11)
at classConstructor.init (/Users/davidh/WebstormProjects/webClientSimulator/node_modules/ee-log/lib/Logger.js:59:15)
at new classConstructor (/Users/davidh/WebstormProjects/webClientSimulator/node_modules/ee-class/src/class.js:162:47)
at /Users/davidh/WebstormProjects/webClientSimulator/node_modules/ee-log/lib/Logger.js:52:12
If the capacity is 100 and the interval is 60 seconds and a request has a cost of 1, every 60 seconds 1000 requests may be processed. If the request cost is 4, just 25 requests may be processed every 60 seconds.
Did that first sentence mean to say that every 60 seconds 100 requests may be processed? And if the README is correct, why is it only 25 if the cost becomes 4?
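For what it's worth, the arithmetic only works out if the capacity figure is 100: the number of requests per interval is the capacity divided by the per-request cost, which gives 100 at cost 1 and 25 at cost 4, so the "1000" reads like a typo.

```javascript
// capacity / cost = requests per interval; this matches "25 at cost 4"
// only if the capacity is 100, which suggests "1000" was a typo.
const capacity = 100; // per 60-second interval
const perInterval = (cost) => capacity / cost;

console.log(perInterval(1)); // 100 requests per 60 s
console.log(perInterval(4)); // 25 requests per 60 s
```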
Hi, I have such config:
new LeakyBucket({
capacity: 1000,
interval: 60,
timeout: 300,
})
Version of the leaky-bucket package I use: v3.0.4
The remote API server reports the cost used on each request. Each of my requests has a cost of 50 points (await bucket.throttle(50)). For some reason the first wave of my requests breaks the leaky bucket: the remote API reports 1500 points used and I get banned. If the service doesn't ban me, the bucket then starts working properly and the points reported by the remote API stay below capacity for the rest of the time.
I'm not sure how to write a test for this case, but I found a workaround: right after creating the LeakyBucket, I pay the full capacity in points and it starts to work as expected:
const capacity = 1000;
const bucket = new LeakyBucket({
capacity,
interval: 60,
timeout: 300,
});
bucket.pay(capacity); // if you remove this line the remote api service will ban you
(async () => {
for (let i = 0; i < 5000; ++i) {
await bucket.throttle(50);
const response = await fetch(...)
}
})();
P.S. sorry for my English
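One possible explanation (my assumption, not confirmed by the maintainer): a freshly created bucket is empty, so roughly a full capacity's worth of cost passes through throttle() immediately as a burst, and together with what leaks back out during the first interval this can overrun the remote server's own rate window. Pre-paying the capacity removes that initial burst. The burst size for the config above:

```javascript
// Assumption: an empty bucket admits up to `capacity` cost with no delay.
const capacity = 1000;
const cost = 50; // points per request

const burstRequests = Math.floor(capacity / cost);
console.log(burstRequests);        // 20 requests back to back
console.log(burstRequests * cost); // 1000 points before throttling kicks in
```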