Comments (10)

niks3089 commented on August 26, 2024

Been testing for a couple of days and it's working great. Feel free to close the ticket. Thanks for the quick turnaround.

udoprog commented on August 26, 2024

So something like this?

if !limiter.try_acquire(100) {
    return Err(..);
}

How would you like it to behave if there are tasks waiting for tokens?

For a fair scheduler (the default), I believe the "correct" behavior under that scenario should be for try_acquire to return false, since there are tasks ahead which take priority. Does this sound right to you?
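
Roughly, from the caller's side, this is the behaviour I'm picturing (a sketch only; the builder values are placeholders and fair scheduling is assumed to be the default):

    use std::sync::Arc;
    use std::time::Duration;

    use leaky_bucket::RateLimiter;

    #[tokio::main]
    async fn main() {
        // Placeholder configuration; the point is the fairness behaviour.
        let limiter = Arc::new(
            RateLimiter::builder()
                .initial(100)
                .refill(10)
                .interval(Duration::from_millis(100))
                .max(1000)
                .build(),
        );

        // Park a task in acquire(), waiting for more tokens than are available.
        let waiter = limiter.clone();
        tokio::spawn(async move {
            waiter.acquire(500).await;
        });
        tokio::time::sleep(Duration::from_millis(10)).await;

        // Under fair scheduling, the expectation described above is that this
        // returns false even though 100 tokens are in the bucket, because the
        // parked task is ahead in the queue.
        println!("try_acquire(50) -> {}", limiter.try_acquire(50));
    }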

niks3089 commented on August 26, 2024

Yes, returning false also works. As long as the task knows it can't get the tokens and can try again later, it should do the trick.
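
Something like this sketch is what I mean (assuming try_acquire takes a token count and returns a bool, as discussed above, and a tokio runtime):

    use std::time::Duration;

    use leaky_bucket::RateLimiter;

    // Poll try_acquire and back off instead of parking in acquire().
    async fn acquire_with_retry(limiter: &RateLimiter, tokens: usize) {
        while !limiter.try_acquire(tokens) {
            // Not enough tokens right now (or other tasks are ahead);
            // back off and try again later.
            tokio::time::sleep(Duration::from_millis(50)).await;
        }
    }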

udoprog commented on August 26, 2024

There is an implementation in git now if you want to give it a try and report back.

niks3089 commented on August 26, 2024

Thanks, will give it a test. I see that try_acquire returns true if there are enough tokens. Can we change that to actually acquire the tokens? With the current try_acquire, it's going to be like this:

if !limiter.try_acquire(100) {
    return Err(..);
}
limiter.acquire(100).await;

Instead, can we do the above in a single try_acquire call?

udoprog commented on August 26, 2024

Ah. That's actually how try_acquire works; the documentation was a bit fuzzy, so I fixed that.
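
So the earlier two-step pattern collapses into a single call. A sketch of the intended semantics (RateLimited is just a placeholder error type for the example):

    use leaky_bucket::RateLimiter;

    // Placeholder error type for the example.
    struct RateLimited;

    // try_acquire both checks and deducts the tokens in one call, so no
    // follow-up acquire() is needed on the success path.
    fn handle_request(limiter: &RateLimiter) -> Result<(), RateLimited> {
        if !limiter.try_acquire(100) {
            return Err(RateLimited);
        }

        // The 100 tokens have already been deducted here; handle the request.
        Ok(())
    }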

niks3089 commented on August 26, 2024

Awesome, thanks. I will give it a test over the weekend and update.

niks3089 commented on August 26, 2024

Sorry, didn't get time to test this. Will test and update ASAP this week.

niks3089 commented on August 26, 2024

@udoprog, I am seeing this behaviour with the following code:

        let limiter = LeakyRateLimiter::builder()
            .initial(group.global_max_burst.get() as usize)
            .refill(group.request_per_second.get() as usize)
            .interval(Duration::from_millis(1000))
            .max(group.global_max_burst.get() as usize)
            .build();

The rps I am setting is 500, with the max burst being 1500. When I call limiter.try_acquire with, let's say, a large number like 10000, I get false, but after some time, when I test again, I get true. It seems like the bucket keeps accumulating tokens beyond max, which looks like a bug. I would assume any request beyond max should always return false.
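
Roughly what my test does (a sketch; RateLimiter here stands in for the LeakyRateLimiter alias above, and a tokio runtime is assumed):

    use std::time::Duration;

    use leaky_bucket::RateLimiter;

    #[tokio::main]
    async fn main() {
        // Same shape as the config above: 500 tokens per second, max burst 1500.
        let limiter = RateLimiter::builder()
            .initial(1500)
            .refill(500)
            .interval(Duration::from_millis(1000))
            .max(1500)
            .build();

        // Asking for more than max should always fail.
        assert!(!limiter.try_acquire(10_000));

        // But after waiting a while, the same call starts returning true,
        // which suggests the balance keeps growing past max.
        tokio::time::sleep(Duration::from_secs(30)).await;
        assert!(!limiter.try_acquire(10_000)); // expected to stay false; trips with the bug
    }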

udoprog commented on August 26, 2024

Ah, yeah. It seems I reproduced an existing issue that I've already fixed in the async version when implementing try_acquire.

This part here should actually be doing this:

state.balance = state.balance.saturating_add(tokens).min(self.max);

Thanks for testing it out!

I can't fix it right now, but if you want to do a PR, feel free!
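
For illustration, a standalone sketch of what that clamp does (not the crate's actual internals):

    // The refilled balance saturates at `max` instead of growing without bound.
    fn refill(balance: usize, tokens: usize, max: usize) -> usize {
        balance.saturating_add(tokens).min(max)
    }

    fn main() {
        assert_eq!(refill(1400, 500, 1500), 1500); // clamped at max
        assert_eq!(refill(200, 500, 1500), 700); // normal refill
    }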
