Toad Cache

Least-Recently-Used and First-In-First-Out caches for Client or Server.

Getting started

import { Lru, Fifo } from 'toad-cache'
const lruCache = new Lru(max, ttl = 0)
const fifoCache = new Fifo(max, ttl = 0)

clear

Method

Clears the contents of the cache

Example

cache.clear()

delete

Method

Removes item from cache

param  {String} key Item key

Example

cache.delete('myKey')

deleteMany

Method

Removes items from cache

param  {String[]} keys Item keys

Example

cache.deleteMany(['myKey', 'myKey2'])

evict

Method

Evicts the least recently used item from cache

Example

cache.evict()

expiresAt

Method

Gets expiration time for cached item

param  {String} key Item key
return {Mixed}      Undefined or number (epoch time)

Example

const item = cache.expiresAt('myKey')

first

Property

Item in "first" or "bottom" position

Example

const cache = new Lru()

cache.first // null - it's a new cache!

get

Method

Gets cached item and marks it as recently used (moves it to the back of the eviction-candidate list)

param  {String} key Item key
return {Mixed}      Undefined or Item value

Example

const item = cache.get('myKey')
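To make the recency behavior concrete, here is a minimal, self-contained sketch (not toad-cache itself) of why get() re-marks an item as recently used: the refreshed item escapes the next eviction. It uses a plain Map, whose insertion order tracks recency.

```javascript
class TinyLru {
  constructor(max) {
    this.max = max
    this.items = new Map()
  }

  get(key) {
    if (!this.items.has(key)) return undefined
    const value = this.items.get(key)
    // Re-insert to move the key to the "most recently used" end
    this.items.delete(key)
    this.items.set(key, value)
    return value
  }

  set(key, value) {
    if (this.items.has(key)) this.items.delete(key)
    this.items.set(key, value)
    if (this.items.size > this.max) {
      // Evict the least recently used key (first in insertion order)
      this.items.delete(this.items.keys().next().value)
    }
  }
}

const cache = new TinyLru(2)
cache.set('a', 1)
cache.set('b', 2)
cache.get('a') // 'a' is now the most recently used
cache.set('c', 3) // 'b', not 'a', gets evicted
console.log([...cache.items.keys()]) // [ 'a', 'c' ]
```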

getMany

Method

Gets multiple cached items and marks them as recently used (moves them to the back of the eviction-candidate list)

param  {String[]} keys Item keys
return {Mixed[]}      Undefined or Item values

Example

const items = cache.getMany(['myKey', 'myKey2'])

keys

Method

Returns an Array of cache item keys.

return {Array} Array of keys

Example

console.log(cache.keys())

max

Property

Max items to hold in cache (default: 1000)

Example

const cache = new Lru(500)

cache.max // 500

last

Property

Item in "last" or "top" position

Example

const cache = new Lru()

cache.last // null - it's a new cache!

set

Method

Sets item in cache as first

param  {String} key   Item key
param  {Mixed}  value Item value

Example

cache.set('myKey', { prop: true })

size

Property

Number of items in cache

Example

const cache = new Lru()

cache.size // 0 - it's a new cache!

ttl

Property

Milliseconds an item will remain in cache; lazy expiration upon next get() of an item

Example

const cache = new Lru()

cache.ttl = 3e4
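The following is a minimal, self-contained sketch (an assumed illustration, not the library's internals) of what lazy expiration means: an expiry timestamp is stored on write, and is only checked, and the stale entry only removed, on the next read.

```javascript
class LazyTtlCache {
  constructor(ttl = 0) {
    this.ttl = ttl // milliseconds; 0 means "never expires"
    this.items = new Map()
  }

  set(key, value) {
    const expiry = this.ttl > 0 ? Date.now() + this.ttl : 0
    this.items.set(key, { value, expiry })
  }

  get(key) {
    const item = this.items.get(key)
    if (item === undefined) return undefined
    if (item.expiry > 0 && Date.now() > item.expiry) {
      this.items.delete(key) // expired entries are removed lazily, on read
      return undefined
    }
    return item.value
  }
}

const cache = new LazyTtlCache(3e4) // same effect as cache.ttl = 3e4
cache.set('myKey', 'value')
console.log(cache.get('myKey')) // 'value' (still fresh)
```

Note that with this scheme an expired entry keeps occupying a slot until it is read (or evicted for capacity reasons), which is the trade-off lazy expiration makes to avoid timers.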

Hit/miss/expiration tracking

In case you want to gather information on cache hit/miss/expiration ratio, as well as cache size and eviction statistics, you can use LruHitStatistics class:

const sharedRecord = new HitStatisticsRecord() // to share a single record object across caches, create it manually and pass it to each cache

const cache = new LruHitStatistics({
  cacheId: 'some-cache-id',
  globalStatisticsRecord: sharedRecord,
  statisticTtlInHours: 24, // how often to reset statistics; on every rotation, previously accumulated data is removed
  max: 1000,
  ttlInMsecs: 0,
})

You can retrieve accumulated statistics from the cache, or from the record directly:

// this is the same
const statistics = sharedRecord.getStatistics()
const alsoStatistics = cache.getStatistics()

/*
{
  'some-cache-id': {
    '2023-04-06': {
      cacheSize: 100, // how many elements does cache currently have
      evictions: 5, // how many elements were evicted due to cache being at max capacity    
      expirations: 0, // how many elements were removed during get due to their ttl being exceeded
      hits: 0, // how many times element was successfully retrieved from cache during get
      emptyHits: 0, // out of all hits, how many were null, undefined or ''?
      falsyHits: 0, // out of all hits, how many were falsy?      
      misses: 1, // how many times element was not in cache or expired during get
      invalidateOne: 1, // how many times element was invalidated individually
      invalidateAll: 2, // how many times entire cache was invalidated
      sets: 0, // how many times new element was added      
    },
  },
}

Note that the date here reflects the start of the rotation. If statistics haven't been rotated yet and a new day has started, entries are still counted against the day on which the rotation started
*/

License

Copyright (c) 2023 Igor Savin

Based on tiny-lru, created by Jason Mulligan

Licensed under the MIT license.

toad-cache's Issues

make Map the default (round 2)

I finally cracked the code after digging through more benchmarks and stumbling upon a comment from a V8 developer

TLDR is that if we use ordered keys as we do in the benchmarks, V8 will highly optimize property access. To illustrate, if we just force the keys to be non-numeric, with something like x + '_', the picture becomes much more interesting :)

Objection

Benchmarks

Let's modify the execute function to make our keys non-numeric; the following results surface for cache-get-inmemory/:

{ cpu: { brand: 'M1', speed: '2.40 GHz' } }
| Node   | Option             | Msecs/op       | Ops/sec  | V8                    |
| ------ | ------------------ | -------------- | -------- | --------------------- |
| 20.9.0 | toad-cache-lru-map | 0.434283 msecs | 2302.644 | V8 11.3.244.8-node.16 |
| 20.9.0 | toad-cache-lru     | 0.601765 msecs | 1661.777 | V8 11.3.244.8-node.16 |

Here's the execute functions of toad-cache-lru-map.js and toad-cache-lru-object.js:

async function execute() {
  for (var x = 0; x < ELEMENT_COUNT; x++) {
    const value = cache.get(x + 'a') // became x + 'a'
    if (!value) {
      const newValue = {
        id: x
      }

      cache.set(x + 'a', newValue) // became x + 'a'
    }
  }
}

Then I realized another potential issue with the default parameters: MAX_ITEMS was higher than ELEMENT_COUNT

Ideally, MAX_ITEMS should be less than ELEMENT_COUNT so that we can have deletions as well to get a more complete picture of performance

Let's modify common.js:

const { validateEqual } = require("validation-utils");

const MAX_ITEMS = 5000 // became 5000 from 10000
const TTL = 1000 * 60 * 60
const ELEMENT_COUNT = 8500 // became 8500 from 5000

async function validateAccuracy(getFn) {
  const value = await getFn('123')
  validateEqual(value.id, '123')
}

module.exports = {
  ELEMENT_COUNT,
  MAX_ITEMS,
  TTL,
  validateAccuracy,
};

Here are the results of a rerun:

{ cpu: { brand: 'M1', speed: '2.40 GHz' } }
| Node   | Option             | Msecs/op       | Ops/sec | V8                    |
| ------ | ------------------ | -------------- | ------- | --------------------- |
| 20.9.0 | toad-cache-lru-map | 1.387450 msecs | 720.747 | V8 11.3.244.8-node.16 |
| 20.9.0 | toad-cache-lru     | 2.053608 msecs | 486.948 | V8 11.3.244.8-node.16 |

I argue that a generic LRU will not have number-based keys, especially not the very special case of using consecutive integers as keys — if that was the case, we'd use an array — and I propose once again to make the default export Map-based

What do you think?

If you do not agree, it's fine! But I feel like repos such as fastify-rate-limit would greatly benefit from the change

Cannot run benchmark

I wanted to try some perf ideas out for this package, and want to use the provided benchmark to measure results. However, the benchmark is expecting a dependency that is not available:

~/P/toad-cache on main ◦ npm run benchmark                                                                                                                                                                    21:01:26

> [email protected] benchmark
> node benchmark.js

node:internal/errors:496
    ErrorCaptureStackTrace(err);
    ^

Error [ERR_MODULE_NOT_FOUND]: Cannot find module '/Users/matt/Projects/toad-cache/dist/tiny-lru.esm.js' imported from /Users/matt/Projects/toad-cache/benchmark.js
    at new NodeError (node:internal/errors:405:5)
    at finalizeResolution (node:internal/modules/esm/resolve:225:11)
    at moduleResolve (node:internal/modules/esm/resolve:837:10)
    at defaultResolve (node:internal/modules/esm/resolve:1035:11)
    at DefaultModuleLoader.resolve (node:internal/modules/esm/loader:251:12)
    at DefaultModuleLoader.getModuleJob (node:internal/modules/esm/loader:140:32)
    at ModuleWrap.<anonymous> (node:internal/modules/esm/module_job:76:33)
    at link (node:internal/modules/esm/module_job:75:36) {
  code: 'ERR_MODULE_NOT_FOUND'
}

Node.js v20.4.0

make Map the default

Hey, happy user here

I'm interested in centralizing all Lru and Fifo packages to yours within the Fastify ecosystem

Just wanted to ask if you would be interested in making the Map versions of the caches the default named exports (Lru and Fifo)

In recent versions of V8, Map essentially always outperforms Object when it comes to read and write speeds

Now, after seeing the perf difference, you might say, "This is the difference? So what?", but let me point out a property of Objects that makes them unsuitable for generic Map functionality — you probably know the others but I'll leave this MDN link as well

Even if you have a null prototype, weird stuff can be done:

let lru = new Lru()
lru.set('toString', () => "Would throw but no longer")
`${lru.items}` // Output: Would throw but no longer

lru.set(Symbol.iterator, 'Breaks `this.items` iterator')

I know that these aren't even possible in most cases, but they still make it clear that Map is simply more consistent at being... a Map!

In my opinion, objects are only superior when their keys are known at creation time
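To make the pitfall concrete, here is a small self-contained illustration (an assumed example, separate from the snippet above): string keys on a plain object can shadow prototype members, while Map keys live in their own key space and never collide with the Map's own methods or tags.

```javascript
const items = {} // an Object-backed cache store
items['toString'] = () => 'clobbered'
console.log(`${items}`) // 'clobbered' — the template literal calls the shadowed toString

const map = new Map()
map.set('toString', () => 'clobbered')
console.log(`${map}`) // '[object Map]' — the Map itself is unaffected
console.log(map.size) // 1
```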

Benchmark (node ^20)

BENCHMARK  100
Object write took 0.060791969299316406
Object read took 0.1952500343322754
Map write took 0.01816701889038086
Map read took 0.01595902442932129
BENCHMARK  1000
Object write took 0.3558340072631836
Object read took 0.10799980163574219
Map write took 0.10654187202453613
Map read took 0.10749983787536621
BENCHMARK  10000
Object write took 3.391292095184326
Object read took 1.8417911529541016
Map write took 1.2463748455047607
Map read took 1.3308749198913574
BENCHMARK  1000000
Object write took 471.0659999847412
Object read took 242.29637503623962
Map write took 291.88587498664856
Map read took 224.64337515830994

Code

function benchmark(TIMES) {
  console.log("BENCHMARK ", TIMES);

  const object = Object.create(null);

  let start = performance.now();
  for (let i = 0; i < TIMES; ++i) {
    object[`key_${i}`] = 1;
  }

  console.log("Object write took", performance.now() - start);
  start = performance.now();

  let result = 0;
  for (let i = 0; i < TIMES; ++i) {
    result += object[`key_${i}`];
  }

  console.log("Object read took", performance.now() - start);
  start = performance.now();

  const map = new Map();
  for (let i = 0; i < TIMES; ++i) {
    map.set(`key_${i}`, 1);
  }

  console.log("Map write took", performance.now() - start);
  start = performance.now();

  result = 0;
  for (let i = 0; i < TIMES; ++i) {
    result += map.get(`key_${i}`);
  }

  console.log("Map read took", performance.now() - start);
}

benchmark(100);
benchmark(1_000);
benchmark(10_000);
benchmark(1_000_000);

Bug when calling `get` on only entry in `LruObject`

Hey folks,
first of all thanks for building and maintaining this library ❤️


I stumbled across this issue when investigating failures coming from fastify-rate-limit. I created an issue there because that's where I noticed the impact first.

However, based on my current understanding, there is an issue in the LruObject implementation, more specifically in the get implementation. Here is a simple test case that I wrote:

  describe('get', () => {
    it('does not overwrite cache.first', async () => {
      cache = new LruObject(5, 500)

      const key = '10.0.0.1'
      const value = 100

      cache.set(key, value)
      expect(cache.first).not.toBeNull()

      cache.get(key)
      expect(cache.first).not.toBeNull()
    })
  })
 FAIL  test/LruObject.spec.js > LruObject > get > does not overwrite cache.first
AssertionError: expected null not to be null
 ❯ test/LruObject.spec.js:77:31
     75| 
     76|       cache.get(key)
     77|       expect(cache.first).not.toBeNull()
       |                               ^
     78|     })
     79| 

Can you confirm that this is indeed not intended?
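For context, here is a minimal, self-contained sketch of the class of bug described above. This is not the actual toad-cache source; it is an assumed reconstruction showing how a get() that detaches an item from the front of the linked list, without guarding the case where the item is both `first` and `last`, can leave `first` pointing at null.

```javascript
class BuggyLru {
  constructor() {
    this.first = null
    this.last = null
    this.items = Object.create(null)
  }

  set(key, value) {
    const item = { key, value, next: null, prev: this.last }
    if (this.last !== null) this.last.next = item
    if (this.first === null) this.first = item
    this.last = item
    this.items[key] = item
  }

  get(key) {
    const item = this.items[key]
    if (item === undefined) return undefined
    // Buggy "mark as recently used": detaches the item from the front
    // without guarding the single-entry case (item === first === last)
    if (item === this.first) this.first = item.next // next is null here!
    return item.value
  }
}

const cache = new BuggyLru()
cache.set('10.0.0.1', 100)
cache.get('10.0.0.1')
console.log(cache.first) // null, reproducing the failing assertion above
```

A correct implementation would skip the relink entirely when the item is already the last (most recently used) entry.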
