Comments (5)
The cache here is extremely simple; it's really just memoization. I think adding a secondary key to the cache would add quite a bit of complexity.
At Facebook we actually have a two-phase load: vanity => userid first, then userid => user object. There are two ways I can think of to do this:
// Follow-up in user code:
var vanityLoader = new DataLoader(
  vanities => promiseVanityToUserIDs(vanities)
);
var userLoader = new DataLoader(
  ids => promiseUsersByID(ids)
);
var ide = await vanityLoader.load('ide').then(id => userLoader.load(id));
// Or follow-up in loader code:
var userLoader = new DataLoader(
  ids => promiseUsersByID(ids)
);
var vanityLoader = new DataLoader(
  vanities => promiseVanityToUserIDs(vanities).then(
    ids => userLoader.loadMany(ids)
  )
);
var ide = await vanityLoader.load('ide');
I wonder what the simplest possible approach to supporting a two-way cache would be. Maybe a cache population method? Or just providing a custom cache backend?
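For the custom-backend route, DataLoader's constructor accepts a `cacheMap` option: any object exposing Map-style `get`, `set`, `delete`, and `clear`. A minimal LRU sketch of such a backend (the class name and capacity are illustrative, not part of the library):

```javascript
// Minimal Map-backed LRU satisfying DataLoader's cacheMap interface
// (get/set/delete/clear). Illustrative sketch only; note that set()
// on an existing key does not refresh recency, which is fine here
// because DataLoader only sets each key once.
class LRUCacheMap {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    // Re-insert to mark this key as most recently used.
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Map preserves insertion order, so the first key is least recent.
      this.map.delete(this.map.keys().next().value);
    }
  }
  delete(key) { this.map.delete(key); }
  clear() { this.map.clear(); }
}
```

Usage would look like `new DataLoader(batchFn, { cacheMap: new LRUCacheMap(1000) })`.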
What if you could write loader.prime(key, value)?
var userLoader = new DataLoader(
  ids => promiseUsersByID(ids).then(users => {
    users.forEach(user => vanityLoader.prime(user.vanity, user));
    return users;
  })
);
var vanityLoader = new DataLoader(
  vanities => promiseUsersByVanity(vanities).then(users => {
    users.forEach(user => userLoader.prime(user.id, user));
    return users;
  })
);

// Possible race condition? What should happen?
var [ ide1, ide2 ] = await Promise.all([
  vanityLoader.load('ide'),
  userLoader.load(589436611)
]);
console.log(ide1 === ide2);
@leebyron thanks - I wrote up something like the example in your first comment last night and it's working fine. In my case it felt a little silly to have a second loader, because the second key (e.g. vanity) lives on the user object, so both loaders were reading from the same table instead of issuing just one query.
loader.prime is an interesting idea. To address the race condition, I think the DataLoader options would need to include some kind of conflict-resolution policy: first write wins, last write wins, the database always wins over prime, or an arbitrary function.
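A "last write wins" policy can be sketched without new options, using the chainable clear and prime methods (prime keeps the first value it sees for a key, so forcing an overwrite means clearing the entry first; the helper name is hypothetical):

```javascript
// Hypothetical helper: "last write wins" priming. DataLoader's own
// prime() is first-write-wins for a given key, so an overwrite
// requires clearing the cached entry before priming the new value.
function primeLastWriteWins(loader, key, value) {
  return loader.clear(key).prime(key, value);
}
```

The default first-write-wins behavior is then just a plain `loader.prime(key, value)`.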
The thing I like about the custom cache backend is that you can add LRU policies and other behavior. One idea I was toying with is to keep per-user DataLoaders across requests for a few minutes so being able to control memory usage is important (this idea only works if you have a small number of servers or session stickiness though).
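The per-user, short-lived loader idea above could be sketched as a registry keyed by user with an expiry, so batching and caching span several requests from the same user (the TTL and the `createLoadersForUser` factory are hypothetical names for illustration):

```javascript
// Keep one loader set per user for a bounded time. Entries are
// lazily rebuilt once expired; a real deployment would also want
// periodic sweeping to bound memory.
const TTL_MS = 5 * 60 * 1000;     // illustrative: five minutes
const loadersByUser = new Map();  // userId -> { loaders, expiresAt }

function getLoaders(userId, createLoadersForUser, now = Date.now()) {
  const entry = loadersByUser.get(userId);
  if (entry && entry.expiresAt > now) return entry.loaders;
  const loaders = createLoadersForUser(userId); // hypothetical factory
  loadersByUser.set(userId, { loaders, expiresAt: now + TTL_MS });
  return loaders;
}
```

As noted above, this only pays off with a small number of servers or session stickiness, since each server keeps its own registry.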
Still leaning towards loader.prime since it's simple and can be called from anywhere, but I probably need to understand more use cases.
The thing I like about the custom cache backend is that you can add LRU policies and other behavior.
Most definitely. This occurred to me recently as well. A memoization cache is only acceptable for short-lived caches; you would definitely need something like an LRU cache for a longer-lived loader.
This is now possible in v1.2.0.
The race condition discussed here is handled more simply and acceptably by making it a race between the synchronous calls to load and prime, rather than making a call to prime race with the async resolution of load.
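The "first synchronous call wins" semantics can be illustrated with a toy model of the cache (not the library's actual implementation): whichever of load or prime touches a key first claims it, and a later prime for that key is a no-op.

```javascript
// Toy model of the cache semantics: the first synchronous writer
// for a key wins, whether that writer is load() or prime().
class ToyCache {
  constructor() { this.map = new Map(); }
  load(key, batchFetch) {
    // Caches the promise itself, so the claim happens synchronously,
    // before the underlying fetch has resolved.
    if (!this.map.has(key)) this.map.set(key, batchFetch(key));
    return this.map.get(key);
  }
  prime(key, value) {
    if (!this.map.has(key)) this.map.set(key, Promise.resolve(value));
    return this;
  }
}
```

There is no window in which a slow fetch can be overwritten by prime, because the race is decided at call time, not at resolution time.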