Comments (6)
Yes, this was considered, but decided against since it would require a dual cache: caching both the promise for a value and the value itself. Since DataLoader's primary target usage is batching Promise-based APIs, this would be unnecessary complication and overhead.
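For context, the dual cache being described might look something like this minimal sketch. Everything here (`DualCacheLoader`, `getSync`, the single-key batching) is illustrative, not DataLoader's actual implementation or API:

```ts
// Hypothetical dual-cache loader: stores both the in-flight promise and,
// once settled, the resolved value, so callers can read cached values
// synchronously. This is NOT part of DataLoader; it just illustrates the
// extra bookkeeping the maintainer is describing.
class DualCacheLoader<K, V> {
  private promiseCache = new Map<K, Promise<V>>();
  private valueCache = new Map<K, V>(); // the second cache: settled values

  constructor(
    private batchFn: (keys: ReadonlyArray<K>) => Promise<Array<V>>
  ) {}

  load(key: K): Promise<V> {
    const cached = this.promiseCache.get(key);
    if (cached) return cached;
    // A real implementation would coalesce keys into batches; fetching one
    // key at a time keeps the sketch short.
    const promise = this.batchFn([key]).then((values) => {
      const value = values[0] as V;
      this.valueCache.set(key, value); // keep the two caches in sync
      return value;
    });
    this.promiseCache.set(key, promise);
    return promise;
  }

  // The synchronous accessor a dual cache would enable.
  getSync(key: K): V | undefined {
    return this.valueCache.get(key);
  }
}
```

Every `load` now pays for two map writes plus a `.then` hook, and cache invalidation has to clear both maps consistently, which is the overhead and complication the comment refers to.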
from dataloader.
Ok yeah, I definitely understand this choice! I used to write some loader libs with both paradigms too (some exposing a sync `get`, some avoiding it because of the overhead you mention).
What I experienced, though, is that Promise-only APIs tend to force you to always stay in the Promise paradigm, and sometimes that's not easy, especially when you want to build other "core" libraries on top of it. With a Promise-only API you can miss some of the control you need (any time you want to check things synchronously, basically).
In the React example shown above, if you absolutely want to avoid the `<div>loading...</div>` step, I guess you could say that the component has to be "dumber" and its data prop is the already resolved/loaded value. That means you would have to push the responsibility for loading up the tree, so the parent has to manage the loading state for you. One problem is that this breaks the modularity/isolation idea a bit (the parent needs to know more than just "I give you an id prop"), but maybe that's ok in some use cases.
I guess the ultimate solution then is to build a decorator that does this for you, a bit like Relay's createContainer. It works, but it's more opinionated and you have to build your tree using it (like in Relay, where you have to make a "Root Container" and propagate relay everywhere).
I remember one use case I had: building a "super lib" on top of the micro-loader-lib that exposes an `isReady(): boolean` predicate function, which is pretty useful to call from a React `render()`.
I can't do that (easily) if the micro-loader-lib doesn't also expose a way to check synchronously; it would only be able to expose something like `ready(): Promise`.
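A minimal sketch of that `isReady()` idea, assuming the loader does its own settlement bookkeeping. The `SuperLoader` name and its methods are illustrative, not DataLoader or any real micro-loader-lib API:

```ts
// Hypothetical loader wrapper that tracks which keys have settled, so a
// React render() can call isReady() synchronously instead of awaiting a
// Promise. Illustrative only; not DataLoader API.
class SuperLoader<K, V> {
  private settled = new Set<K>();
  private promises = new Map<K, Promise<V>>();

  constructor(private fetchFn: (key: K) => Promise<V>) {}

  load(key: K): Promise<V> {
    let promise = this.promises.get(key);
    if (!promise) {
      promise = this.fetchFn(key).then((value) => {
        this.settled.add(key); // record settlement for sync checks
        return value;
      });
      this.promises.set(key, promise);
    }
    return promise;
  }

  // Synchronous predicate, safe to call from render().
  isReady(key: K): boolean {
    return this.settled.has(key);
  }
}
```

The point of the sketch is that `isReady` is only possible because the wrapper records settlement itself; if the underlying loader exposes nothing but a `Promise`, there is no synchronous state to read.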
I guess this concern is now fixed with the upcoming createFetcher :) Can't wait! https://youtu.be/D-h3bhzauKo?t=1h16m40s
A way to synchronously check whether data is loaded for a particular ID is really important when testing a function and then asserting afterwards that it primed data loaders with the right data for specific IDs. In my case, I am testing that a GraphQL mutation didn't just mutate data correctly in the DB, but that it also primed data loaders correctly with the relevant data.
Right now we're doing these complicated workarounds:
```ts
import type { CacheMap } from "dataloader";
import DataLoader from "dataloader";

/**
 * Gets a {@linkcode DataLoader} instance cache map. Useful for asserting data
 * loader behavior in tests.
 */
export default function dataLoaderGetCacheMap<K, V>(
  dataLoader: DataLoader<K, V>
) {
  if (!(dataLoader instanceof DataLoader))
    throw new TypeError(
      "Argument 1 `dataLoader` must be a `DataLoader` instance."
    );

  const cacheMap =
    // Read a private instance property as there is no public API for this.
    // eslint-disable-next-line @typescript-eslint/no-unsafe-member-access, @typescript-eslint/no-explicit-any
    (dataLoader as any)._cacheMap as CacheMap<K, Promise<V>> | null;

  if (!cacheMap) throw new TypeError("Cache map missing.");

  return cacheMap;
}
```
Then in tests, it's used like this:
```ts
const dataLoaderCacheMap = dataLoaderGetCacheMap(theDataLoaderInstance);

deepEqual(await dataLoaderCacheMap.get(theIdToCheck), {
  // Whatever…
});
```
The problem is, because the `.get` call returns a promise, the test can't tell the difference between data that is currently loading and data that was correctly primed and is already cached, ready to go.
Complicated workaround to the above dilemma:
```ts
import DataLoader from "dataloader";
import dataLoaderGetCacheMap from "./dataLoaderGetCacheMap.mjs";

/**
 * Gets a {@linkcode DataLoader} instance cached value by key, without causing
 * loading.
 * @see https://github.com/graphql/dataloader/issues/28#issuecomment-1459979582
 * @returns Resolves the data loader value for the specified key if it’s cached,
 *   or else rejects with an error that the entry is missing or loading.
 */
export default async function dataLoaderGetValue<K, V>(
  dataLoader: DataLoader<K, V>,
  key: K
): Promise<V | void> {
  if (!(dataLoader instanceof DataLoader))
    throw new TypeError(
      "Argument 1 `dataLoader` must be a `DataLoader` instance."
    );

  const dataLoaderCacheMap = dataLoaderGetCacheMap(dataLoader);
  const value = dataLoaderCacheMap.get(key);

  if (!(value instanceof Promise))
    throw new TypeError(
      `Data loader key \`${String(key)}\` value isn’t cached.`
    );

  return await Promise.race([
    // This must be first in the array because when multiple racing promises
    // are settled, the earliest one in the input array wins.
    value,
    Promise.reject(
      new TypeError(`Data loader key \`${String(key)}\` value is loading.`)
    ),
  ]);
}
```
You use it like this in tests:
```ts
deepEqual(await dataLoaderGetValue(theDataLoaderInstance, theIdToCheck), {
  // Whatever…
});
```
It's not sync though, and all of this complexity is something that should really be handled by the `dataloader` package.
from dataloader.