
cached's Introduction







Cached

Simple Dart package with built-in code generation. It simplifies and speeds up the creation of caching mechanisms for Dart classes.

Least Recently Used (LRU) cache algorithm

It is a finite key-value map using the Least Recently Used (LRU) algorithm, where the most recently-used items are "kept alive" while older, less-recently used items are evicted to make room for newer items.

Useful when you want to limit memory use to hold only commonly used items, or to cache some API calls.
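
For intuition only, here is a minimal Dart sketch of an LRU map. This is not the package's internal implementation, just an illustration of the eviction rule described above:

import 'dart:collection';

/// Illustrative LRU map: evicts the least recently used entry
/// once the size limit is exceeded.
class LruMap<K, V> {
  LruMap(this.limit);

  final int limit;
  final _entries = LinkedHashMap<K, V>();

  V? operator [](K key) {
    final value = _entries.remove(key);
    if (value != null) {
      _entries[key] = value; // re-insert so the entry becomes the most recent
    }
    return value;
  }

  void operator []=(K key, V value) {
    _entries.remove(key);
    _entries[key] = value;
    if (_entries.length > limit) {
      _entries.remove(_entries.keys.first); // drop the least recently used entry
    }
  }
}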


Motivation

Quite often you need to cache something in memory for later use. A common case is caching API calls and their responses. Usually this is done in a data layer, for example in a RemoteRepository.

Oftentimes, the repository code might look like this:

class RemoteRepository implements Repository {
  final SomeApiDataSource _dataSource;
  SomeResponseType? _cachedResponse;

  RemoteRepository(this._dataSource);

  @override
  Future<SomeResponseType> getSthData() async {
    final cached = _cachedResponse;
    if (cached != null) {
      return cached;
    }

    return _cachedResponse = await _dataSource.getData();
  }
}

So, instead of doing it manually, we can use the library and write our RemoteRepository like this:

@WithCache()
abstract mixin class RemoteRepository implements Repository, _$RemoteRepository {
  factory RemoteRepository({required SomeApiDataSource dataSource,}) = _RemoteRepository;

  @Cached()
  Future<SomeResponseType> getSthData() {
    return dataSource.getData();
  }
}

Setup

Install package

Run the following commands:

flutter pub add --dev cached
flutter pub add --dev build_runner
flutter pub add cached_annotation

Or manually add the dependencies to your pubspec.yaml:

dependencies:
  cached_annotation:

dev_dependencies:
  cached:
  build_runner:

That's it! Now, you can write your own cached class 🎉

Run the generator

To run the code generator, execute the following command:

dart run build_runner build

For Flutter projects, you can run:

flutter pub run build_runner build

Note that, like most code generators, Cached needs you to both import the annotation package (cached_annotation) and add the part keyword at the top of your files.

As such, a file that uses Cached will start with:

import 'package:cached_annotation/cached_annotation.dart';

part 'some_file.cached.dart';

Dart 3 changes

Dart 3 introduces a change in how mixins work, requiring them to be declared with the mixin keyword. If you are migrating from Dart 2 to Dart 3, you need to add the mixin keyword to your Cached class declarations.

Dart 3:

@WithCache()
abstract mixin class Gen implements _$Gen {
  factory Gen() = _Gen;

  ...
}

Dart 2:

@WithCache()
abstract class Gen implements _$Gen {
  factory Gen() = _Gen;

  ...
}

Basics

WithCache

Annotation for Cached package.

Annotating a class with @WithCache flags it as needing to be processed by the Cached code generator.
It can take one additional boolean parameter, useStaticCache. If this parameter is set to true, the generator will generate a cached class with a static cache, which means every instance of this class will have access to the same cache. The default value is false.

@WithCache(useStaticCache: true)
abstract mixin class Gen implements _$Gen {
  factory Gen() = _Gen;

  ...
}
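
With useStaticCache: true all instances share one cache, so (for a hypothetical cached method getValue on Gen) the second call below would be served from the cache populated by the first instance:

final first = Gen();
final second = Gen();

await first.getValue();  // computes the value and stores it in the shared static cache
await second.getValue(); // returns the value cached by the first instance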

Cached

Method/getter decorator that flags it as needing to be processed by the Cached code generator.

There are 4 possible additional parameters:

  • ttl - time to live, in seconds. Sets how long the cache stays alive. The default value is null, which means an infinite ttl.
  • syncWrite - affects only async methods (those that return a Future). If set to true, the first method call will be cached, and if the same call occurs again, all of them will get the result of the first call. The default value is false.
  • limit - limits how many results (for different combinations of method call arguments) will be cached. The default value is null, which means no limit.
  • where - function triggered before caching the value. If it returns true, the value will be cached; if it returns false, the value will be ignored. Useful to signal that a certain result must not be cached when @IgnoreCache is not enough (e.g. the condition whether or not to cache is only known after acquiring the data).

Important:

Please note that persistentStorage is marked as @Deprecated and will be removed in the next release. We encourage using the @PersistentCached annotation instead.

  • persistentStorage - defines optional usage of an external persistent storage (e.g. shared preferences). If set to true, you have to set PersistentStorageHolder.storage in your main.dart file for it to work. Check the Persistent storage section of this README for more information.

Example

@Cached(
  ttl: 60,
  syncWrite: true,
  limit: 100,
)
Future<int> getInt(String param) {
  return Future.value(1);
}

Example with getter

@cached
Future<int> get getter {
  return Future.value(1);
}

where

As mentioned before, where takes a top-level function that decides whether to cache the value or not. It also supports async calls, so feel free to implement conditional caching based on e.g. http response parsing.

Sync example:

@Cached(
  ttl: 60,
  syncWrite: true,
  limit: 100,
  where: _shouldCache
)
int getInt(String param) {
  return 1;
}

bool _shouldCache(int candidate) {
  return candidate > 0;
}

Async example:

@Cached(
  where: _asyncShouldCache,
)
Future<http.Response> getDataWithCached() {
  return http.get(Uri.parse(_url));
}

Future<bool> _asyncShouldCache(http.Response response) async {
  final json = jsonDecode(response.body) as Map<String, dynamic>;
  print('Up to you: check conditionally and decide if should cache: $json');

  print('For now: always cache');
  return true;
}

IgnoreCache

This annotation must be placed on a parameter of a method, and that parameter must be a bool; if it is true, the cache will be ignored.

Example use:

@cached
Future<int> getInt(String param, {@ignoreCache bool ignoreCache = false}) {
  return Future.value(1);
}

You can also set useCacheOnError in the annotation; if it is true, the last cached value is returned when an error occurs.

@cached
Future<int> getInt(String param, {@IgnoreCache(useCacheOnError: true) bool ignoreCache = false}) {
  return Future.value(1);
}

Possible reasons why the generator gives an error

  • if a method has multiple parameters annotated with @ignoreCache

Ignore

This annotation must be placed on a parameter of a method; parameters annotated with @ignore will be ignored when generating the cache key.

Example use:

@cached
Future<int> getInt(@ignore String param) {
  return Future.value(1);
}

CacheKey

This annotation must be placed on a parameter of a method and must be given a constant function that returns the cache key for the provided parameter value.

Example use:

@cached
Future<int> getInt(@CacheKey(exampleCacheFunction) int test) async {
  await Future.delayed(Duration(milliseconds: 20));
  return test;
}

String exampleCacheFunction(dynamic value) {
  return value.toString();
}

You can also use @iterableCacheKey, which will generate cache key from Iterable<T> values

Example use:

@cached
Future<List<int>> getInt(@iterableCacheKey List<int> test) async {
  await Future.delayed(Duration(milliseconds: 20));
  return test;
}

ClearCached

Method decorator that flags it as needing to be processed by the Cached code generator. A method annotated with this annotation can be used to clear the result of a method annotated with Cached.
The constructor of this annotation can take one optional argument: the name of the method whose cache we want to clear.

Let's say there is an existing cached method:

@Cached()
Future<SomeResponseType> getUserData() {
  return userDataSource.getData();
}

to generate a cache-clearing method we can write:

@clearCached
void clearGetUserData();

or

@ClearCached('getUserData')
void clearUserData();

The ClearCached argument or the method name has to correspond to the cached method name. We can also create a method that returns a bool, and then write our own logic to check whether the cache should be cleared or not.

@ClearCached('getUserData')
Future<bool> clearUserData() {
  return userDataSource.isLoggedOut();
}

If the user is logged out, the user cache will be cleared.

Possible reasons why the generator gives an error

  • if a method with the @cached annotation doesn't exist
  • if the method to pair with doesn't exist
  • if the method doesn't return bool, Future<bool>, void or Future<void>

ClearAllCached

This is exactly the same as ClearCached, except you don't pass any arguments and you don't add a clear prefix before the method name; all you have to do is add @clearAllCached above the method. This annotation clears the cached values for all methods in the class annotated with @WithCache.

Here is a simple example:

@clearAllCached
void clearAllData();

or we can also create a method that returns a bool, and then write our own logic to check whether the cached values for all methods should be cleared

@clearAllCached
Future<bool> clearAllData() {
  return userDataSource.isLoggedOut();
}

If the user is logged out, the cached values for all methods will be cleared.

Possible reasons why the generator gives an error

  • if there is more than one @clearAllCached annotation; only one is allowed
  • if the method doesn't return bool, Future<bool> or void

StreamedCache

Use @StreamedCache annotation to get a stream of cache updates from a cached method. Remember to provide at least the name of the cached class method in the methodName parameter.

Simple example of usage:

@cached
int cachedMethod() {
  return 1;
}

@StreamedCache(methodName: "cachedMethod", emitLastValue: true)
Stream<int> cachedStream();

A method annotated with @StreamedCache should have the same parameters (except those annotated with @ignore or @ignoreCache) as the method provided in the methodName parameter, otherwise an InvalidGenerationSourceError will be thrown. The return type of this method should be a Stream of the synchronous type of the target method; for example, for Future<String> the return type will be Stream<String>.

Example:

@cached
Future<String> cachedMethod(int x, @ignore String y) async {
  await Future.delayed(Duration(milliseconds: 100));
  return x.toString();
}

@StreamedCache(methodName: "cachedMethod", emitLastValue: false)
Stream<String> cachedStream(int x);

CachePeek

Method decorator that flags it as needing to be processed by the Cached code generator. A method annotated with this annotation can be used to peek at the result of a method annotated with Cached.

The constructor of this annotation takes one argument: the name of the method whose cache we want to peek at.

Let's say there is an existing cached method:

@Cached()
Future<SomeResponseType> getUserData() {
  return userDataSource.getData();
}

to generate a cache-peek method we can write:

@CachePeek("getUserData")
SomeResponseType? peekUserDataCache();

The CachePeek methodName argument has to correspond to the cached method name

Possible reasons why the generator gives an error

  • if more than one method is targeting the [Cached] method cache
  • if the method return type is incorrect
  • if the method has different parameters than the target function (excluding [Ignore], [IgnoreCache])
  • if method is not abstract

DeletesCache

@DeletesCache annotation is a method decorator that marks a method to be processed by the code generator. Methods preceded by this annotation clear the cache of all specified methods, annotated with @Cached, when they complete successfully.

The @DeletesCache annotation takes a list of cached methods that are affected by the annotated method; the cache of all specified methods is cleared on method success, but if an error occurs, the cache is not deleted and the error is rethrown.

If there is a cached method:

@Cached()
Future<SomeResponseType> getSthData() {
  return dataSource.getData();
}

Then a method that affects the cache of this method can be written as:

@DeletesCache(['getSthData'])
Future<SomeResponseType> performOperation() {
  ...
  return data;
}

All methods specified in @DeletesCache annotation must correspond to cached method names. If the performOperation method completes without an error, then the cache of getSthData will be cleared.

Throws an [InvalidGenerationSourceError]

  • if method with @cached annotation doesn't exist
  • if no target method names are specified
  • if specified target methods are invalid
  • if annotated method is abstract

Persistent storage

The Cached library supports the use of any external storage (e.g. Shared Preferences, Hive) via the @PersistentCached() annotation:

Current version

  @PersistentCached()
  Future<double> getDouble() async {
    return await _source.nextDouble();
  }

@Deprecated version

  @Cached(persistentStorage: true)
  Future<double> getDouble() async {
    return await _source.nextDouble();
  }

You only have to provide a proper implementation by extending the CachedStorage abstraction, e.g.:

...
import 'package:cached_annotation/cached_annotation.dart';

class MyStorageImpl extends CachedStorage {
  final _storage = MyExternalStorage();

  @override
  Future<Map<String, dynamic>> read(String key) async {
    return await _storage.read(key);
  }

  @override
  Future<void> write(String key, Map<String, dynamic> data) async {
    await _storage.write(key, data);
  }

  @override
  Future<void> delete(String key) async {
    await _storage.delete(key);
  }

  @override
  Future<void> deleteAll() async {
    await _storage.deleteAll();
  }
}

Now you have to assign an instance of your class (preferably at the top of your main method):

...
import 'package:cached_annotation/cached_annotation.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();

  PersistentStorageHolder.storage = MyStorageImpl();
  
  runApp(const MyApp());
}

As you can see above, Cached doesn't provide any generic error or type handling. The generated code will just use PersistentStorageHolder.storage to save and read cached data from storage; you have to take care of errors and typing yourself inside your code.
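
For example, one way to guard against read errors is to wrap your storage in a defensive decorator. This is only a sketch built on the CachedStorage interface shown above; SafeStorage is a hypothetical name, not part of the package:

import 'package:cached_annotation/cached_annotation.dart';

/// Hypothetical wrapper that swallows read errors so the generated code
/// simply treats the cache as empty instead of crashing.
class SafeStorage extends CachedStorage {
  SafeStorage(this._inner);

  final CachedStorage _inner;

  @override
  Future<Map<String, dynamic>> read(String key) async {
    try {
      return await _inner.read(key);
    } catch (_) {
      return <String, dynamic>{};
    }
  }

  @override
  Future<void> write(String key, Map<String, dynamic> data) => _inner.write(key, data);

  @override
  Future<void> delete(String key) => _inner.delete(key);

  @override
  Future<void> deleteAll() => _inner.deleteAll();
}

You could then assign PersistentStorageHolder.storage = SafeStorage(MyStorageImpl()); and keep the rest of the setup unchanged.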

Lazy persistent storage

An additional feature, available only for @LazyPersistentCached(), is initializing the cache from external storage only after the method is called. This makes it possible to avoid a heavy initial load for large amounts of data.

  @LazyPersistentCached()
  Future<double> getDouble() async {
    return await _source.nextDouble();
  }

Direct persistent storage

When the @DirectPersistedCached annotation is used, it prevents the automatic loading of data from external storage into the cache managed by the caching library. For methods with this annotation, the library's generator neither creates a map for storing data fetched from the storage nor initializes such a map before the method's invocation. Consequently, this annotation ensures that data is always fetched directly from the external storage upon method call. If the data is not already present in the external storage, it is retrieved and then stored there.

  @DirectPersistedCached()
  Future<double> getDirectDouble() async {
    return await _source.nextDouble();
  }

Data saved to persistent storage can be deleted by using @ClearCached(), @ClearAllCached() or @DeletesCache annotations.
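
For example, the persistent getDouble method from above could be paired with a clearing method like this (a sketch; clearDouble is a hypothetical name):

  @PersistentCached()
  Future<double> getDouble() async {
    return await _source.nextDouble();
  }

  // Clears both the in-memory cache and the data written to persistent storage.
  @ClearCached('getDouble')
  Future<void> clearDouble();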

Usage of persistent storage does not change this library's caching behaviour in any way. It only adds new capabilities, but it can affect the way you implement your app:

Important:

Please note that using persistent storage forces you to provide an async API when using Cached annotations!
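
In practice this means a method that could otherwise be synchronous must return a Future once it is backed by persistent storage. A sketch of the two alternatives (getInt is a hypothetical method; the two versions below are alternatives, not members of the same class):

// Without persistent storage, a synchronous signature is fine:
@Cached()
int getInt(String param) => 1;

// With persistent storage, the method has to be async:
@PersistentCached()
Future<int> getInt(String param) async => 1;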

For a sample project, please check persistent_storage_example inside the cached/example directory.

Contribution

We accept any contribution to the project!

Suggestions for a new feature or a fix should be submitted as a pull request or an issue.

Feature request

  • Check if feature is already addressed or declined

  • Describe why this is needed

    Just create an issue with label enhancement and descriptive title. Then, provide a description and/or example code. This will help the community to understand the need for it.

  • Write tests for your feature

    The test is the best way to explain how the proposed feature should work. We demand a complete test before any code is merged in order to ensure cohesion with existing codebase.

  • Add it to the README and write documentation for it

    Add the new feature to the existing features table and append sample code with usage.

Fix

  • Check if bug was already found

  • Describe what is broken

    The minimum requirement when reporting a bug is a reproduction path. Write the steps that should be followed to reproduce the problem. The perfect situation is when you give a full description of why some code doesn't work, along with a proposed fix.

Contributors

anowakiteo, axot017, dependabot[bot], devi88, falynsky, grzegorz-kozlowski, jakubtiteo, jlukas99, kamilraczkaiteo, lpusz, mateuszfilipek2000, patrykpawlak, pawelpiechocinski, rsc-88


cached's Issues

catch22: abstract class vs mixin class

Problem: when I generate stubs from the following code, I get a Dart error:

@WithCache()
abstract class ChatAvatarUtils implements _$ChatAvatarUtils {
  factory ChatAvatarUtils() = _ChatAvatarUtils;

  @Cached(ttl: 300) // 5m only cause we only use it for our own avatar
  Future<String> toBase64({required String imageId}) async => '';
}

dart error:
The class 'ChatAvatarUtils' can't be used as a mixin because it's neither a mixin class nor a mixin.

However, making it a mixin class and running the generator produces another error:

@WithCache()
mixin class ChatAvatarUtils implements _$ChatAvatarUtils {
  factory ChatAvatarUtils() = _ChatAvatarUtils;

  @Cached(ttl: 300) // 5m only cause we only use it for our own avatar
  Future<String> toBase64({required String imageId}) async => '';
}

code generator error:

[ERROR] Class ChatAvatarUtils need to be abstract
package:u2nite/services/image.service.dart:76:13
   ╷
76 │ mixin class ChatAvatarUtils implements _$ChatAvatarUtils {
   │             ^^^^^^^^^^^^^^^
   ╵

What can I do?

$ flutter doctor
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 3.10.0, on macOS 13.3.1 22E772610a darwin-arm64, locale en-US)
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0-rc1)
[✓] Xcode - develop for iOS and macOS (Xcode 14.3)
[✓] Chrome - develop for the web
[✓] Android Studio (version 2022.1)
[✓] VS Code (version 1.78.1)
[✓] Connected device (4 available)
[✓] Network resources

• No issues found!

$ dart --version
Dart SDK version: 3.0.0 (stable) (Thu May 4 01:11:00 2023 -0700) on "macos_arm64"

Add `@CacheKey()` annotation

Add a CacheKey annotation which will allow developers to pass a function that will generate the cache key (by default hashCode is used as the cache key).

  • Create @CacheKey annotation that will take one param with type String Function(dynamic)
  • Validate if @CacheKey is not used with @Ignore or @ignoreCache annotations
  • Create generation logic that will replace the hash key with the result of the function passed in the @CacheKey annotation
  • Write generation tests
  • Write integration tests
  • Write dart doc for @cacheKey
  • Add section in README.md for @cacheKey

Example usage:

String getListCacheKeys(Iterable<Object> l) {
  return l.map((e) => e.hashCode).join('');
}

@withCache
class Example {
  String doSomething(@CacheKey(getListCacheKeys) List<SomeObject> l) {
    return 'something';
  }
}

To discuss:
Maybe we should also create something like @iterableCacheKey, which would handle the case described in "example usage", as it's quite generic.

Add `@Ignore()` annotation

Add annotation which will ignore method parameter while generating cache key.
TODO:

  • Add @ignore annotation in the cached_annotation package
  • Ignore arguments with @ignore annotations while generating cache key
  • Add generation tests
  • Add integration tests
  • Add new annotation to readme
  • Write dartdoc for @ignore annotation

Add `@StreamedCache(...)` annotation

Add annotation which will allow to get cache of given method as a stream.

Todo:

  • Add @StreamedCache(...) annotation in cached_annotation package. Annotation should take 2 params: name of method which cache should be streamed, flag if last values should be emitted for new listener.
  • Add validation:
    • The given method needs to exist
    • The given method needs to have the same params (excluding params with @ignore or @ignoreCache annotations) or no params (in this case the last value will be emitted)
    • The method's return type needs to be a Stream of the cached method's sync type
  • Add generator
  • Add generation tests
  • Add integration tests
  • Add dartdoc for @StreamedCache annotation
  • Add description of @StreamedCache in README.md

Add ClearAllCached annotation

Todo:

  • Add ClearAllCached annotation in cached_annotation package
  • Add validation:
    • There should be only one method with this annotation
    • Return type should be void
    • Method shouldn't be async
  • Generate class implementation which will remove data from all cache maps

Usage proposition:

@withCache
abstract class Example implements _$Example {
   const factory Example() = _Example;
   ...

   @clearAllCached
   void clear();
}

To discuss: Should we allow clear method to have implementation or should we require it to be abstract?

persistentStorage option does not respect type of saved data

Hi, first of all, thanks for a great library.
However, when I am trying to read a cached value from persistent storage, during the first launch of the application, the data returned by the cached functions is of the wrong type.

To reproduce this error, it is crucial that the application is restarted, that is, the cache is retrieved from persistent storage and not from a variable (from a generated .cached.dart file).

Please let me know if you are accepting contributions from third party developers. I would like to fix this bug.

  @Cached(persistentStorage: true)
  Future<List<CarDto>> _getCars({@IgnoreCache(useCacheOnError: true) bool ignoreCache = true, }) async => carApiDataSource.getCars();

Error:

type 'List<dynamic>' is not a subtype of type 'FutureOr<List<CarDto>>'
#0      _CarRepositoryImpl._getCars (package:car_catalog/data/car/repository/car_repostiory_impl.cached.dart:63:18)
<asynchronous suspension>
#1      new TaskEither.tryCatch.<anonymous closure> (package:fpdart/src/task_either.dart:280:30)
<asynchronous suspension>
#2      TaskEither.match.<anonymous closure> (package:fpdart/src/task_either.dart:168:25)
<asynchronous suspension>
#3      CarListCubit.init (package:car_catalog/presentation/page/car_list_page/cubit/car_list_cubit.dart:18:5)
<asynchronous suspension>

Workaround
Please note that the return type has changed from Future<List<CarDto>> to Future<List<dynamic>>

@Cached(persistentStorage: true)
Future<List<dynamic>> _getCars({@IgnoreCache(useCacheOnError: true) bool ignoreCache = true, }) async => carApiDataSource.getCars();

Cache storage implementation

@injectable
class CacheStorage extends CachedStorage {
  Future<Box<Map<dynamic, dynamic>>> _openBox(String key) => Hive.openBox(key);

  @override
  Future<void> write(String key, Map<dynamic, dynamic> data) async {
    final box = await _openBox(key);
    return box.put(key, data);
  }

  @override
  Future<Map<String, dynamic>> read(String key) async {
    final box = await _openBox(key);
    final Map<dynamic, dynamic> result = box.get(key) ?? {};
    final Map<String, dynamic> convertedValue = result.map((key, v) => MapEntry(key.toString(), v));
    return convertedValue;
  }

  @override
  Future<void> delete(String key) async {
    final box = await _openBox(key);
    return box.delete(key);
  }

  @override
  Future<void> deleteAll() async {
    //todo: implement deleteAll
  }
}

My env:
Flutter (Channel stable, 3.13.4, on macOS 14.0 23A344 darwin-arm64, locale en-PL)
cached_annotation: 1.6.0
build_runner: 2.4.6
cached: 1.6.0

Make compatible with analyzer ^6.0.0

Hi. Firstly, thanks for a great library.
However, I just discovered when updating to the latest freezed that this library does not support analyzer ^6.0.0. I get:

Because freezed >=2.4.2 depends on analyzer ^6.0.0 and cached 1.6.1 depends on analyzer ^5.2.0, freezed >=2.4.2 is incompatible with cached 1.6.1.
And because no versions of cached match >1.6.1 <2.0.0, freezed >=2.4.2 is incompatible with cached ^1.6.1.

Could you please release a version that can use the latest analyzer?
Thanks

LRU algorithm not implemented

When the limit is set to 1, the second item will never hit the cache! As shown in the generated code, the last item will be removed!

  @Cached(
    syncWrite: true,
    limit: 1,
  )
if (_queryCached.length > 1) {
  _queryCached.remove(_queryCached.entries.last.key);
}

Use other cache libraries, e.g. `shared_preferences`, `hive`

Thank you so much for providing such great package! I'm using it a lot.

However, in some cases I need to store cache in more persistent place than memory, e.g. shared_preferences or hive.
Also, I think on the second/third/etc application boot using saved cache in shared_preferences is better than making call the API during the initialization.

This does not seem possible with your package for now. So, do you have any plans or workarounds to support not only an in-memory cache?

Add dartdoc comments

Todo:

  • Add dartdoc comments for cache package
    • Add dartdoc comment for cachedBuilder (something like "it's the function used by build_runner"; it's not important, just to get points on pub.dev)
  • Add dartdoc comments for cache_annotation package
    • Add dartdoc comment for WithCache annotation
    • Add dartdoc comment for Cached annotation
    • Add dartdoc comment for IgnoreCache annotation
    • Add dartdoc comment for ClearAllCached annotation (not implemented yet)
    • Add dartdoc comment for ClearCached annotation (not implemented yet)

Create cached_annotation README.md

Create cached_annotation README.md file which will contain following:

  • Information that package contains only annotations and is useless without cached package
  • Link to cached package

Add ClearCached annotation

Todo:

  • Add ClearCached annotation in cached_annotation package:
    • Class should have one positional string argument which should be name of method for which caches should be cleared
  • Add validation:
    • Validate if method passed as argument exist
    • Validate that there are no multiple methods with the ClearCached annotation and the same argument
    • Validate if method is void
    • Validate if method is not async
    • Validate if method has no arguments
  • Generate method body which will remove cache of passed method

Proposition:

@withCache
abstract class Example implements _$Example {
   const factory Example() = _Example;
   ...

   @cached
   String getString() {
      // some logic
      return result;
   }

   @ClearCached('getString')
   void clearStrings();
}

To discuss: Should we allow clear method to have implementation or should we require it to be abstract?

Additional requirements after discussion:

  • Method can have implementation
    • If method is not abstract it can return bool or void
    • If method is not abstract super.method() is called
    • If method is not abstract it can have arguments
    • If method is not abstract and returns bool cache is cleared only if super.method() call returns true
    • Non abstract methods can be also async

Cache peek

It would be nice to have a way to peek at a value in the cache. One possible API design is:

@Cached()
Future<Result> getResult(params...) { ... }

@CachePeek(methodName: 'getResult')
Result? getResultCached(params...);

This method returns either the cached result or null if no result is cached.

[Feature]: Allow editing cache

Problem

When creating/updating data on a remote data source developers might want to update the local cache after that.

Desired Solution

In the most basic cases something like this could solve it.

Create:

// foo_repository.dart

  @EditCache
  Future<Foo> createFoo(Foo foo) async {
    await dataSource.create(foo);
    return foo;
  }

leading to

// foo_repository.cached.dart

  @override
  Future<Foo> createFoo(Foo foo) async {
    final result = await super.createFoo(foo);
    _getFooCached["${foo.hashCode}"] = result;
    return result;
  }

Update:

// foo_repository.dart

  Future<Foo> getFoo() async {
    // ...
  }

  @EditCache
  Future<Foo> updateFoo(int a, String b) async {
    await dataSource.update(a, b);
    final foo = await getFoo();
    return foo.copyWith(a:a, b:b);
  }

leading to

// foo_repository.cached.dart

  @override
  Future<Foo> updateFoo(int a, String b) async {
    final result = await super.updateFoo(a, b);
    _getFooCached["${foo.hashCode}"] = result;
    return result;
  }

Alternatives Considered

No response

On which platform do you expect this solution?

All

Add a way to prevent caching specific results

It would be useful to have a way inside a cached function to signal that a certain result must not be cached.
In particular we are caching results from a server, which can in some situations return data that is explicitly not allowed to be cached. But this can only be known after the data was loaded, not before issuing the call. Thus ignoreCache can not be used to solve the issue here.

Any thoughts on a nice way to support this? In our case we could for example pass a list of types that must not be cached.
