
Allow possibility to resolve data synchronously #155

Open
Kaltsoon opened this issue Sep 30, 2019 · 5 comments

Kaltsoon commented Sep 30, 2019

Would it be possible to somehow provide Async and useAsync with a way to resolve data synchronously? This would allow the use of any synchronous caching solution and eliminate rerenders caused by status updates. In the library's current state, the only solution I have come up with is memoizing the promiseFn function, but that is still asynchronous.

This might not be the best API, but as an example:

const useTodo = todoId => {
  const todoClient = useTodoClient();
  const cache = useCache();

  const { data, ...rest } = useAsync({
    promiseFn: todoClient.getById,
    cacheFn: ({ id }) => cache.get(`todo.${id}`), // Synchronous way to resolve the data
    id: todoId,
    watch: todoId,
  });

  return { todo: data, ...rest };
};

An optional cacheFn argument would introduce a way to resolve data synchronously, and it would be the primary way to resolve the data (the fallback would be the good old promiseFn). This way no additional renders are required when the cacheFn function can resolve the data (i.e. it returns something other than undefined). The cacheFn function would receive the same arguments as the promiseFn function. A rough sketch of this resolution order is shown below.
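
This is only hypothetical internals for illustration, not actual react-async code, and it assumes the proposed cacheFn prop:

// Hypothetical sketch of the proposed resolution order (not actual react-async code).
// cacheFn is tried first, synchronously; promiseFn only runs on a cache miss.
const resolveData = ({ cacheFn, promiseFn, ...args }) => {
  if (cacheFn) {
    const cached = cacheFn(args); // receives the same arguments as promiseFn
    if (cached !== undefined) {
      // Synchronous hit: data is available on the first render, no status updates needed.
      return { status: "fulfilled", data: cached };
    }
  }
  // Cache miss (or no cacheFn): fall back to the asynchronous promiseFn as usual.
  return { status: "pending", promise: promiseFn(args) };
};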

Because this solution is quite flexible, the cache could easily be disabled by conditionally providing the cacheFn function:

useAsync({
  promiseFn: todoClient.getById,
  cacheFn: cacheEnabled ? ({ id }) => cache.get(`todo.${id}`) : undefined,
  id: todoId,
  watch: todoId,
});

If this can be done in another way, without introducing a new argument, I would gladly hear about it.

Edit: Renamed fn to cacheFn, because it describes the usage better.


ghengeveld commented Oct 6, 2019

How would values be added to the cache? With cacheFn built like that, the only way for entries to be added is manually, through onResolve or inside the promiseFn itself. I think that's fine, but we should make a conscious decision on whether React Async is going to add entries to the cache automatically. We can of course always add a cache prop later that handles both reading and writing automatically.
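
For example, a manual write through onResolve (which already exists) might look something like this, reusing the cache object and the proposed cacheFn prop from the example above:

const { data, ...rest } = useAsync({
  promiseFn: todoClient.getById,
  cacheFn: ({ id }) => cache.get(`todo.${id}`),         // proposed synchronous read
  onResolve: todo => cache.set(`todo.${todoId}`, todo), // manual write when the promise resolves
  id: todoId,
  watch: todoId,
});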

In a future version, I'm considering renaming promiseFn to just fn and having it support both synchronous and asynchronous results.
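
Purely speculative, since fn is not an existing prop, but usage could look roughly like this (reusing the cache and todoClient from the earlier example):

useAsync({
  // fn may return a plain value (resolved synchronously) or a Promise (resolved asynchronously)
  fn: ({ id }) => cache.get(`todo.${id}`) || todoClient.getById({ id }),
  id: todoId,
  watch: todoId,
});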


Kaltsoon commented Oct 7, 2019

I would leave that up to the user of the library. For my part, I created a useCachingAsync hook, which wraps a given promiseFn with a simple LRU cache and provides the cacheFn function. Here is my implementation. Note that my useAsync API is slightly different from the one in this library.

I like this library because it doesn't try to do everything, and I think that's the way to go in the future as well. It is quite simple to create your own hooks, such as useCachingAsync, that build on the useAsync hook. A rough sketch of such a wrapper is below.
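
This is only a rough sketch of what such a wrapper could look like on top of this library's useAsync, not my actual implementation; it assumes the proposed cacheFn prop, a hypothetical cacheKey option, and a plain Map instead of a real LRU cache:

import { useAsync } from "react-async";

// Sketch only: a plain Map stands in for a proper LRU cache, and cacheFn is the
// prop proposed in this issue, so this will not run against the current release.
const cache = new Map();

const useCachingAsync = ({ cacheKey, ...options }) =>
  useAsync({
    ...options,
    cacheFn: () => cache.get(cacheKey),           // synchronous read (proposed)
    onResolve: data => cache.set(cacheKey, data), // write-through on resolve
  });

// Usage:
// const { data } = useCachingAsync({
//   cacheKey: `todo.${todoId}`,
//   promiseFn: todoClient.getById,
//   id: todoId,
//   watch: todoId,
// });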


Kaltsoon commented Oct 7, 2019

The fn function support you mentioned also sounds very promising (or even better than my suggestion)! Still, I would prefer to leave out any direct cache support, because it would add unnecessary complexity to the library.

ghengeveld commented:

I think direct cache support depends on where we're going with Suspense support. Suspense was designed to be used with a cache. It works without one, but then it forgoes the render optimization you get from synchronously resolving the data. Personally I don't care too much about some additional renders, but some people tend to make a big deal about it.


Kaltsoon commented Oct 7, 2019

I think that saving a single render (when using a cache with Suspense) could be considered a nice-to-have feature. That being said, perhaps caching shouldn't be integrated into the library, but it should still be relatively easy to implement using react-async APIs.
