One way to avoid network waterfalls is to fetch our data as early as possible. Ideally, this preload should happen while the route is loading, or even before.
While this helps, data preloading can accidentally trigger duplicate requests, so some form of request deduplication is needed.
In this lesson, we will learn how to leverage the route preload functionality to prefetch data, and then wrap our data-fetching functions with cache
to perform request deduplication.
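At a high level, the route-level preload looks roughly like the sketch below. This is a minimal illustration, not the lesson's exact code: the file name, the `~/api/users` path, and the `getUsers` helper are placeholders, and the `route`/`preload`/`RouteDefinition` usage assumes a current SolidStart setup with `@solidjs/router`.

```tsx
// users.tsx: illustrative route file (names and paths are placeholders)
import { createAsync, type RouteDefinition } from "@solidjs/router";
import { getUsers } from "~/api/users"; // assumed data-fetching helper

// Exporting `route` with a `preload` function lets the router start
// fetching as soon as the route is matched, or eagerly on link hover.
export const route = {
  preload: () => getUsers(),
} satisfies RouteDefinition;

export default function Users() {
  // Without deduplication this triggers a second request on top of the
  // preload; the cache layer introduced later in the lesson removes it.
  const users = createAsync(() => getUsers());
  return <ul>{users()?.map((user) => <li>{user.name}</li>)}</ul>;
}
```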
[00:00] To improve our user experience, we want to fetch our data as early as possible. To do that in SolidStart, we can leverage the preload function. To add a preloader to our route, all we need to do is export a route object and, inside it, define a preload function that calls getUsers. Now this route data will be fetched in parallel with the route [00:19] loading, or eagerly when links are hovered. Let's do the same for our dynamic route: export a route object, define the preload function, and call our getUser function. We still need our id parameter to fetch the right user's data, so let's access it from our preload function's arguments and pass it to getUser. Finally, [00:39] let's import RouteDefinition from Solid Router to ensure the typings are the expected ones.

To confirm our data is being preloaded, let's add a console.log in both functions and run our application. As you can see, the preloaders are working. But this leads to another issue: every time we visit [00:59] a route, there are duplicate requests, one from the preloader and another from createAsync. This is the ideal use case for introducing a caching layer. We want to cache the data the first time it's requested, and if another resource tries to access it within the following seconds, the data is returned from the cache instead of being refetched.

Before [01:19] adding our cache, let's move our use server directive inside each function so that, once we add the cache, it happens at the browser level. Then let's import cache from Solid Router and wrap getUsers with it, passing a cache key that will be used to store the data underneath it. Let's also wrap our getUser with cache and pass it its [01:39] cache key. Every time data is returned from this function, it will be cached under this cache key, which is "user" plus its id. Now if we check our app, we should see that only one request happens every time we interact with the data-fetching resources.
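For reference, here is a minimal sketch of the cached data functions and the dynamic route's preload. The module paths, the User type, and the in-memory data are illustrative stand-ins for whatever the real app queries; the `cache(fn, key)` helper and the "use server" directive are assumed to come from `@solidjs/router` and SolidStart respectively.

```ts
// api/users.ts: data functions wrapped with Solid Router's cache helper
import { cache } from "@solidjs/router";

type User = { id: string; name: string };

// Stand-in data; a real app would query a database or an API here.
const users: User[] = [
  { id: "1", name: "Ada" },
  { id: "2", name: "Grace" },
];

export const getUsers = cache(async (): Promise<User[]> => {
  "use server"; // the directive lives inside the function, so caching happens in the browser
  console.log("fetching users"); // shows when the request actually runs
  return users;
}, "users"); // cache key for the list

export const getUser = cache(async (id: string): Promise<User | undefined> => {
  "use server";
  console.log("fetching user", id);
  return users.find((user) => user.id === id);
}, "user"); // combined with the argument, the effective key is "user" plus the id
```

The dynamic route can then read the id parameter inside its preload function (again, file and import names are placeholders):

```tsx
// users/[id].tsx: the dynamic route preloads a single user by id
import { type RouteDefinition } from "@solidjs/router";
import { getUser } from "~/api/users";

export const route = {
  preload: ({ params }) => getUser(params.id),
} satisfies RouteDefinition;
```

With both functions wrapped, the preload and the createAsync call resolve against the same cache entry, so only one request per key is sent within the deduplication window.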