
@unleashit/fetch-cache
Isomorphic fetch wrapper and promise cache with React Server Components in mind
I wasn't satisfied with the black-boxed and inconsistent way of data fetching in the new Next.js 13 app directory, so I made this. It's an attempt to combine a fetch wrapper similar to the one provided by Next.js with other data fetching methods, like raw DB queries or anything else that returns a promise, into a single deduping/caching API. In addition, it is browser compatible and offers a bit more control and insight into what is being cached.
This is an experimental package geared towards Next.js and React Server Components (though neither is a requirement). Yes, it does in part reinvent the wheel of Next's fetch deduping, but it also allows for better consistency and adds some interesting features. There is a small caveat: if you use this with Next 13 RSCs, Next's patched fetch isn't prevented from also memoizing its values underneath this cache. While that may "feel" a bit ugly, it shouldn't have any measurable drawbacks, and all of your Next options (including route segment configs) will be respected. Ideally they wouldn't have overwritten the native fetch, but it is what it is.
Features:
- Integrates with React.cache
- memo method to cache any other type of promise
- get requests or the provided memo can be cached; other HTTP verbs are pass-through

```shell
npm install @unleashit/fetch-cache
```
In Next.js 13 (app directory enabled), fetch is patched by the framework. fetch-cache itself works down to React 16.8. React.cache is currently only available in experimental React or Next 13 builds.
If you are using this with Next.js and React Server Components, it's suggested to initialize fetch-cache within React.cache and import it from a separate file. Since Next.js has decided not to provide full access to the Request/Response objects (a mini tragedy if you ask me), React provides this helper, which makes use of async context. You can, for example, use it to cache an instance of fetch-cache for the lifetime of each request (and throw it out after).
```typescript
// services.ts
import FetchCache from "@unleashit/fetch-cache";
import { cache } from "react";

const baseurl = "https://amazing-products.com";

export const api = cache(() => new FetchCache({ baseurl }));
```
To use it isomorphically, the setup can be a bit particular, because you only want to wrap fetch-cache in React.cache on the server. One option is to create separate files for client and server, but you can manage within a single file like this:
```typescript
const clientInstance =
  typeof window !== "undefined" &&
  // to use cache on the client, you can set a default as shown here
  // and/or override it in individual GET requests
  new FetchCache({ baseurl, defaultCacheTime: 1800 });

export const api =
  typeof window === "undefined"
    ? cache(() => new FetchCache({ baseurl }))
    : () => clientInstance;
```
Keep in mind that React.cache returns a function. The reason for also exporting the client instance as a function is just to maintain consistent calling syntax in both environments. A workaround to prevent the client function from creating a new instance each time is to initialize it in a separate variable. Unfortunately, simply memoizing both with React.cache won't work on the client, since it would reinstantiate on page changes.
Of course, if you're not using Next 13 or React, or want it only on the client, the above doesn't apply. If you have a custom server and want a fresh cache with each request, either add the instance to your request context or reset the cache in an early middleware with api.invalidate('*').
Retrieving data is similar to how Next recommends using fetch. If you need the same data in multiple places, rather than prop drilling, just await the promise in each place you need it. Thanks to the cache, it will only actually be called once.
As a convenience, fetch-cache handles the double promise and error states. By default it will assume and attempt to return JSON, but you can specify whatever format you need (text, blob, etc.). If the response contains a non-2xx status code, a FetchCacheError is thrown with the original Response and status. A general failure, like a network issue, throws a standard Error.
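To make the behavior described above concrete, here is a minimal sketch of the "double promise plus error" pattern the library handles for you. The `FetchCacheError` class below is a stand-in with the shape the docs describe (status plus the original response); the real export and the `handleResponse` helper name are assumptions for illustration, not the library's internals.

```typescript
// Stand-in for the library's error type: carries status + original response
class FetchCacheError extends Error {
  constructor(
    message: string,
    public status: number,
    public response: unknown,
  ) {
    super(message);
    this.name = "FetchCacheError";
  }
}

// Collapses fetch's usual "double promise" (await response, then await body)
// and throws on non-2xx status, mirroring what fetch-cache does internally.
async function handleResponse<T>(res: {
  ok: boolean;
  status: number;
  json: () => Promise<T>;
}): Promise<T> {
  if (!res.ok) {
    throw new FetchCacheError(`Request failed: ${res.status}`, res.status, res);
  }
  return res.json();
}
```

In practice this means a single `await api().get(...)` either resolves with parsed data or throws, so you can handle both HTTP errors and network errors in one try/catch.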
Don't forget: if you've exported a function that returns the instance, you have to call it first, either in advance or inline with each use.
```tsx
import { api } from "./services";

// React Server Component example. On the client (or server without RSCs),
// call inside a use hook (once stable) or useEffect.
async function Page() {
  // Notice the api().get syntax. This is because api is exported as a function
  const products = await api<Products[]>().get("/products");

  return (
    <>
      {products.map((product) => (
        <h3 key={product.title}>{product.title}</h3>
        // ...
      ))}
    </>
  );
}
```
```tsx
// pass fetch options as normal, including Next's caching params
async function Page() {
  const products = await api<Products[]>()
    .get("/products", { cache: "force-cache", next: { revalidate: 60 } });
  // ...
}
```
Keep in mind that opts passed to fetch methods will be shallowly merged with defaults. For example, if you pass any custom headers, be sure to also include Content-Type: application/json if you need it.
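The headers caveat above follows directly from how shallow merging works: top-level keys in your opts replace the defaults wholesale, so a custom `headers` object drops the default Content-Type rather than merging with it. A minimal sketch (illustrative, not the library's actual implementation; the default values shown are assumptions):

```typescript
// Assumed defaults for illustration only
const defaults = {
  headers: { "Content-Type": "application/json" },
  cache: "no-store",
};

function mergeOpts(
  opts: Record<string, unknown> = {},
): Record<string, unknown> {
  // Shallow merge: top-level keys in `opts` win; nested objects
  // like `headers` are replaced entirely, not deep-merged
  return { ...defaults, ...opts };
}
```

So passing `{ headers: { Authorization: "Bearer …" } }` would leave the request with no Content-Type header unless you re-add it yourself.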
new FetchCache(options)

Creates a new instance of fetch-cache. Note that by default, defaultCacheTime is set to 0. This only applies to the client (the server is always cached per request), so if you want a cache on the client, you need to either specify a default here and/or override it in individual fetches.
```typescript
type options = {
  baseurl: string;
  debug?: boolean | "verbose"; // default: false
  defaultCacheTime?: number; // default: 0 (seconds)
  maxCacheSize?: number; // default: 200 (-1 for no limit)
};
```
api.get(options)

```typescript
type ResponseTypes = "json" | "text" | "blob" | "arrayBuffer" | "formData"; // default is "json"

// NextExtendedFetchConfig is the intersection of standard fetch request options with Next's
type FetchOpts = NextExtendedFetchConfig & { responseType?: ResponseTypes };

type GetHandlerArgs =
  | string // pathname
  | {
      pathName: string;
      opts?: FetchOpts;
      cacheTime?: number; // seconds
    };
```
If sent an opts object, it will be shallow-merged over some common defaults and passed to fetch.
By default, fetch will call and return response.json() after the initial response. Set the responseType property if you expect another data type.
api.post(options) api.put(options) api.patch(options) api.delete(options) api.head(options) api.options(options)

```typescript
type OtherMethodArgs = [
  pathName: string,
  opts?: FetchOpts // same as get request (see above)
];
```
Same as GET, except the other methods use a second argument for the options (and no cacheTime). Expects a JSON response by default, but this can be changed (see api.get).
api.memo(options)

```typescript
type options = [key: string, fn: <T>() => Promise<T>, cacheTime?: number]; // cacheTime defaults to `defaultCacheTime` or 0
```
Accepts and optionally caches any type of promise. You can, for example, use it to memoize a database request, or anything else that returns a promise. The behavior is otherwise the same as api.get(). Both methods always cache on the server, so as with fetch, you can await the same memoed promise in different parts of the app and the calls will be deduped as long as they share the same key name. On the client (also like fetch), memoed promises are only actually cached if cacheTime is set. You can set it as a default when you initialize fetch-cache, and/or provide it as a third argument in api.memo(). The latter overrides any default.
```tsx
async function Page() {
  // Notice the api().memo syntax. This is because api is exported as a function.
  // memo requires a string key as its first argument (see signature above);
  // "daily-message" is arbitrary. `client` is assumed to be a configured DB client.
  const result = await api<{ rows: { message: string }[] }>()
    .memo("daily-message", () =>
      client.query("SELECT $1::text as message", ["Hello world!"]));

  return (
    <div>The message for today is: {result.rows[0].message}</div>
  );
}
```
Note that the cache is shared between api.memo() and api.get(). A name collision is unlikely, since the key names created by api.get() are URL paths, but it's good to keep in mind.
api.invalidate(key)

Purges the specified key (pathname) from the cache. Clears the entire cache when passed an asterisk (*).

```typescript
type key = string;
```
api.getCacheStats()

Returns an object with two properties: values (a Map of the current cache) and size (the cache size).
api.logCache()

Logs cache stats to stdout for debugging. Output is more verbose if the API was instantiated with debug: 'verbose'.