
@equinor/fusion-query
Reactive data fetching and caching library with observable streams and comprehensive event system
The primary use case for Query is reactive data fetching: retrieving data from a server while caching the results so repeated requests can be served without refetching.

Using a Query mechanism offers numerous benefits, including:

Automatic refetching of data on certain triggers (e.g., window focus), ensuring the UI is always up-to-date with the latest server state without manual intervention (a sketch of wiring this up appears next to the invalidate examples further down).
Highly customizable behavior, allowing developers to tailor it to specific needs, such as custom caching strategies, query deduplication, and more.

When setting up a Query, you can typically configure it with several options to tailor its behavior to your application's specific needs. While the exact options available can vary depending on the implementation, common configuration parameters include:
fn (Function): The function responsible for fetching your data. It can use fetch, Axios, or any other HTTP client.
Retry Strategy: How failed requests are retried, for example the number of attempts and the delay between them.
Caching Strategy:
  expire: Time in milliseconds after which a cached item is considered stale.
Concurrent Request Handling (Queuing Strategy):
  switch: Cancels any ongoing request when a new request comes in.
  merge: Allows multiple requests to run in parallel.
  concat: Queues requests and executes them sequentially.
Event System: An observable stream (event$) that emits lifecycle events for monitoring query execution, caching behavior, and debugging. Events include query creation, cache hits/misses, job execution stages, and completion states.
Request Transformation:
Response Transformation:
Error Handling: Errors thrown by fn are propagated to subscribers' error callbacks (see the examples below).
signal (AbortSignal): Attach an AbortSignal to requests, allowing you to cancel them programmatically if needed (see the cancellation sketch after the configuration example below).
Extended Configuration:
import { Query } from '@equinor/fusion-query';

const query = new Query({
  client: {
    fn: async (args) => {
      // GET requests carry their parameters in the URL, not in a request body
      const response = await fetch(`https://your.api/${args.endpoint}?${new URLSearchParams(args.params)}`, {
        method: 'GET',
      });
      if (!response.ok) {
        throw new Error('Network response was not ok');
      }
      return response.json();
    },
  },
  key: (args) => JSON.stringify(args),
  expire: 60000, // Cache data expires after 60 seconds
  retry: { attempts: 3, delay: 1000 }, // Retry up to 3 times, 1 second apart
  // Additional config options as necessary...
});
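The signal (AbortSignal) option listed above concerns cancelling requests that are already in flight. Where exactly the library accepts the signal depends on its type definitions, so the sketch below takes a deliberately self-contained approach: the caller supplies its own AbortSignal as part of the query arguments, and fn forwards it to fetch. The Args shape, the endpoint field, and the abortableQuery name are assumptions made for this illustration, not part of the package API.

import { Query } from '@equinor/fusion-query';

// Hypothetical args shape: the caller provides its own AbortSignal alongside the endpoint
type Args = { endpoint: string; signal?: AbortSignal };

const abortableQuery = new Query<any, Args>({
  client: {
    // Forward the caller's signal to fetch so aborting the controller cancels the HTTP request
    fn: ({ endpoint, signal }) => fetch(`https://your.api/${endpoint}`, { signal }).then((res) => res.json()),
  },
  // Key only on the endpoint; the signal is not part of the cache identity
  key: ({ endpoint }) => endpoint,
});

const controller = new AbortController();
abortableQuery
  .queryAsync({ endpoint: 'users', signal: controller.signal })
  .then((result) => console.log(result))
  .catch((error) => console.error('Request failed or was aborted:', error));

// Later, cancel the in-flight request programmatically
controller.abort();

Returning to the configured query from the example above, basic usage as an observable looks like this: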
const args = {
/* your query arguments here */
};
query.query(args).subscribe({
next: (result) => console.log(result),
error: (error) => console.error(error),
complete: () => console.log('Query completed'),
});
const args = {
/* your query arguments here */
};
query
.queryAsync(args)
.then((result) => console.log(result))
.catch((error) => console.error(error));
const args = {
/* identify cache entry */
};
const changes = (prevState) => ({ ...prevState /* new state changes */ });
query.mutate(args, changes);
If you're confident about the new state of the data after the mutation, you can apply an optimistic update:
// Assume a function to update a post returns the updated post data
async function updatePost(postId, newData) {
const updatedPost = await apiUpdatePost(postId, newData); // perform API request to update the post
return updatedPost;
}
// Optimistically updating the cache with the new post data
query.mutate(['post', postId], async (oldData) => {
const updatedData = await updatePost(postId, newData);
return { ...oldData, ...updatedData, updated: Date.now() };
});
In this case, by setting the updated attribute to the current timestamp, the Query cache treats the entry as fresh, preventing unnecessary refetches.
If the final state of the data after the mutation is uncertain or if it's preferable for the application to fetch fresh data from the server, you can choose to invalidate the cache item:
// Invalidate the cache item for the post, forcing a refetch next time
query.mutate(['post', postId], (oldData) => {
// Perform the mutation without directly updating the cache data
updatePost(postId, newData);
// Return null or undefined, or simply omit the updated attribute
// This marks the cache item as stale
return { ...oldData, updated: undefined }; // Omitting or setting undefined explicitly
});
Invalidate a specific cache entry or all:
query.invalidate(args); // Invalidates specific entry
query.invalidate(); // Invalidates all cache entries
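As mentioned in the benefits at the top, a common pattern is to refetch when the window regains focus. The invalidate call shown above is enough to build this yourself; here is a minimal sketch that reuses the query and args variables from the earlier snippets and simply marks the cache stale on focus:

import { fromEvent } from 'rxjs';

// When the window regains focus, invalidate the cached entry so the next
// query(args) call fetches fresh data instead of serving a stale cache hit.
const focusSubscription = fromEvent(window, 'focus').subscribe(() => {
  query.invalidate(args); // or query.invalidate() to drop every cached entry
});

// Stop listening when the view is torn down
// focusSubscription.unsubscribe();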
Remember to unsubscribe from observables or to complete the query to release resources:
const subscription = query.query(args).subscribe((result) => console.log(result));
// Later, when you're done:
subscription.unsubscribe();
// Or, to complete and clean up the query itself:
query.complete();
The Query package provides a comprehensive event system that aggregates events from three sources: the Query instance itself, the QueryClient (handling data fetching), and the QueryCache (managing cached data). All events are emitted through the event$ observable stream, allowing you to monitor and debug the entire query lifecycle in real-time.
Events emitted by the Query instance itself:
query_created: Fired when a new query is created
query_completed: Fired when a query completes successfully
query_connected: Fired when connecting to an existing task
query_queued: Fired when a query is queued for execution
query_aborted: Fired when a query is aborted
query_cache_hit: Fired when cached data is used
query_cache_miss: Fired when no valid cache is found
query_cache_added: Fired when data is added to cache
query_job_created: Fired when a new job is created
query_job_selected: Fired when a job is selected for execution
query_job_started: Fired when job execution begins
query_job_closed: Fired when a job is closed
query_job_completed: Fired when a job completes
query_job_skipped: Fired when a job is skipped

Events emitted by the QueryClient during data fetching operations:
query_client_job_requested: Fired when a job is requested with arguments and options
query_client_job_executing: Fired when query execution begins
query_client_job_completed: Fired when a query completes successfully with result payload
query_client_job_failed: Fired when query execution fails with an error
query_client_job_canceled: Fired when a query is canceled with a reason
query_client_job_error: Fired when a general error occurs during query processing

Events emitted by the QueryCache during cache operations:
query_cache_entry_set: Fired when a cache entry is set with a complete record
query_cache_entry_inserted: Fired when a cache entry is inserted with new data
query_cache_entry_removed: Fired when a cache entry is removed
query_cache_entry_invalidated: Fired when a cache entry is invalidated
query_cache_entry_mutated: Fired when a cache entry is mutated/updated
query_cache_trimmed: Fired when the cache is trimmed based on criteria
query_cache_reset: Fired when the cache is reset to its initial state

import { filter } from 'rxjs';
const query = new Query({
client: { fn: myFetchFunction },
key: (args) => JSON.stringify(args),
});
// Subscribe to all events
query.event$.subscribe({
next: (event) => {
console.log(`Event: ${event.type}`, event);
},
error: (error) => console.error('Event error:', error),
});
// Subscribe to specific event types
query.event$.pipe(
filter(event => event.type === 'query_cache_hit')
).subscribe(event => {
console.log('Cache hit!', event.data);
});
// Filter events by source using instanceof
import { QueryEvent } from '@equinor/fusion-query';
import { QueryClientEvent } from '@equinor/fusion-query/client';
import { QueryCacheEvent } from '@equinor/fusion-query/cache';
const cacheEvents$ = query.event$.pipe(
filter(event => event instanceof QueryCacheEvent)
);
const clientEvents$ = query.event$.pipe(
filter(event => event instanceof QueryClientEvent)
);
const queryEvents$ = query.event$.pipe(
filter(event => event instanceof QueryEvent)
);
Each event includes:
type: The event type identifier
key: The cache key for the query
data: Type-safe event-specific data (varies by event type)

The Query utility allows you to manage concurrent requests using different queue strategies: switch, merge, and concat. Here's how you can apply each strategy:
switch: Cancels the current active request when a new request comes in. Only the result from the latest request will be returned.
import { Query } from '@equinor/fusion-query';
import { debounceTime, fromEvent, map, switchMap } from 'rxjs';
// Mock function to simulate data fetching based on the search query
async function fetchSearchResults(searchQuery: string) {
const response = await fetch(`https://your.api/search?query=${encodeURIComponent(searchQuery)}`);
if (!response.ok) {
throw new Error('Network response was not ok');
}
return response.json();
}
// Define the query with the 'switch' queue strategy so a new search cancels the in-flight one
const searchQuery = new Query<any, { searchQuery: string }>({
client: {
fn: ({ searchQuery }) => fetchSearchResults(searchQuery),
},
key: ({ searchQuery }) => `search-${searchQuery}`,
queueOperator: 'switch',
});
Now, set up the search input to listen for changes and use the defined query to fetch data:
const searchInput = document.getElementById('search-input') as HTMLInputElement;
fromEvent(searchInput, 'input')
.pipe(
map((event) => (event.target as HTMLInputElement).value),
debounceTime(300), // Debounce typing to limit queries
switchMap((term) => (term ? searchQuery.query({ searchQuery: term }) : [])),
)
.subscribe({
next: (results) => {
console.log('Search Results:', results);
// Handle rendering the search results here
},
error: (error) => console.error('Error fetching search results:', error),
});
merge: Allows multiple requests to run in parallel without canceling each other. All responses will be returned as they arrive.
import { Query } from '@equinor/fusion-query';
import { combineLatest } from 'rxjs';

// Fetch data for an endpoint; GET requests carry their parameters in the URL, not in a request body
async function fetchData({ endpoint, queryParams }: { endpoint: string; queryParams: Record<string, string> }) {
  const response = await fetch(`https://your.api/${endpoint}?${new URLSearchParams(queryParams)}`, {
    method: 'GET',
  });
  if (!response.ok) {
    throw new Error('Network response was not ok');
  }
  return response.json();
}

// Defining the query with the `fn` function for client creation
const userProfileQuery = new Query<any, { endpoint: string; queryParams: Record<string, string> }>({
  client: {
    fn: fetchData,
  },
  key: (args) => `${args.endpoint}-${JSON.stringify(args.queryParams)}`,
  queueOperator: 'merge', // Using merge to handle parallel queries
});
const userId = 'exampleUserId';
// Initiating parallel requests
const userDetails$ = userProfileQuery.query({ endpoint: 'users', queryParams: { userId } });
const userPosts$ = userProfileQuery.query({ endpoint: 'posts', queryParams: { userId } });
const userComments$ = userProfileQuery.query({ endpoint: 'comments', queryParams: { userId } });
// Combining the observables from parallel requests
combineLatest([userDetails$, userPosts$, userComments$]).subscribe({
next: ([userDetails, userPosts, userComments]) => {
// Handle and display the combined data as needed
console.log('User Details:', userDetails);
console.log('User Posts:', userPosts);
console.log('User Comments:', userComments);
},
error: (error) => console.error('Error fetching data:', error),
complete: () => console.log('All parallel queries completed'),
});
concat: Queues requests and executes them one after another in a sequential manner. A new request will only start after the previous one has completed.
const query = new Query<YourDataType, YourArgsType>({
client,
key: (args) => JSON.stringify(args),
queueOperator: 'concat',
});
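To see what the concat strategy buys you, here is a minimal usage sketch reusing the query, client, YourDataType, and YourArgsType placeholders from the snippet above: the second request is queued and only starts once the first has completed.

// Placeholder arguments matching the args type of the query above
declare const argsA: YourArgsType;
declare const argsB: YourArgsType;

// With queueOperator: 'concat', these requests run strictly one after another
query.query(argsA).subscribe((result) => console.log('first result', result));
query.query(argsB).subscribe((result) => console.log('second result', result)); // starts only after the first completes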
Cache validation is crucial for determining whether cached data is still relevant or needs to be refreshed. You can customize cache validation using the validate option.
Automatically consider cache entries as stale after a specified duration.
const query = new Query<YourDataType, YourArgsType>({
client,
key: (args) => JSON.stringify(args),
expire: 60000, // 60 seconds
});
Implement a custom logic to validate cache entries based on your requirements.
const query = new Query<YourDataType, YourArgsType>({
client,
key: (args) => JSON.stringify(args),
validate: (entry, args) => {
// Your custom validation logic here.
// For example, return `false` if the entry is older than 30 minutes.
return Date.now() - entry.updated < 30 * 60 * 1000;
},
});
First, establish a shared QueryCache instance and define a common function to fetch data. This shared cache will be utilized by different query instances across the application.
import { Query, QueryCache } from '@equinor/fusion-query';
// Initialize a shared QueryCache across the application
const sharedQueryCache = new QueryCache();
// Function to fetch blog posts data from your API
async function fetchBlogPosts() {
const response = await fetch('https://example.com/api/blog-posts');
if (!response.ok) {
throw new Error('Failed to fetch blog posts');
}
return response.json();
}
Next, directly initialize separate Query instances in different parts of your application (e.g., for homepage posts and sidebar posts). These instances will share the same QueryCache but are created independently.
// Create a Query instance for the homepage using the shared cache
const homepagePostsQuery = new Query({
client: {
fn: fetchBlogPosts,
},
cache: sharedQueryCache, // Utilize the shared cache
key: () => 'allBlogPosts', // Unique key for this query
});
// Fetch and render posts on the homepage
homepagePostsQuery.query().subscribe({
next: (posts) => {
// Render posts on the homepage
console.log('Homepage posts:', posts);
},
error: (error) => console.error('Error fetching posts for homepage:', error),
});
// Create a Query instance for the sidebar using the same shared cache
const sidebarPostsQuery = new Query({
client: {
fn: fetchBlogPosts,
},
cache: sharedQueryCache,
key: () => 'allBlogPosts', // Same key, leveraging cache from homepage query
});
// Fetch and display posts in the sidebar widget
sidebarPostsQuery.query().subscribe({
next: (posts) => {
// Render posts in the sidebar
console.log('Sidebar posts:', posts);
},
error: (error) => console.error('Error fetching posts for sidebar:', error),
});
Since both homepagePostsQuery and sidebarPostsQuery instances use the shared QueryCache, fetching operations benefit from cache-first strategies, reducing unnecessary network requests.
If the "allBlogPosts" data is fetched first by the homepage and then requested by the sidebar, the sidebar will immediately access the cached data without needing to fetch from the network again.