
fetch-resilient
Fetch Resilient is a lightweight and powerful TypeScript library for making HTTP requests with advanced features such as retries, caching, throttling, and debouncing. It's designed to enhance the reliability and performance of your web applications while keeping things simple.
Install the package with npm:
npm install fetch-resilient
Here's a simple example of how to use Fetch Resilient:
import { httpClient } from 'fetch-resilient';

async function fetchData() {
  try {
    const data = await httpClient.fetch<{ message: string }>('https://api.example.com/data', {
      method: 'GET',
    });
    console.log(data.message);
  } catch (error) {
    console.error('Error fetching data:', error);
  }
}

fetchData();
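As the example implies, fetch resolves directly to the parsed response body rather than to a Response object. The comparison with the native Fetch API below is an illustrative sketch, not taken from the library's documentation.

import { httpClient } from 'fetch-resilient';

async function compareWithNativeFetch() {
  // Native fetch: two steps, and no built-in retries or caching.
  const response = await fetch('https://api.example.com/data');
  const raw = await response.json();

  // fetch-resilient (as in the example above): the resolved value is already
  // the parsed body, typed by the generic parameter.
  const data = await httpClient.fetch<{ message: string }>('https://api.example.com/data');

  console.log(raw, data.message);
}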
You can set up global configuration for all requests using the updateConfig method:
import { httpClient } from 'fetch-resilient';

httpClient.updateConfig({
  maxRetries: 3,
  initialBackoff: 1000,
  maxBackoff: 10000,
  backoffFactor: 2,
  retryOnErrors: [500, 502, 503, 504],
  withCache: true,
  cacheTTL: 60000, // Cache for 1 minute
  onRetry: (attempt, url, options) => {
    console.log(`Retrying request (attempt ${attempt}): ${url}`);
  },
  onHttpResponse: (response) => {
    console.log(`Response status: ${response.status}`);
  },
  onSuccess: (data, response) => {
    console.log('Request successful');
    return data;
  },
  onError: (error, attempt) => {
    console.error(`Error on attempt ${attempt}:`, error);
  },
});
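For context, exponential backoff settings such as initialBackoff, backoffFactor, and maxBackoff are typically combined so that each retry waits longer than the last, capped at the maximum. The helper below only illustrates that arithmetic; it is not part of fetch-resilient's API.

// Illustrative only: how exponential backoff parameters are commonly combined.
// This helper is not exported by fetch-resilient.
function backoffDelay(attempt: number, initialBackoff: number, backoffFactor: number, maxBackoff: number): number {
  // attempt is 1-based: the first retry waits initialBackoff, later retries grow geometrically
  return Math.min(initialBackoff * Math.pow(backoffFactor, attempt - 1), maxBackoff);
}

// With the configuration above (initialBackoff: 1000, backoffFactor: 2, maxBackoff: 10000),
// three retries would wait roughly 1000 ms, 2000 ms, and 4000 ms.
for (let attempt = 1; attempt <= 3; attempt++) {
  console.log(`retry ${attempt}: ~${backoffDelay(attempt, 1000, 2, 10000)} ms`);
}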
If you prefer more control, you can use the ResilientHttpClient class directly:
import { ResilientHttpClient } from 'fetch-resilient';

const client = ResilientHttpClient.getInstance({
  maxRetries: 3,
  initialBackoff: 500,
  maxBackoff: 10000,
  backoffFactor: 2,
  retryOnErrors: [404, 500],
  withCache: true,
  cacheTTL: 60000, // Cache for 1 minute
});

async function fetchData() {
  try {
    const data = await client.fetch<{ message: string }>('https://api.example.com/data', {
      method: 'GET',
    });
    console.log(data.message);
  } catch (error) {
    console.error('Error fetching data:', error);
  }
}

fetchData();
Fetch Resilient offers a wide range of configuration options:
const client = ResilientHttpClient.getInstance({
  maxRetries: 3,
  initialBackoff: 500,
  maxBackoff: 10000,
  backoffFactor: 2,
  retryOnErrors: [404, 500],
  isTextResponse: false,
  isJsonResponse: false,
  responseType: 'auto',
  withCache: true,
  cacheTTL: 60000, // Cache for 1 minute
  throttleTime: 1000,
  debounceTime: 0,
  onRetry: (attempt, url, options) => {
    console.log(`Retrying request (attempt ${attempt}): ${url}`);
  },
  onHttpResponse: (response) => {
    console.log(`Response status: ${response.status}`);
  },
  onSuccess: (data, response) => {
    console.log('Request successful');
    return data;
  },
  onError: (error, attempt) => {
    console.error(`Error on attempt ${attempt}:`, error);
  },
});
To enable caching, use the withCache option:
const data = await client.fetch<UserData>('https://api.example.com/user/1',
  { method: 'GET' },
  { withCache: true, cacheTTL: 60000 } // Cache for 1 minute
);
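Assuming the cache is keyed by the request and honors cacheTTL, an identical call made within the TTL window would be expected to be served from the cache instead of hitting the network again. A hedged usage sketch (UserData stands in for your own response interface):

interface UserData {
  id: number;
  name: string;
}

// First call goes to the network and stores the result.
const fresh = await client.fetch<UserData>('https://api.example.com/user/1',
  { method: 'GET' },
  { withCache: true, cacheTTL: 60000 }
);

// An identical call within the 60-second TTL is expected to resolve from the cache.
const cached = await client.fetch<UserData>('https://api.example.com/user/1',
  { method: 'GET' },
  { withCache: true, cacheTTL: 60000 }
);

console.log(fresh.name === cached.name);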
To throttle requests, use the throttleTime option:
const client = ResilientHttpClient.getInstance({
  throttleTime: 5000, // Allow one request every 5 seconds
});

// This will execute immediately
client.fetch('https://api.example.com/data1');

// This will be delayed by 5 seconds
client.fetch('https://api.example.com/data2');
To debounce requests, use the debounceTime option:
const client = ResilientHttpClient.getInstance({
  debounceTime: 1000, // Wait for 1 second of inactivity before sending the request
});

// Only the last call within the debounce time will be executed
client.fetch('https://api.example.com/search?q=test1');
client.fetch('https://api.example.com/search?q=test2');
client.fetch('https://api.example.com/search?q=test3');
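A typical use for debouncing is search-as-you-type. The handler below is an illustrative sketch; the endpoint, result shape, and input wiring are assumptions, not part of the library.

import { ResilientHttpClient } from 'fetch-resilient';

// Illustrative: with debounceTime set, only the query typed after 1 second
// of inactivity is expected to reach the network.
const searchClient = ResilientHttpClient.getInstance({ debounceTime: 1000 });

async function onSearchInput(query: string): Promise<void> {
  const results = await searchClient.fetch<{ items: string[] }>(
    `https://api.example.com/search?q=${encodeURIComponent(query)}`
  );
  console.log(results.items);
}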
Fetch Resilient provides flexible error handling:
const client = ResilientHttpClient.getInstance({
  onError: (error, attempt) => {
    if (attempt === 3) {
      // Custom logic for final retry attempt
      console.error('Final retry attempt failed:', error);
    }
    // You can modify the error or perform additional actions
    return new Error(`Custom error: ${error.message}`);
  },
});
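Since onError can return a transformed error, the error that ultimately reaches your catch block after all retries are exhausted would presumably be the transformed one. A hedged usage sketch:

// Illustrative: the caller still wraps fetch in try/catch. After retries are
// exhausted, the error surfaced here is expected to be the one returned by onError.
try {
  await client.fetch('https://api.example.com/flaky-endpoint');
} catch (error) {
  console.error('Request ultimately failed:', error); // e.g. "Custom error: ..."
}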
Fetch Resilient is written in TypeScript and provides full type support:
interface User {
  id: number;
  name: string;
  email: string;
}

const user = await client.fetch<User>('https://api.example.com/user/1');
console.log(user.name); // TypeScript knows the shape of 'user'
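The generic parameter also composes with array and utility types. The sketch below is illustrative; passing headers and a body through the second argument is an assumption based on the standard fetch options shown in the earlier examples.

// Illustrative: typed list and partial-update responses.
const users = await client.fetch<User[]>('https://api.example.com/users');
users.forEach((u) => console.log(u.email));

const updated = await client.fetch<Partial<User>>('https://api.example.com/user/1', {
  method: 'PATCH',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'New Name' }),
});
console.log(updated.name);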
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License.