# Effect HTTP Requests Rate Limiter

## Description

Intelligent HTTP request rate limiter for Effect with dynamic gate control, quota monitoring, and smart delay optimization.

Features:

- 🚪 Smart Gate Control: Auto-manages request flow based on rate limit headers and 429 responses
- ⚡ Optimized Delays: Minimizes cascading waits for concurrent requests
- 📊 Quota Monitoring: Proactive handling using relevant headers
- 🔄 Flexible Retries: Configurable retry policies with exponential backoff
- 🎛️ Effect Integration: Works with any Effect RateLimiter
- 🚦 Concurrency Control: Semaphore-based request limiting
## Installation

```sh
pnpm i effect-requests-rate-limiter
```

## Usage

```ts
import { DevTools } from "@effect/experimental"
import { HttpClientRequest } from "@effect/platform"
import { NodeHttpClient, NodeRuntime } from "@effect/platform-node"
import { Duration, Effect, Layer, pipe, RateLimiter, Schedule, Schema as S } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"
const DurationFromSecondsString = S.transform(
  S.NumberFromString,
  S.DurationFromMillis,
  {
    decode: (s) => s * 1000,
    encode: (ms) => ms / 1000
  }
)

const NonNegativeFromString = S.compose(S.NumberFromString, S.NonNegative)

const RateLimitHeadersSchema = HttpRequestsRateLimiter.makeHeadersSchema(S.Struct({
  retryAfter: S.optional(DurationFromSecondsString).pipe(
    S.fromKey("retry-after")
  ),
  remainingRequestsQuota: S.optional(NonNegativeFromString).pipe(
    S.fromKey("x-ratelimit-remaining")
  ),
  resetAfter: S.optional(DurationFromSecondsString).pipe(
    S.fromKey("x-ratelimit-reset")
  )
}))

const myRetryPolicy = HttpRequestsRateLimiter.makeRetryPolicy(Effect.retry({
  schedule: Schedule.jittered(Schedule.exponential("200 millis")),
  while: (err) => err._tag === "ResponseError" && err.response.status === 429,
  times: 5
}))

const EffectRateLimiter = RateLimiter.make({
  limit: 5,
  algorithm: "fixed-window",
  interval: Duration.seconds(3)
})

const main = Effect.gen(function*() {
  const rateLimiter = yield* EffectRateLimiter
  const requestsRateLimiter = yield* HttpRequestsRateLimiter.make({
    rateLimiterHeadersSchema: RateLimitHeadersSchema,
    retryPolicy: myRetryPolicy,
    effectRateLimiter: rateLimiter,
    maxConcurrentRequests: 4
  })
  const req = HttpClientRequest.get("http://localhost:3000")
  const response = yield* requestsRateLimiter.limit(req)
}).pipe(Effect.scoped)

NodeRuntime.runMain(main.pipe(
  Effect.provide(Layer.merge(
    NodeHttpClient.layer,
    DevTools.layer()
  ))
))
```

## Configuration Options

The `HttpRequestsRateLimiter.make` function accepts the following configuration:

```ts
interface Config {
  rateLimiterHeadersSchema?: HeadersSchema
  retryPolicy?: RetryPolicy
  effectRateLimiter?: RateLimiter.RateLimiter
  maxConcurrentRequests?: number
}
```
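
All of these fields are optional. As a minimal sketch, assuming you only need a simple concurrency cap and none of the header-driven behavior:

```ts
import { Effect } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"

// Minimal configuration sketch: only cap the number of in-flight requests.
const program = Effect.gen(function*() {
  const requestsRateLimiter = yield* HttpRequestsRateLimiter.make({
    maxConcurrentRequests: 10
  })
  // ... use requestsRateLimiter.limit(request) as in the Usage section above
})
```
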
## Helper Functions

- `makeHeadersSchema(schema)`: Type-safe wrapper for creating header schemas
- `makeRetryPolicy(policy)`: Type-safe wrapper for creating retry policies (see the sketch below)
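
Both wrappers are demonstrated in the Usage section above. As a smaller, hedged sketch, a coarser policy that retries any failed request a fixed number of times without inspecting the status code might look like this:

```ts
import { Effect } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"

// Sketch: retry any failed request up to 3 times, regardless of the error.
const simpleRetryPolicy = HttpRequestsRateLimiter.makeRetryPolicy(
  Effect.retry({ times: 3 })
)
```
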
## Rate Limiting Options

- `effectRateLimiter`: an Effect `RateLimiter`, with a choice of algorithm (`fixed-window` or `token-bucket`)
- `maxConcurrentRequests`: a simple concurrent-request limit backed by a semaphore

Typically configure one or the other: use `effectRateLimiter` for time-based limits and `maxConcurrentRequests` for simple concurrency (a time-based sketch follows below).
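
A time-based sketch, assuming an illustrative budget of 10 requests per minute (the concurrency-only variant is the minimal sketch under Configuration Options above):

```ts
import { Duration, Effect, RateLimiter } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"

// Sketch: a token-bucket limiter allowing at most 10 requests per minute.
// RateLimiter.make is scoped, so run the resulting effect inside a Scope
// (e.g. wrap the consumer with Effect.scoped, as in the Usage section).
const makeTimeBasedLimiter = Effect.gen(function*() {
  const rateLimiter = yield* RateLimiter.make({
    limit: 10,
    interval: Duration.minutes(1),
    algorithm: "token-bucket"
  })
  return yield* HttpRequestsRateLimiter.make({ effectRateLimiter: rateLimiter })
})
```
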
The library uses a configurable schema to parse HTTP response headers into three standardized fields:

```ts
{
  readonly "retryAfter"?: Duration.Duration | undefined
  readonly "remainingRequestsQuota"?: number | undefined
  readonly "resetAfter"?: Duration.Duration | undefined
}
```

All fields are optional: without any recognized headers, only the retry policy, the Effect rate limiter, and the concurrency limit apply.

- `retryAfter`: wait time after 429 responses
- `remainingRequestsQuota` + `resetAfter`: proactive quota management; the gate closes when the remaining quota reaches 0
- The schema transforms API-specific headers (e.g. `x-ratelimit-remaining`, `retry-after`) into the standardized fields, as in the sketch below
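
As a hedged sketch, assume a hypothetical API that reports the same information under different header names and in millisecond units; only the header keys and value schemas change:

```ts
import { Schema as S } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"

// Durations sent as a stringified number of milliseconds.
const DurationFromMillisString = S.compose(S.NumberFromString, S.DurationFromMillis)

// Hypothetical header names ("retry-after-ms", "x-rate-limit-*") for illustration only.
const CustomHeadersSchema = HttpRequestsRateLimiter.makeHeadersSchema(S.Struct({
  retryAfter: S.optional(DurationFromMillisString).pipe(
    S.fromKey("retry-after-ms")
  ),
  remainingRequestsQuota: S.optional(S.compose(S.NumberFromString, S.NonNegative)).pipe(
    S.fromKey("x-rate-limit-remaining")
  ),
  resetAfter: S.optional(DurationFromMillisString).pipe(
    S.fromKey("x-rate-limit-reset-ms")
  )
}))
```
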
## How It Works

- Non-2xx Error Enforcement: all non-2xx HTTP responses are automatically treated as Effect errors via `HttpClient.filterStatusOk`
- Requests pass through the rate limiter gate
- Response headers are parsed for rate limit info
- The gate closes on 429 responses or quota exhaustion and reopens after the indicated delay
- Smart delays optimize concurrent request timing
- Failed requests are retried per the configured policy
## Important Notes

⚠️ Non-2xx Response Handling: this library requires non-2xx HTTP responses to be treated as Effect errors for proper retry and gate control. This is enforced internally using `HttpClient.filterStatusOk`, so 4xx/5xx responses automatically flow through Effect's error channel, as in the sketch below.
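
A minimal sketch of handling such an error, assuming the usual `HttpClient` layer is provided when the program is run:

```ts
import { HttpClientRequest } from "@effect/platform"
import { Effect } from "effect"
import * as HttpRequestsRateLimiter from "effect-requests-rate-limiter"

// Sketch: a 404 (or any non-2xx) surfaces as a ResponseError in the error channel.
const program = Effect.gen(function*() {
  const requestsRateLimiter = yield* HttpRequestsRateLimiter.make({
    maxConcurrentRequests: 2
  })
  const req = HttpClientRequest.get("http://localhost:3000/missing")

  yield* requestsRateLimiter.limit(req).pipe(
    Effect.catchTag("ResponseError", (err) =>
      Effect.logWarning(`request failed with status ${err.response.status}`)
    )
  )
})
```
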
## Peer Dependencies