Description
Prerequisites
- I have searched the existing issues to avoid duplicates
- I understand that this is just a suggestion and might not be implemented
Problem Statement
Our Express.js API performs async operations and outgoing requests (e.g., to external APIs or databases), but has no fine-grained control over execution rate or concurrency. During bursts this can overwhelm downstream services, trip external rate limits, or waste resources, causing failures or degraded performance.
Proposed Solution
Integrate the bottleneck library to wrap async handlers and jobs with configurable throttling. This enables setting minTime between executions, maxConcurrent limits, and bursting reservoirs for controlled workloads. Key steps:
- Install via `npm install bottleneck`.
- Configure a shared limiter (e.g., with Redis for clustering): `new Bottleneck({ minTime: 200, maxConcurrent: 5, reservoir: 10, datastore: 'redis' })`.
- Wrap routes/jobs: `app.get('/api/external', limiter.wrap(asyncHandler))`.
- Support priorities and queue strategies (e.g., `OVERFLOW_PRIORITY` for dropping low-priority jobs when the queue is full).
This complements existing rate limiting by focusing on outgoing/internal job control, ensuring balanced load distribution.
Alternatives Considered
- Manual throttling: custom promises/delays in handlers — simple, but error-prone and hard to scale across a cluster.
- express-rate-limit (already in use): good for incoming requests, but does not manage outgoing concurrency or queuing.
- overload-protection: excellent for incoming load shedding, but does not handle job scheduling or downstream protection.
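To illustrate why the manual approach above falls short, here is a minimal hand-rolled throttle (names are illustrative). It enforces a fixed gap per call in one process, but shares no state across instances and cannot cap concurrency or reorder by priority:

```javascript
// Naive manual throttling: a fixed delay before every call.
// No shared state across processes, no concurrency cap, no queue
// strategy — exactly the gaps bottleneck is meant to close.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function throttledCall(fn, ...args) {
  await sleep(200); // hard-coded 200 ms gap per call
  return fn(...args);
}
```

Every caller pays the delay even when the downstream service is idle, and two app instances will each run at full rate, defeating a shared external limit.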
Additional Context
Reference: bottleneck GitHub repo.
This enhances resilience for async-heavy endpoints. Tune limits based on external API docs and load tests (e.g., with Artillery). Could integrate with RedisStore from current rate-limiter for shared state.
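For shared state across clustered instances, a Redis-backed limiter configuration might look like the sketch below. The limiter `id`, host/port, and reservoir numbers are assumptions to be adapted to the deployment:

```javascript
const Bottleneck = require('bottleneck');

// Clustered limiter: instances constructed with the same id share
// counters through Redis, so the rate limit holds across processes.
const limiter = new Bottleneck({
  id: 'external-api',                   // shared-state key (assumed name)
  datastore: 'ioredis',                 // or 'redis' for the node-redis client
  clientOptions: { host: '127.0.0.1', port: 6379 }, // assumed deployment values
  minTime: 200,
  maxConcurrent: 5,
  reservoir: 10,                        // allow bursts of up to 10 jobs
  reservoirRefreshAmount: 10,           // then refill 10 tokens
  reservoirRefreshInterval: 60 * 1000,  // every 60 s
});
```

This is a configuration sketch only; it requires a reachable Redis instance and could reuse the same Redis connection details as the existing rate limiter's store.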
Priority
Critical