# bottleneck
[![Downloads][npm-downloads]][npm-url] [![version][npm-version]][npm-url] [![License][npm-license]][license-url]
Bottleneck is a lightweight and zero-dependency Task Scheduler and Rate Limiter for Node.js and the browser.

Bottleneck is an easy solution, as it adds very little complexity to your code. It is battle-hardened, reliable, production-ready, and used at scale in private companies and open-source software.

It supports Clustering: it can rate limit jobs across multiple Node.js instances. It uses Redis and strictly atomic operations to stay reliable in the presence of unreliable clients and networks. It also supports Redis Cluster and Redis Sentinel.
<!-- toc -->

- Install
- Quick Start
- Constructor
- Reservoir Intervals
- `submit()`
- `schedule()`
- `wrap()`
- Job Options
- Jobs Lifecycle
- Events
- Retries
- `updateSettings()`
- `incrementReservoir()`
- `currentReservoir()`
- `stop()`
- `chain()`
- Group
- Batching
- Clustering
- Debugging Your Application
- Upgrading To v2
- Contributing
## Install

```sh
npm install --save bottleneck
```

```js
import Bottleneck from "bottleneck";

// Note: To support older browsers and Node <6.0, you must import the ES5 bundle instead.
var Bottleneck = require("bottleneck/es5");
```
## Quick Start

### Step 1 of 3

Most APIs have a rate limit. For example, to execute 3 requests per second:

```js
const limiter = new Bottleneck({
  minTime: 333
});
```
If there's a chance some requests might take longer than 333 ms and you want to prevent more than 1 request from running at a time, add `maxConcurrent: 1`:

```js
const limiter = new Bottleneck({
  maxConcurrent: 1,
  minTime: 333
});
```
`minTime` and `maxConcurrent` are enough for the majority of use cases. They work well together to ensure a smooth rate of requests. If your use case requires executing requests in bursts, or every time a quota resets, look into Reservoir Intervals.
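Conceptually, `minTime` enforces a minimum gap between job launches. Here is a minimal, self-contained sketch of that idea; this is NOT Bottleneck's implementation, only an illustration of the spacing behavior:

```js
// Sketch only: each job launches at least `minTime` ms after the previous launch.
function makeSpacedScheduler(minTime) {
  let nextSlot = 0; // earliest timestamp the next job may launch at
  return function schedule(fn) {
    const now = Date.now();
    const launchAt = Math.max(now, nextSlot);
    nextSlot = launchAt + minTime;
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        try { resolve(fn()); } catch (e) { reject(e); }
      }, launchAt - now);
    });
  };
}

// Three jobs scheduled at once launch roughly 100 ms apart.
const schedule = makeSpacedScheduler(100);
const launches = [];
Promise.all([1, 2, 3].map(() => schedule(() => launches.push(Date.now()))))
  .then(() => console.log(launches[1] - launches[0], launches[2] - launches[1]));
```

Bottleneck does much more than this (queueing, priorities, clustering), but the spacing guarantee is the core of `minTime`.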
### Step 2 of 3

➤ Using promises?
Instead of this:

```js
myFunction(arg1, arg2)
  .then((result) => {
    /* handle result */
  });
```

Do this:

```js
limiter.schedule(() => myFunction(arg1, arg2))
  .then((result) => {
    /* handle result */
  });
```

Or this:

```js
const wrapped = limiter.wrap(myFunction);
wrapped(arg1, arg2)
  .then((result) => {
    /* handle result */
  });
```
➤ Using async/await?
Instead of this:

```js
const result = await myFunction(arg1, arg2);
```

Do this:

```js
const result = await limiter.schedule(() => myFunction(arg1, arg2));
```

Or this:

```js
const wrapped = limiter.wrap(myFunction);
const result = await wrapped(arg1, arg2);
```
➤ Using callbacks?
Instead of this:

```js
someAsyncCall(arg1, arg2, callback);
```

Do this:

```js
limiter.submit(someAsyncCall, arg1, arg2, callback);
```
### Step 3 of 3

Remember...
Bottleneck builds a queue of jobs and executes them as soon as possible. By default, the jobs will be executed in the order they were received.
Read the 'Gotchas' and you're good to go. Or keep reading to learn about all the fine tuning and advanced options available. If your rate limits need to be enforced across a cluster of computers, read the Clustering docs.
Need help debugging your application?

Instead of throttling, maybe you want to batch up requests into fewer calls?
## Gotchas & Common Mistakes
- Make sure the function you pass to `schedule()` or `wrap()` only returns once all the work it does has completed.

Instead of this:

```js
limiter.schedule(() => {
  tasksArray.forEach(x => processTask(x));
  // BAD, we return before our processTask() functions are finished processing!
});
```

Do this:

```js
limiter.schedule(() => {
  const allTasks = tasksArray.map(x => processTask(x));
  // GOOD, we wait until all tasks are done.
  return Promise.all(allTasks);
});
```

- If you're passing an object's method as a job, you'll probably need to `bind()` the object:

```js
// instead of this:
limiter.schedule(object.doSomething);
// do this:
limiter.schedule(object.doSomething.bind(object));
// or, wrap it in an arrow function instead:
limiter.schedule(() => object.doSomething());
```
- Bottleneck requires Node 6+ to function. However, an ES5 build is included: `var Bottleneck = require("bottleneck/es5");`.

- Make sure you're catching `"error"` events emitted by your limiters!

- Consider setting a `maxConcurrent` value instead of leaving it `null`. This can help your application's performance, especially if you think the limiter's queue might become very long.

- If you plan on using `priorities`, make sure to set a `maxConcurrent` value.

- When using `submit()`, if a callback isn't necessary, you must pass `null` or an empty function instead. It will not work otherwise.

- When using `submit()`, make sure all the jobs will eventually complete by calling their callback, or set an `expiration`. Even if you submitted your job with a `null` callback, it still needs to call its callback. This is particularly important if you are using a `maxConcurrent` value that isn't `null` (unlimited); otherwise, uncompleted jobs will clog up the limiter and no new jobs will be allowed to run. It's safe to call the callback more than once; subsequent calls are ignored.

- Using tools like `mockdate` in your tests to change time in JavaScript will likely result in undefined behavior from Bottleneck.
## Docs

### Constructor

```js
const limiter = new Bottleneck({/* options */});
```
Basic options:
| Option | Default | Description |
|--------|---------|-------------|
| `maxConcurrent` | `null` (unlimited) | How many jobs can be executing at the same time. Consider setting a value instead of leaving it `null`; it can help your application's performance, especially if you think the limiter's queue might get very long. |
| `minTime` | `0` ms | How long to wait after launching a job before launching another one. |
| `highWater` | `null` (unlimited) | How long can the queue be? When the queue length exceeds that value, the selected `strategy` is executed to shed the load. |
| `strategy` | `Bottleneck.strategy.LEAK` | Which strategy to use when the queue gets longer than the high water mark. Read about strategies. Strategies are never executed if `highWater` is `null`. |
| `penalty` | `15 * minTime`, or `5000` when `minTime` is `0` | The `penalty` value used by the `BLOCK` strategy. |
| `reservoir` | `null` (unlimited) | How many jobs can be executed before the limiter stops executing jobs. If `reservoir` reaches `0`, no jobs will be executed until it is no longer `0`. New jobs will still be queued up. |
| `reservoirRefreshInterval` | `null` (disabled) | Every `reservoirRefreshInterval` milliseconds, the `reservoir` value will be automatically updated to the value of `reservoirRefreshAmount`. The `reservoirRefreshInterval` value should be a multiple of 250 (5000 for Clustering). |
| `reservoirRefreshAmount` | `null` (disabled) | The value to set `reservoir` to when `reservoirRefreshInterval` is in use. |
| `reservoirIncreaseInterval` | `null` (disabled) | Every `reservoirIncreaseInterval` milliseconds, the `reservoir` value will be automatically incremented by `reservoirIncreaseAmount`. The `reservoirIncreaseInterval` value should be a multiple of 250 (5000 for Clustering). |
| `reservoirIncreaseAmount` | `null` (disabled) | The increment applied to `reservoir` when `reservoirIncreaseInterval` is in use. |
| `reservoirIncreaseMaximum` | `null` (disabled) | The maximum value that `reservoir` can reach when `reservoirIncreaseInterval` is in use. |
| `Promise` | `Promise` (built-in) | This lets you override the Promise library used by Bottleneck. |
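Several of these options work together. The following fragment combines concurrency, spacing, and load shedding in one limiter; the values here are arbitrary and should be tuned to your own rate limits:

```js
// Illustrative values only: tune them for your own use case.
const limiter = new Bottleneck({
  maxConcurrent: 5,                   // at most 5 jobs running at once
  minTime: 200,                       // at least 200 ms between job launches
  highWater: 1000,                    // queue length that triggers the strategy
  strategy: Bottleneck.strategy.LEAK  // shed load once the queue is too long
});
```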
### Reservoir Intervals
Reservoir Intervals let you execute requests in bursts, by automatically controlling the limiter's reservoir value. The reservoir is simply the number of jobs the limiter is allowed to execute. Once the value reaches 0, it stops starting new jobs.
There are 2 types of Reservoir Intervals: Refresh Intervals and Increase Intervals.
#### Refresh Interval
In this example, we throttle to 100 requests every 60 seconds:
```js
const limiter = new Bottleneck({
  reservoir: 100, // initial value
  reservoirRefreshAmount: 100,
  reservoirRefreshInterval: 60 * 1000, // must be divisible by 250

  // also use maxConcurrent and/or minTime for safety
  maxConcurrent: 1,
  minTime: 333 // pick a value that makes sense for your use case
});
```
`reservoir` is a counter decremented every time a job is launched; we set its initial value to 100. Then, every `reservoirRefreshInterval` (60,000 ms), `reservoir` is automatically reset to `reservoirRefreshAmount` (100).
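The refresh mechanism itself is simple to picture. A self-contained sketch of the idea (this is NOT Bottleneck's implementation; the real limiter also queues jobs rather than rejecting them):

```js
// Sketch only: a counter reset to `amount` every `interval` milliseconds.
function makeRefreshingReservoir(amount, interval) {
  let value = amount;
  const timer = setInterval(() => { value = amount; }, interval);
  return {
    tryTake() {        // returns true if a job may launch, and consumes 1
      if (value > 0) { value -= 1; return true; }
      return false;    // reservoir empty: must wait for the next refresh
    },
    current() { return value; },
    stop() { clearInterval(timer); },
  };
}

const reservoir = makeRefreshingReservoir(2, 1000);
// Two launches succeed; the third must wait for the next refresh.
console.log(reservoir.tryTake(), reservoir.tryTake(), reservoir.tryTake());
reservoir.stop();
```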
#### Increase Interval
In this example, we throttle jobs to meet the Shopify API Rate Limits. Users are allowed to send 40 requests initially, then every second grants 2 more requests up to a maximum of 40.
```js
const limiter = new Bottleneck({
  reservoir: 40, // initial value
  reservoirIncreaseAmount: 2,
  reservoirIncreaseInterval: 1000, // must be divisible by 250
  reservoirIncreaseMaximum: 40,

  // also use maxConcurrent and/or minTime for safety
  maxConcurrent: 5,
  minTime: 250 // pick a value that makes sense for your use case
});
```
#### Warnings
Reservoir Intervals are an advanced feature, please take the
