
sema4


A semaphore implementation using promises. Forked from vercel/async-sema.


Features

  • Universal - Works in all modern browsers, Node.js, and Deno, and supports CLI usage.
  • Zero Dependencies - Absolutely no dependencies, keeping the package tiny (24kb).
  • Tested - Greater than 85% test coverage.
  • Typed - Out of the box TypeScript declarations.

Usage

Browsers

Load sema4 directly from esm.sh:

```html
<script type="module">
  import { Sema } from 'https://esm.sh/sema4';
</script>
```

Deno

Load sema4 directly from esm.sh:

```js
import { Sema } from 'https://esm.sh/sema4?dts';
```

Node 18+

Install with `npm install sema4` or `yarn add sema4`, then import:

```js
import { Sema } from 'sema4';
```

API

Sema

Constructor(maxConcurrency, { initFn, pauseFn, resumeFn, capacity })

| Name | Type | Optional | Default | Description |
| --- | --- | --- | --- | --- |
| maxConcurrency | Integer | No | N/A | The maximum number of callers allowed to acquire the semaphore concurrently |
| options.initFn | Function | Yes | `() => '1'` | The function used to initialize the tokens that manage the semaphore |
| options.pauseFn | Function | Yes* | N/A | The function called to opportunistically request pausing the incoming stream of data, instead of piling up waiting promises and possibly running out of memory |
| options.resumeFn | Function | Yes* | N/A | The function called when there is room again to accept new waiters on the semaphore. This function must be declared if a pauseFn is declared |
| options.capacity | Integer | Yes | 10 | The size of the pre-allocated waiting list inside the semaphore. Typically used by high-performance applications where the developer can roughly estimate the number of concurrent users of the semaphore |

async sema.drain()

Drains the semaphore and returns all the initialized tokens in an array. Draining is an ideal way to ensure there are no pending async tasks, for example before a process terminates.

sema.waiting()

Returns the number of callers waiting on the semaphore, i.e. the number of pending promises.

sema.tryAcquire()

Attempts to acquire a token from the semaphore if one is immediately available; otherwise returns undefined.

async sema.acquire()

Acquires a token from the semaphore, decrementing the number of available execution slots. If initFn is not used, the return value of the function can be discarded.

sema.release(token)

Releases the semaphore, incrementing the number of free execution slots. If initFn is used, the token returned by acquire() should be passed as the argument to this function.

createRateLimiter(rptu, { timeUnit, uniformDistribution })

Creates a rate limiter function that blocks with a promise whenever the rate limit is hit and resolves the promise once the call rate is within the limit.

| Name | Type | Optional | Default | Description |
| --- | --- | --- | --- | --- |
| rptu | Integer | No | N/A | The number of tasks allowed per timeUnit |
| options.timeUnit | Integer | Yes | 1000 | The width of the rate-limiting window in milliseconds |
| options.uniformDistribution | Boolean | Yes | false | Enforces a discrete uniform distribution of calls over time. This is mainly useful when the flow of rate-limited calls is continuous and faster than timeUnit (e.g. reading a file): without it, the maximum number of calls resolves immediately (exhausting the limit at once), and the next batch must then wait a full timeUnit. If the flow is sparse, this option may make the code run slower with no advantage |

Examples

```js
import { Sema } from 'sema4';

function foo(array) {
  const s = new Sema(
    4, // Allow 4 concurrent async calls
    {
      capacity: 100, // Preallocated space for 100 tokens
    },
  );

  async function fetchData(x) {
    await s.acquire();

    try {
      console.log(s.waiting() + ' calls to fetch are waiting');
      // Perform some async tasks with x here...
    } finally {
      s.release();
    }
  }

  return Promise.all(array.map(fetchData));
}
```

```js
import { createRateLimiter } from 'sema4';

async function bar(n) {
  const lim = createRateLimiter(5); // Limit to 5 tasks per default timeUnit

  for (let i = 0; i < n; i++) {
    await lim();
    // Perform some async tasks here...
  }
}
```

Contributors

In addition to the contributors of the parent repository vercel/async-sema, these lovely people have helped keep this library going.

Justin Dalrymple