@teamawesome/tiny-batch

@teamawesome/tiny-batch - npm Package Compare versions

Comparing version 1.0.2 to 1.0.3

package.json
 {
   "name": "@teamawesome/tiny-batch",
-  "version": "1.0.2",
+  "version": "1.0.3",
   "description": "Combine individual calls to a single one.",

@@ -9,10 +9,11 @@ "keywords": [

     "combine",
-    "callback"
+    "callback",
+    "dataloader"
   ],
-  "source": "src/index.ts",
-  "main": "dist/tiny-batch.js",
-  "exports": "dist/tiny-batch.modern.js",
-  "module": "dist/tiny-batch.module.js",
-  "unpkg": "dist/tiny-batch.umd.js",
-  "types": "dist/index.d.ts",
+  "source": "./src/index.ts",
+  "main": "./dist/tiny-batch.js",
+  "exports": "./dist/tiny-batch.modern.js",
+  "module": "./dist/tiny-batch.module.js",
+  "unpkg": "./dist/tiny-batch.umd.js",
+  "types": "./dist/index.d.ts",
   "scripts": {


@@ -5,2 +5,6 @@ # Installation

 ```
+# tiny-batch
+tiny-batch is a utility to create functions whose execution is batched. This can be very useful, for instance, to limit the number of queries or HTTP requests while still having a single, easy-to-use function.
 # Usage

@@ -21,4 +25,4 @@ Call `tinybatch` to create an async function that adds to the batch. The first argument is a callback that will handle the batching.

// batchedArgs equals [[1], [2]]
const userIds = batchedArgs.flat();
return fetch(`api/${userIds}`)
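The library's internals aren't shown on this page, so as a rough, self-contained sketch of the idea (the `microBatch` name and its shape are hypothetical, not tiny-batch's actual API): calls made in the same tick are queued, and one microtask later a single callback receives all of their argument lists.

```typescript
// Hypothetical minimal re-implementation of the batching idea:
// every call queues its args; one microtask flush hands the whole
// queue to a single execute() callback and fans the results back out.
type ExecuteBatch<R, A extends unknown[]> = (batchedArgs: A[]) => Promise<R[]>;

function microBatch<R, A extends unknown[]>(execute: ExecuteBatch<R, A>) {
  let queue: { args: A; resolve: (r: R) => void; reject: (e: unknown) => void }[] = [];
  return (...args: A): Promise<R> =>
    new Promise<R>((resolve, reject) => {
      queue.push({ args, resolve, reject });
      if (queue.length === 1) {
        // First call in this tick: schedule one flush for the whole batch.
        queueMicrotask(() => {
          const batch = queue;
          queue = [];
          execute(batch.map((entry) => entry.args))
            .then((results) => batch.forEach((entry, i) => entry.resolve(results[i])))
            .catch((err) => batch.forEach((entry) => entry.reject(err)));
        });
      }
    });
}

// Usage: two calls in the same tick trigger a single batched execution.
const getUser = microBatch<string, [number]>(async (batchedArgs) => {
  // batchedArgs equals [[1], [2]]
  const userIds = batchedArgs.flat(); // [1, 2]
  return userIds.map((id) => `user-${id}`); // one "request" covers all ids
});

Promise.all([getUser(1), getUser(2)]).then(console.log); // logs ['user-1', 'user-2']
```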

@@ -94,2 +98,21 @@ .then(response => response.json())

# Caching
To reduce overhead even further, caching can be introduced. While tiny-batch does not support this directly, it is very simple to achieve with any of the available memoization libraries, for example [memoizee](https://www.npmjs.com/package/memoizee):
```ts
import memoizee from 'memoizee';
const batched = tinybatch((args) => {
// code
});
const batchedAndCached = memoizee(batched, {
// Set the number of arguments that "batchedAndCached" is memoized on.
length: 1
});
await batchedAndCached('once');
await batchedAndCached('once');
```
The second call is not added to the queue but will resolve with the same value.
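If you would rather avoid a dependency, the same effect takes only a few lines. As a sketch under stated assumptions: `memoizeByFirstArg` below is a hypothetical helper, not part of tiny-batch or memoizee, and the plain async function stands in for a batched one.

```typescript
// Hypothetical stand-in for memoizee with `length: 1`: cache on the first
// argument only, and reuse the pending/settled promise for repeat calls.
function memoizeByFirstArg<A, R>(fn: (arg: A) => Promise<R>): (arg: A) => Promise<R> {
  const cache = new Map<A, Promise<R>>();
  return (arg: A) => {
    const hit = cache.get(arg);
    if (hit) return hit; // repeat call never reaches the batching queue
    const result = fn(arg);
    cache.set(arg, result);
    return result;
  };
}

// Usage: count how often the underlying function actually runs.
let calls = 0;
const batched = async (key: string) => {
  calls += 1;
  return `value-for-${key}`;
};
const batchedAndCached = memoizeByFirstArg(batched);

batchedAndCached('once');
batchedAndCached('once');
console.log(calls); // 1 — the second call was served from the cache
```

Caching the promise itself (rather than the resolved value) means concurrent calls with the same argument also share one in-flight request.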
# Types

@@ -136,3 +159,2 @@ ```ts

}
```

@@ -19,7 +19,7 @@ import {AddToBatch, ExecuteBatch, Scheduler} from "./types";

 const fn: AddToBatch<Result, Args> = (...args: Args) => {
-  return new Promise<Result>(((resolve, reject) => {
+  return new Promise<Result>((resolve, reject) => {
     queue.add(args, resolve, reject);
     scheduler(queue.args, fn.flush);
-  }));
+  });
 };

Sorry, the diff of this file is not supported yet

