async-task-schedule

schedule async tasks in order

Features

  • removes duplicate task requests
  • combines task requests made at the same time and executes them together
  • can prevent massive numbers of requests at the same time by executing them one group at a time
  • caches results, with a configurable validity period

Install

yarn add async-task-schedule
# or with npm
npm install async-task-schedule -S

Usage

import TaskSchedule from 'async-task-schedule'
let count = 0
const taskSchedule = new TaskSchedule({
  doTask: async (name: string) => {
    count += 1
    return `${name}${count}`
  },
  // or use batchDoTasks, which does the same
  // batchDoTasks: async (names: string[]) => {
  //   count += 1
  //   return names.map((n) => `${n}${count}`)
  // },
})

taskSchedule.dispatch(['a', 'b']).then(console.log)
taskSchedule.dispatch(['b', 'c']).then(console.log)
taskSchedule.dispatch(['d', 'c']).then(console.log)
taskSchedule.dispatch('c').then(console.log)
// batchDoTasks will only be called once: the four dispatches are combined and deduplicated

NOTICE: in the following example, the tasks won't be combined

// batchDoTasks will be executed 3 times, because each await completes before the next dispatch, so the requests can't be combined
const result1 = await taskSchedule.dispatch(['a', 'b'])
const result2 = await taskSchedule.dispatch(['b', 'c'])
const result3 = await taskSchedule.dispatch(['d', 'c'])
const result4 = await taskSchedule.dispatch('c')
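
If you need the results as separate variables but still want the requests combined, a simple workaround (also used in the recipes below) is to start all dispatches first and await them together:

// start every dispatch first, then await them together,
//  so the scheduler can still combine and deduplicate the requests
const promiseAB = taskSchedule.dispatch(['a', 'b'])
const promiseBC = taskSchedule.dispatch(['b', 'c'])
const promiseDC = taskSchedule.dispatch(['d', 'c'])
const [resultAB, resultBC, resultDC] = await Promise.all([promiseAB, promiseBC, promiseDC])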

API

constructor(options: ITaskScheduleOptions)

the options are defined as follows:

// `Task` for single task's parameters
// `Result` for single task's response
interface ITaskScheduleOptions<Task, Result> {
  /**
   * action to do batch tasks, can be an async or sync function
   *  Task: single task request info
   *  Result: single task success response
   *
   * batchDoTasks should receive multiple tasks and return their results or errors in the same order
   * one of batchDoTasks / doTask must be specified; batchDoTasks takes priority
   */
  batchDoTasks?: (tasks: Task[]) => Promise<Array<Result | Error>> | Array<Result | Error>

  /**
   * action to do a single task, can be an async or sync function
   *  one of batchDoTasks / doTask must be specified; batchDoTasks takes priority
   */
  doTask?: (task: Task) => Promise<Result> | Result

  /**
   * check whether two tasks are equal
   *  it helps to avoid duplicate tasks
   *  default: TaskSchedule.isEqual (deep equal)
   */
  isSameTask?: (a: Task, b: Task) => boolean

  /**
   * max task count for batchDoTasks, default unlimited
   *  undefined or 0 for unlimited
   */
  maxBatchCount?: number

  /**
   * batch task executing strategy, default parallel
   *  only takes effect when maxBatchCount is specified and more than maxBatchCount tasks are dispatched
   *
   * parallel: split all tasks into groups of maxBatchCount and execute the groups at the same time
   * serial: split all tasks into groups of maxBatchCount and execute them one group at a time
   *    if serial is specified, tasks dispatched while a group is executing will wait until it completes
   *    it's especially useful to cool down task requests
   */
  taskExecStrategy?: 'parallel' | 'serial'

  /**
   * task waiting strategy, default debounce
   *  throttle: tasks will be combined and dispatched every `maxWaitingGap`
   *  debounce: tasks will be combined and dispatched once no more tasks arrive within `maxWaitingGap`
   */
  taskWaitingStrategy?: 'throttle' | 'debounce'

  /**
   * task waiting time in milliseconds, default 50ms
   *  behaves differently according to taskWaitingStrategy
   */
  maxWaitingGap?: number


  /**
   * task result caching duration (in milliseconds), default 1000ms (1s)
   * > `undefined` or `0` to cache forever
   * > set to the minimum value `1` to effectively disable caching
   * > pass a function to specify each task's validity
   *
   * *the cache is lazily cleaned after it becomes invalid*
   */
  invalidAfter?: number | ((cached: readonly [Task, Result | Error]) => number)

  /**
   * retry failed tasks the next time they are dispatched, default true
   */
  retryWhenFailed?: boolean
}

example:

import TaskSchedule from 'async-task-schedule'

const taskSchedule = new TaskSchedule({
  doTask(n) { 
    console.log(`do task with ${n}`)
    return n * n
  },
  invalidAfter: 0
})

const result = await taskSchedule.dispatch([1,2,3,1,2])
// get first result
const resultOf1 = result[0] // 1
// doTask won't be called
const result11 = await taskSchedule.dispatch(1) // 1

// clean all cached result
taskSchedule.cleanCache()
// doTask will be called again
const result12 = await taskSchedule.dispatch(1) // 1
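
For reference, here is a sketch combining the remaining options; the option values below are only illustrative, not recommendations:

import TaskSchedule from 'async-task-schedule'

const batchSchedule = new TaskSchedule({
  // tasks are plain strings in this sketch
  batchDoTasks: async (names: string[]) => names.map((name) => name.toUpperCase()),
  isSameTask: (a, b) => a === b,   // strings can be compared directly
  maxBatchCount: 10,               // at most 10 tasks per batch call
  taskExecStrategy: 'serial',      // execute the batches one group at a time
  taskWaitingStrategy: 'throttle', // flush the combined tasks every maxWaitingGap
  maxWaitingGap: 100,              // collect tasks for up to 100ms
  invalidAfter: 5000,              // cache each result for 5 seconds
  retryWhenFailed: true,           // failed tasks are retried on the next dispatch
})

const upper = await batchSchedule.dispatch(['a', 'b'])
// upper: ['A', 'B']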

dispatch(tasks: Task[]): Promise<Array<Result | Error>>

dispatch multiple tasks at a time; the responses are returned in the same order as the tasks. This method won't throw: it fulfils even when some tasks fail, and you can check whether an individual task succeeded via response instanceof Error

import TaskSchedule from 'async-task-schedule'

const taskSchedule = new TaskSchedule({
  doTask(n) { 
    console.log(`do task with ${n}`)
    if (n % 2 === 0) throw new Error(`${n} is unsupported`)
    return n * n
  },
  invalidAfter: 0
})

const result = await taskSchedule.dispatch([1,2,3,1,2])
// get first result
const resultOf1 = result[0] // 1
// the second result is an error
const isError = result[1] instanceof Error // true


try {
  // will throw an error
  const result2 = await taskSchedule.dispatch(2)
} catch(error) {
  console.warn(error)
}

dispatch(task: Task): Promise<Result>

dispatch a single task; returns the response on success, or throws an error on failure

import TaskSchedule from 'async-task-schedule'

const taskSchedule = new TaskSchedule({
  doTask(n) { 
    console.log(`do task with ${n}`)
    if (n % 2 === 0) throw new Error(`${n} is unsupported`)
    return n * n
  },
  invalidAfter: 0
})

const result1 = await taskSchedule.dispatch(1) // 1
try {
  const result2 = await taskSchedule.dispatch(2)
} catch(error) {
  console.warn(error)
}

cleanCache

clean the cached task results, so later dispatches of the same tasks will trigger new requests and get fresh responses.

attention: this action may not execute immediately; it takes effect after all pending tasks are done

import TaskSchedule from 'async-task-schedule'

const taskSchedule = new TaskSchedule({
  doTask(n) { 
    console.log(`do task with ${n}`)
    return n * n
  },
  invalidAfter: 0
})

await Promise.all([
  taskSchedule.dispatch([1, 2, 3, 1, 2]),
  taskSchedule.dispatch([1, 9, 10, 12, 22]),
])
// clean all cached result
taskSchedule.cleanCache()
// task will execute again
const result = await taskSchedule.dispatch(1)

Utility methods

some utility methods are available as static members of TaskSchedule

chunk<T>(arr: T[], size: number): T[][]

split an array into chunks of the specified size

import TaskSchedule from 'async-task-schedule'

const chunked = TaskSchedule.chunk([1,2,3,4,5,6,7], 3)
// [[1,2,3], [4,5,6], [7]]

isEqual(a: unknown, b: unknown): boolean

check whether the given values are equal (with deep comparison)

import TaskSchedule from 'async-task-schedule'

TaskSchedule.isEqual(1, '1') // false
TaskSchedule.isEqual('1', '1') // true
TaskSchedule.isEqual(NaN, NaN) // true
TaskSchedule.isEqual({a: 'a', b: 'b'}, {b: 'b', a: 'a'}) // true
TaskSchedule.isEqual({a: 'a', b: 'b', c: {e: [1,2,3]}}, {b: 'b', c: {e: [1,2,3]}, a: 'a'}) // true
TaskSchedule.isEqual({a: 'a', b: /acx/}, {b: new RegExp('acx'), a: 'a'}) // true

you can use it to check whether two tasks are equal or to find a specific task
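
For example, it can be passed as the isSameTask option, or used to find a task described by a plain object (the task shape below is made up for illustration):

import TaskSchedule from 'async-task-schedule'

// deduplicate object-shaped tasks by deep equality (this is already the default behaviour)
const schedule = new TaskSchedule({
  doTask: async (task: { id: string, scope: string }) => `${task.scope}:${task.id}`,
  isSameTask: TaskSchedule.isEqual,
})

// find a specific task in a list by deep comparison
const pending = [{ id: '1', scope: 'user' }, { id: '2', scope: 'user' }]
const found = pending.find((task) => TaskSchedule.isEqual(task, { id: '2', scope: 'user' }))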

Recipes

how to integrate with existing code

all you need to do is wrap your existing task-executing function into a new batchDoTasks (or doTask)

example 1: cache fetch

suppose we use the browser's native fetch to send requests; we can improve it like this:


import TaskSchedule from 'async-task-schedule'

const fetchSchedule = new TaskSchedule({
  async doTask(cfg: {resource: string, options?: RequestInit}) {
    return await fetch(cfg.resource, cfg.options)
  },
  // invalidAfter: 0 caches forever,
  //  the minimum value 1 effectively disables caching (expires after 1 millisecond)
  invalidAfter([cfg, result]) {
    // cache GET requests for 3s
    if (!cfg.options || !cfg.options.method || cfg.options.method.toLowerCase() === 'get') {
      // cache the sys static config forever
      if (/\/sys\/static-config$/.test(cfg.resource)) return 0
      return 3000
    }
    // don't cache other kinds of requests
    return 1
  }
})

const betterFetch = (resource: string, options?: RequestInit) => {
  return fetchSchedule.dispatch({resource, options})
}

// then you can replace fetch with betterFetch

with the code above:

  1. redundant requests are removed: requests with the same parameters at the same time are reduced to one (this may have some side effects, see the sketch after this list)
  2. GET requests are cached for a short time
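
For example, two identical GET requests fired in the same tick should result in a single network request (a sketch using the betterFetch wrapper above; the URL is hypothetical):

// only one network request is sent; both calls resolve with the same Response object
const [res1, res2] = await Promise.all([
  betterFetch('/sys/static-config'),
  betterFetch('/sys/static-config'),
])

Note that sharing one Response object is one of the possible side effects mentioned above: its body can only be consumed once, so clone it if both callers need to read it.
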
example 2: deal with getUsers

suppose we have a method getUsers defined as follows:

getUsers: (userIds: string[]) => Promise<Array<{code: string, message: string, id?: string, name?: string, email?: string}>>

then we can implement a batch version:

async function batchGetUsers(userIds: string[]): Promise<Array<{id?: string, name?: string, email?: string} | Error>> {
  // there is no need to try/catch; thrown errors will be handled properly by the schedule
  const users = await getUsers(userIds)
  // convert failed users into errors
  return users.map(user => (user.code === 'failed' ? new Error(user.message) : user))
}

const getUserSchedule = new TaskSchedule({
  batchDoTasks: batchGetUsers,
  // cache user info forever
  invalidAfter: 0,
})

const result = await Promise.all([
  getUserSchedule.dispatch(['user1', 'user2']),
  getUserSchedule.dispatch(['user3', 'user2'])
])
// only one request will be sent via getUsers with userIds ['user1', 'user2', 'user3']

// request combining won't work when awaiting each dispatch separately
const result1 = await getUserSchedule.dispatch(['user1', 'user2'])
const result2 = await getUserSchedule.dispatch(['user3', 'user2'])

If you already have a batch version of the function, you just need to make sure it throws an error when an error occurs.
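
For instance, if there is already a batchFetchPrices function (a hypothetical name) that rejects when the request fails, it can be plugged in directly:

import TaskSchedule from 'async-task-schedule'

// hypothetical existing batch API: takes symbols, returns prices in the same order,
//  and rejects the promise when the request fails
declare function batchFetchPrices(symbols: string[]): Promise<number[]>

const priceSchedule = new TaskSchedule({
  // a thrown error / rejected promise is handled by the schedule,
  //  so no extra wrapping is needed here
  batchDoTasks: batchFetchPrices,
})

// each entry is either a number or an Error
const prices = await priceSchedule.dispatch(['AAPL', 'MSFT'])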

how to cool down massive requests at the same time

by setting taskExecStrategy to serial and using a smaller maxBatchCount (you can even set it to 1), you can achieve this easily

const taskSchedule = new TaskSchedule({
  ...,
  taskExecStrategy: 'serial',
  maxBatchCount: 2,
})
