asynctools

Async tools for Python

  • 0.1.3
  • PyPI

AsyncTools

Async Tools for Python.

Table of Contents

  • Threading
    • Async
    • Parallel
    • Pool

Threading

Threading is the simplest approach, but because of the GIL it's useless for computation. Use it only to parallelize access to a blocking resource, e.g. the network.
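The same idea is available in the standard library; a minimal sketch with `concurrent.futures`, where `fetch` is a stand-in for a real blocking network call:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def fetch(url):
    # Stand-in for a blocking network call
    time.sleep(0.01)
    return f"response from {url}"

urls = ["http://example.com/a", "http://example.com/b"]

# The threads overlap their blocking waits, so total time is
# roughly one sleep rather than one per URL.
with ThreadPoolExecutor(max_workers=2) as ex:
    results = list(ex.map(fetch, urls))

print(results)
```

Because the work is I/O-bound, the GIL is released while each thread waits, which is exactly the case where threading pays off.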

Async

Source: asynctools/threading/Async.py

Decorator for functions that should be run in a separate thread. When the function is called, it returns a threading.Event.

from asynctools.threading import Async

@Async
def request(url):
    # ... do request
    pass

request('http://example.com')  # Async request
request('http://example.com').wait()  # wait for it to complete

If you want to wait for multiple threads to complete, see the next sections.
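For reference, one way such a decorator could be built on the standard library (a sketch, not asynctools' actual implementation): the wrapper starts a daemon thread and returns a `threading.Event` that is set when the function finishes.

```python
import functools
import threading

def async_call(fn):
    """Run the decorated function in a thread; return an Event set on completion."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        done = threading.Event()
        def runner():
            try:
                fn(*args, **kwargs)
            finally:
                done.set()  # Signal completion even if fn raised
        threading.Thread(target=runner, daemon=True).start()
        return done
    return wrapper

calls = []

@async_call
def record(x):
    calls.append(x)

record(1).wait()  # The call returns immediately; .wait() blocks until done
record(2).wait()
print(calls)
```

Note this sketch discards the function's return value; collecting results is what `Parallel` below is for.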

Parallel

Source: asynctools/threading/Parallel.py

Execute functions in parallel and collect results. Each function runs in its own thread, and each thread exits as soon as its job completes.

Methods:

  • __call__(*args, **kwargs): Add a job. Call the Parallel object itself, and it invokes the worker function with the same arguments in a new thread

  • map(jobs): Convenience method to add a job for every item in jobs

  • first(timeout=None): Wait for a single result to become available, with an optional timeout in seconds. The result is returned as soon as it's ready. If all threads fail with an error, None is returned.

  • join(): Wait for all tasks to be finished, and return two lists:

    • A list of results
    • A list of exceptions

Example:

from asynctools.threading import Parallel

def request(url):
    data = None  # ... do request
    return data

# Execute
pll = Parallel(request)
for url in links:
    pll(url)  # Starts a new thread


# Wait for the results
results, errors = pll.join()

Since the request method takes just one argument, this can be chained:

results, errors = Parallel(request).map(links).join()
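The join()-style result collection can be mimicked with the standard library; a rough sketch (not asynctools code) that gathers results and exceptions into two lists, shown with an illustrative `reciprocal` worker:

```python
from concurrent.futures import ThreadPoolExecutor

def run_parallel(worker, jobs):
    # Run worker over jobs in threads; collect results and exceptions separately
    results, errors = [], []
    with ThreadPoolExecutor() as ex:
        futures = [ex.submit(worker, job) for job in jobs]
        for fut in futures:
            try:
                results.append(fut.result())
            except Exception as exc:
                errors.append(exc)
    return results, errors

def reciprocal(n):
    return 1 / n  # Raises ZeroDivisionError for n == 0

results, errors = run_parallel(reciprocal, [1, 2, 0])
print(results)      # successful results
print(len(errors))  # one ZeroDivisionError
```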

Pool

Source: asynctools/threading/Pool.py

Create a pool of threads and execute work in it. Useful when you want to launch a limited number of long-lived threads.

Methods are the same as Parallel's, with some additions:

  • __call__(*args, **kwargs)
  • map(jobs)
  • first(timeout=None)
  • close(): Terminate all threads. The pool is no longer usable once closed.
  • __enter__, __exit__: context-manager support for use with the with statement

Example:

from asynctools.threading import Pool

def request(url):
    data = None  # ... do long request
    return data

# Make pool
pool = Pool(request, 5)

# Assign some jobs
for url in links:
    pool(url)  # Runs in the pool

# Wait for the results
results, errors = pool.join()
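A comparable bounded pool exists in the standard library; a sketch using ThreadPoolExecutor as a context manager, mirroring the with-statement support listed above (the `work` function is illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    return n * n

# max_workers bounds the number of long-lived threads, like Pool(request, 5)
with ThreadPoolExecutor(max_workers=5) as pool:
    squares = list(pool.map(work, range(4)))
# Leaving the with-block shuts the pool down, like Pool.close()

print(squares)
```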
