Python 3 has some amazing support for async programming, but it's arguably made it a bit harder to develop libraries. Are you tired of implementing synchronous and asynchronous methods doing basically the same thing? This might be a simple solution for you.
pip install synchronicity
Let's say you have an asynchronous function:

import asyncio

async def f(x):
    await asyncio.sleep(1.0)
    return x**2
And let's say (for whatever reason) you want to offer a synchronous API to users. For instance, maybe you want to make it easy to run your code in a basic script, or a user is building something that's mostly CPU-bound, so they don't want to bother with asyncio.
A "simple" way to create a synchronous equivalent would be to implement a set of synchronous functions where all they do is call asyncio.run on an asynchronous function. But this isn't a great solution for more complex code:
The state problem is particularly challenging. For instance, let's say you are implementing a client for a database that needs a persistent connection, and you want to build it on asyncio:
class DBConnection:
    def __init__(self, url):
        self._url = url

    async def connect(self):
        self._connection = await connect_to_database(self._url)

    async def query(self, q):
        return await self._connection.run_query(q)
How do you expose a synchronous interface to this code? The problem is that wrapping connect and query in asyncio.run won't work, since you need to preserve the event loop across calls. It's clear we need something slightly more advanced.
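To see the problem concretely, here is roughly what per-call wrapping would look like (an illustrative sketch only; BlockingDBConnection is a made-up name):

class BlockingDBConnection:
    def __init__(self, url):
        self._db = DBConnection(url)

    def connect(self):
        # Runs on a fresh event loop, which asyncio.run closes afterwards
        asyncio.run(self._db.connect())

    def query(self, q):
        # Runs on yet another event loop; with most asyncio drivers this fails,
        # because the connection created above is bound to a loop that no longer exists
        return asyncio.run(self._db.query(q))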
This library offers a simple Synchronizer class that creates an event loop on a separate thread, and wraps functions/generators/classes so that synchronous execution happens on that thread. When you call anything, it will detect if you're running in a synchronous or asynchronous context, and behave correspondingly.
import asyncio

from synchronicity import Synchronizer

synchronizer = Synchronizer()

@synchronizer.create_blocking
async def f(x):
    await asyncio.sleep(1.0)
    return x**2

# Running f in a synchronous context blocks until the result is available
ret = f(42)  # Blocks
print('f(42) =', ret)

async def g():
    # Running f in an asynchronous context works the normal way
    ret = await f(42)
    print('f(42) =', ret)
The decorator also works on generators:
@synchronizer.create_blocking
async def f(n):
    for i in range(n):
        await asyncio.sleep(1.0)
        yield i

# Note that the following runs in a synchronous context
# Each number will take 1s to print
for ret in f(10):
    print(ret)
It also operates on classes by wrapping every method on the class:
@synchronizer.create_blocking
class DBConnection:
    def __init__(self, url):
        self._url = url

    async def connect(self):
        self._connection = await connect_to_database(self._url)

    async def query(self, q):
        return await self._connection.run_query(q)

# Now we can call it synchronously, if we want to
db_conn = DBConnection('tcp://localhost:1234')
db_conn.connect()
data = db_conn.query('select * from foo')
You can also make functions return a Future object by adding _future=True to any call. This can be useful if you want to dispatch many calls from a blocking context, but you want to resolve them roughly in parallel:
import asyncio

from synchronicity import Synchronizer

synchronizer = Synchronizer()

@synchronizer.create_blocking
async def f(x):
    await asyncio.sleep(1.0)
    return x**2

futures = [f(i, _future=True) for i in range(10)]  # This returns immediately
rets = [fut.result() for fut in futures]  # This should take ~1s to run, resolving all futures in parallel
print('first ten squares:', rets)
This library can also be useful in purely asynchronous settings: if you have multiple event loops, a section that is CPU-bound, or some critical code that you want to run on a separate thread for safety. All calls to synchronized functions/generators are thread-safe by design, which makes this a useful alternative to loop.run_in_executor for simple things. Note however that each synchronizer only runs one thread.
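As a hedged sketch of that pattern (compute_digest is a made-up example function, not part of the library), a blocking or CPU-heavy section can be handed to the synchronizer's thread instead of running directly on your own event loop:

import asyncio
import hashlib

from synchronicity import Synchronizer

synchronizer = Synchronizer()

@synchronizer.create_blocking
async def compute_digest(data):
    # This body executes on the synchronizer's own thread and event loop,
    # not on the caller's event loop
    return hashlib.sha256(data).hexdigest()

async def main():
    # Awaiting the wrapped function hands the work to the synchronizer's thread
    digest = await compute_digest(b'some large payload')
    print(digest)

asyncio.run(main())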
You can synchronize context manager classes just like any other class and the special methods will be handled properly.
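For example, a rough sketch of what that enables (Resource is a made-up class; the point is only that the async special methods become usable from a plain with statement):

@synchronizer.create_blocking
class Resource:
    async def __aenter__(self):
        await asyncio.sleep(0.1)  # e.g. acquire something asynchronously
        return self

    async def __aexit__(self, exc_type, exc, tb):
        await asyncio.sleep(0.1)  # e.g. release it

# In a synchronous context, the wrapped special methods let us use a plain `with`
with Resource():
    print('inside the context manager')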
There's also a function decorator @synchronizer.asynccontextmanager which behaves just like https://docs.python.org/3/library/contextlib.html#contextlib.asynccontextmanager but works in both synchronous and asynchronous contexts.
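A short sketch of how that might look (timed is an illustrative helper, not part of the library):

import asyncio
import time

@synchronizer.asynccontextmanager
async def timed(label):
    t0 = time.monotonic()
    try:
        yield
    finally:
        print(label, 'took', time.monotonic() - t0, 'seconds')

# Works in a synchronous context...
with timed('blocking work'):
    time.sleep(0.5)

# ...and in an asynchronous one
async def main():
    async with timed('async work'):
        await asyncio.sleep(0.5)

asyncio.run(main())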
This is code I broke out of a personal project, and it hasn't been battle-tested. There is a small test suite that you can run using pytest.
Should automate this...
- Create a branch release-X.Y.Z from main
- git tag -a vX.Y.Z -m "* release bullets"
- TWINE_USERNAME=__token__ TWINE_PASSWORD="$PYPI_TOKEN_SYNCHRONICITY" make publish