python-socketio - PyPI package: comparing version 5.16.0 to 5.16.1

docs/api_client.rst (+11)
Clients
-------

.. autoclass:: socketio.Client
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncClient
   :members:
   :inherited-members:

docs/api_manager.rst

Managers
--------

.. autoclass:: socketio.Manager
   :members:
   :inherited-members:

.. autoclass:: socketio.PubSubManager
   :members:
   :inherited-members:

.. autoclass:: socketio.KombuManager
   :members:
   :inherited-members:

.. autoclass:: socketio.RedisManager
   :members:
   :inherited-members:

.. autoclass:: socketio.KafkaManager
   :members:
   :inherited-members:

.. autoclass:: socketio.ZmqManager
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncManager
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncRedisManager
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncAioPikaManager
   :members:
   :inherited-members:

docs/api_middleware.rst

Middlewares
-----------

.. autoclass:: socketio.WSGIApp
   :members:

.. autoclass:: socketio.ASGIApp
   :members:

.. autoclass:: socketio.Middleware
   :members:

docs/api_namespace.rst

Namespaces
----------

.. autoclass:: socketio.ClientNamespace
   :members:
   :inherited-members:

.. autoclass:: socketio.Namespace
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncClientNamespace
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncNamespace
   :members:
   :inherited-members:

docs/api_server.rst

Servers
-------

.. autoclass:: socketio.Server
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncServer
   :members:
   :inherited-members:

.. autoclass:: socketio.exceptions.ConnectionRefusedError
   :members:

docs/api_simpleclient.rst

Simple Clients
--------------

.. autoclass:: socketio.SimpleClient
   :members:
   :inherited-members:

.. autoclass:: socketio.AsyncSimpleClient
   :members:
   :inherited-members:
docs/api.rst (+7, -86)

@@ -5,88 +5,9 @@ API Reference

 .. toctree::
-   :maxdepth: 3
+   :maxdepth: 2
-.. module:: socketio
-.. autoclass:: SimpleClient
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncSimpleClient
-   :members:
-   :inherited-members:
-.. autoclass:: Client
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncClient
-   :members:
-   :inherited-members:
-.. autoclass:: Server
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncServer
-   :members:
-   :inherited-members:
-.. autoclass:: socketio.exceptions.ConnectionRefusedError
-   :members:
-.. autoclass:: WSGIApp
-   :members:
-.. autoclass:: ASGIApp
-   :members:
-.. autoclass:: Middleware
-   :members:
-.. autoclass:: ClientNamespace
-   :members:
-   :inherited-members:
-.. autoclass:: Namespace
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncClientNamespace
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncNamespace
-   :members:
-   :inherited-members:
-.. autoclass:: Manager
-   :members:
-   :inherited-members:
-.. autoclass:: PubSubManager
-   :members:
-   :inherited-members:
-.. autoclass:: KombuManager
-   :members:
-   :inherited-members:
-.. autoclass:: RedisManager
-   :members:
-   :inherited-members:
-.. autoclass:: KafkaManager
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncManager
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncRedisManager
-   :members:
-   :inherited-members:
-.. autoclass:: AsyncAioPikaManager
-   :members:
-   :inherited-members:
+   api_simpleclient
+   api_client
+   api_server
+   api_middleware
+   api_namespace
+   api_manager

@@ -283,3 +283,3 @@ The Socket.IO Clients

-Similarily, a "catch-all" namespace handler is invoked for any connected
+Similarly, a "catch-all" namespace handler is invoked for any connected
namespaces that do not have an explicitly defined event handler. As with

@@ -286,0 +286,0 @@ catch-all events, ``'*'`` is used in place of a namespace::

@@ -80,3 +80,4 @@ # -*- coding: utf-8 -*-

 #
-html_theme = 'alabaster'
+html_theme = 'furo'
+html_title = 'python-socketio'

@@ -88,9 +89,2 @@ # Theme options are theme-specific and customize the look and feel of a theme

-html_theme_options = {
-    'github_user': 'miguelgrinberg',
-    'github_repo': 'python-socketio',
-    'github_banner': True,
-    'github_button': True,
-    'github_type': 'star',
-    'fixed_sidebar': True,
-}

@@ -97,0 +91,0 @@

@@ -13,3 +13,3 @@ .. python-socketio documentation master file, created by

.. toctree::
-   :maxdepth: 3
+   :maxdepth: 2

@@ -16,0 +16,0 @@ intro

@@ -495,3 +495,3 @@ The Socket.IO Server

-Similarily to catch-all event handlers, a "catch-all" namespace can be used
+Similarly to catch-all event handlers, a "catch-all" namespace can be used
when defining event handlers for any connected namespaces that do not have an

@@ -498,0 +498,0 @@ explicitly defined event handler. As with catch-all events, ``'*'`` is used in

Metadata-Version: 2.4
Name: python-socketio
-Version: 5.16.0
+Version: 5.16.1
Summary: Socket.IO server and client for Python

@@ -27,2 +27,3 @@ Author-email: Miguel Grinberg <miguel.grinberg@gmail.com>

 Requires-Dist: sphinx; extra == "docs"
+Requires-Dist: furo; extra == "docs"
Dynamic: license-file

@@ -29,0 +30,0 @@

[project]
name = "python-socketio"
-version = "5.16.0"
+version = "5.16.1"
license = {text = "MIT"}

@@ -42,2 +42,3 @@ authors = [

 "sphinx",
+"furo",
]

@@ -44,0 +45,0 @@

Metadata-Version: 2.4
Name: python-socketio
-Version: 5.16.0
+Version: 5.16.1
Summary: Socket.IO server and client for Python

@@ -27,2 +27,3 @@ Author-email: Miguel Grinberg <miguel.grinberg@gmail.com>

 Requires-Dist: sphinx; extra == "docs"
+Requires-Dist: furo; extra == "docs"
Dynamic: license-file

@@ -29,0 +30,0 @@

@@ -16,1 +16,2 @@ bidict>=0.21.0

 sphinx
+furo

@@ -8,2 +8,8 @@ LICENSE

 docs/api.rst
+docs/api_client.rst
+docs/api_manager.rst
+docs/api_middleware.rst
+docs/api_namespace.rst
+docs/api_server.rst
+docs/api_simpleclient.rst
 docs/client.rst

@@ -10,0 +16,0 @@ docs/conf.py

@@ -197,2 +197,4 @@ from datetime import datetime, timezone

self.stats_task.join()
self.stop_stats_event.clear()
self.stats_task = None

@@ -210,2 +212,5 @@ def _trigger_event(self, event, namespace, *args):

), namespace=self.admin_namespace)
if not self.sio.eio._get_socket(eio_sid).upgraded:
self.sio.start_background_task(
self._check_for_upgrade, eio_sid, sid, namespace)
elif event == 'disconnect':

@@ -288,2 +293,3 @@ del self.sio.manager._timestamps[sid]

self.stop_stats_event = self.sio.eio.create_event()
if self.stats_task is None:
self.stats_task = self.sio.start_background_task(

@@ -290,0 +296,0 @@ self._emit_server_stats)
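The admin instrumentation hunks above start a background stats task guarded by an event, then stop it by setting the event, joining the task, clearing the event, and resetting the task reference. A minimal stand-alone sketch of the same lifecycle, using plain threading instead of the Socket.IO admin internals (all names here are illustrative):

```python
import threading
import time

stop_stats_event = threading.Event()
stats = {"ticks": 0}

def emit_server_stats():
    # Run until the stop event is set; wait() doubles as the emit interval.
    while not stop_stats_event.wait(0.01):
        stats["ticks"] += 1

# Start the task only if one is not already running (mirrors the diff's
# ``if self.stats_task is None`` guard).
stats_task = threading.Thread(target=emit_server_stats)
stats_task.start()

time.sleep(0.05)
stop_stats_event.set()    # request shutdown
stats_task.join()         # wait for the loop to exit
stop_stats_event.clear()  # reset the event so a later restart works
stats_task = None         # allow the guard to start a new task
```

Clearing the event after the join, as the diff does, is what makes the task restartable when a new admin client connects.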

@@ -160,5 +160,2 @@ import asyncio

self.sio.start_background_task(config, sid)
self.stop_stats_event = self.sio.eio.create_event()
self.stats_task = self.sio.start_background_task(
self._emit_server_stats)

@@ -187,2 +184,4 @@ async def admin_emit(self, _, namespace, room_filter, event, *data):

await asyncio.gather(self.stats_task)
self.stats_task = None
self.stop_stats_event.clear()

@@ -200,2 +199,5 @@ async def _trigger_event(self, event, namespace, *args):

), namespace=self.admin_namespace)
if not self.sio.eio._get_socket(eio_sid).upgraded:
self.sio.start_background_task(
self._check_for_upgrade, eio_sid, sid, namespace)
elif event == 'disconnect':

@@ -279,2 +281,3 @@ del self.sio.manager._timestamps[sid]

self.stop_stats_event = self.sio.eio.create_event()
if self.stats_task is None:
self.stats_task = self.sio.start_background_task(

@@ -281,0 +284,0 @@ self._emit_server_stats)

import asyncio
from engineio import json
from .async_pubsub_manager import AsyncPubSubManager

@@ -35,3 +34,13 @@

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
"""

@@ -42,3 +51,3 @@

def __init__(self, url='amqp://guest:guest@localhost:5672//',
channel='socketio', write_only=False, logger=None):
channel='socketio', write_only=False, logger=None, json=None):
if aio_pika is None:

@@ -48,3 +57,4 @@ raise RuntimeError('aio_pika package is not installed '

'virtualenv).')
super().__init__(channel=channel, write_only=write_only, logger=logger)
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)
self.url = url

@@ -88,3 +98,3 @@ self._lock = asyncio.Lock()

aio_pika.Message(
body=json.dumps(data).encode(),
body=self.json.dumps(data).encode(),
delivery_mode=aio_pika.DeliveryMode.PERSISTENT

@@ -91,0 +101,0 @@ ), routing_key='*',

@@ -40,6 +40,7 @@ import asyncio

``logger`` is ``False``.
-:param json: An alternative json module to use for encoding and decoding
+:param json: An alternative JSON module to use for encoding and decoding
 packets. Custom json modules must have ``dumps`` and ``loads``
 functions that are compatible with the standard library
-versions.
+versions. This is a process-wide setting, all instantiated
+servers and clients must use the same JSON module.
:param handle_sigint: Set to ``True`` to automatically handle disconnection

@@ -46,0 +47,0 @@ when the process is interrupted, or to ``False`` to
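The clarified docstring above says the server-level ``json`` argument is process-wide, and that a replacement module only needs ``dumps`` and ``loads`` functions compatible with the standard library. A minimal sketch of such a drop-in (the class name and compact-separator behavior are made up for illustration; any object with the two functions would do):

```python
import json as stdlib_json

class CompactJSON:
    """Illustrative JSON stand-in: same interface, compact separators."""

    @staticmethod
    def dumps(obj, *args, **kwargs):
        # Default to compact output unless the caller already chose separators.
        kwargs.setdefault("separators", (",", ":"))
        return stdlib_json.dumps(obj, *args, **kwargs)

    @staticmethod
    def loads(s, *args, **kwargs):
        return stdlib_json.loads(s, *args, **kwargs)

# Hypothetical usage (process-wide, per the docstring change above):
#   sio = socketio.AsyncServer(json=CompactJSON)
print(CompactJSON.loads(CompactJSON.dumps({"a": 1})))  # → {'a': 1}
```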

@@ -6,4 +6,2 @@ import asyncio

from engineio import json
from .async_manager import AsyncManager

@@ -26,6 +24,20 @@ from .packet import Packet

notifications.
:param write_only: If set to ``True``, only initialize to emit events. The
default of ``False`` initializes the class for emitting
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
"""
name = 'asyncpubsub'
def __init__(self, channel='socketio', write_only=False, logger=None):
def __init__(self, channel='socketio', write_only=False, logger=None,
json=None):
super().__init__()

@@ -36,2 +48,4 @@ self.channel = channel

self.logger = logger
if json is not None:
self.json = json

@@ -227,3 +241,3 @@ def initialize(self):

try:
data = json.loads(message)
data = self.json.loads(message)
except:

@@ -230,0 +244,0 @@ pass

@@ -22,3 +22,2 @@ import asyncio

from engineio import json
from .async_pubsub_manager import AsyncPubSubManager

@@ -51,3 +50,13 @@ from .redis_manager import parse_redis_sentinel_url

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
:param redis_options: additional keyword arguments to be passed to

@@ -59,7 +68,8 @@ ``Redis.from_url()`` or ``Sentinel()``.

def __init__(self, url='redis://localhost:6379/0', channel='socketio',
write_only=False, logger=None, redis_options=None):
write_only=False, logger=None, json=None, redis_options=None):
if aioredis and \
not hasattr(aioredis.Redis, 'from_url'): # pragma: no cover
raise RuntimeError('Version 2 of aioredis package is required.')
super().__init__(channel=channel, write_only=write_only, logger=logger)
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)
self.redis_url = url

@@ -123,3 +133,3 @@ self.redis_options = redis_options or {}

return await self.redis.publish(
self.channel, json.dumps(data))
self.channel, self.json.dumps(data))
except error as exc:

@@ -126,0 +136,0 @@ if retries_left > 0:

@@ -31,6 +31,7 @@ import asyncio

errors are logged even when ``logger`` is ``False``.
-:param json: An alternative json module to use for encoding and decoding
+:param json: An alternative JSON module to use for encoding and decoding
 packets. Custom json modules must have ``dumps`` and ``loads``
 functions that are compatible with the standard library
-versions.
+versions. This is a process-wide setting, all instantiated
+servers and clients must use the same JSON module.
:param async_handlers: If set to ``True``, event handlers for a client are

@@ -37,0 +38,0 @@ executed in separate threads. To run handlers for a

import itertools
import logging
import json

@@ -17,5 +18,7 @@ from bidict import bidict, ValueDuplicationError

self.pending_disconnect = {}
self.json = json
def set_server(self, server):
self.server = server
self.json = self.server.packet_class.json # use the global JSON module

@@ -22,0 +25,0 @@ def initialize(self):

@@ -42,6 +42,7 @@ import random

serializers.
-:param json: An alternative json module to use for encoding and decoding
+:param json: An alternative JSON module to use for encoding and decoding
 packets. Custom json modules must have ``dumps`` and ``loads``
 functions that are compatible with the standard library
-versions.
+versions. This is a process-wide setting, all instantiated
+servers and clients must use the same JSON module.
:param handle_sigint: Set to ``True`` to automatically handle disconnection

@@ -48,0 +49,0 @@ when the process is interrupted, or to ``False`` to

@@ -8,3 +8,2 @@ import logging

from engineio import json
from .pubsub_manager import PubSubManager

@@ -36,3 +35,13 @@

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
"""

@@ -42,3 +51,3 @@ name = 'kafka'

def __init__(self, url='kafka://localhost:9092', channel='socketio',
write_only=False):
write_only=False, logger=None, json=None):
if kafka is None:

@@ -49,3 +58,4 @@ raise RuntimeError('kafka-python package is not installed '

super().__init__(channel=channel, write_only=write_only)
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)

@@ -60,3 +70,3 @@ urls = [url] if isinstance(url, str) else url

def _publish(self, data):
self.producer.send(self.channel, value=json.dumps(data))
self.producer.send(self.channel, value=self.json.dumps(data))
self.producer.flush()

@@ -63,0 +73,0 @@

@@ -9,3 +9,2 @@ import time

from engineio import json
from .pubsub_manager import PubSubManager

@@ -38,3 +37,13 @@

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
:param connection_options: additional keyword arguments to be passed to

@@ -52,3 +61,3 @@ ``kombu.Connection()``.

def __init__(self, url='amqp://guest:guest@localhost:5672//',
channel='socketio', write_only=False, logger=None,
channel='socketio', write_only=False, logger=None, json=None,
connection_options=None, exchange_options=None,

@@ -60,3 +69,4 @@ queue_options=None, producer_options=None):

'virtualenv).')
super().__init__(channel=channel, write_only=write_only, logger=logger)
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)
self.url = url

@@ -109,3 +119,3 @@ self.connection_options = connection_options or {}

self.publisher_connection)
producer_publish(json.dumps(data))
producer_publish(self.json.dumps(data))
break

@@ -112,0 +122,0 @@ except (OSError, kombu.exceptions.KombuError):

@@ -5,4 +5,2 @@ import base64

from engineio import json
from .manager import Manager

@@ -25,6 +23,20 @@ from .packet import Packet

notifications.
:param write_only: If set to ``True``, only initialize to emit events. The
default of ``False`` initializes the class for emitting
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
"""
name = 'pubsub'
def __init__(self, channel='socketio', write_only=False, logger=None):
def __init__(self, channel='socketio', write_only=False, logger=None,
json=None):
super().__init__()

@@ -35,2 +47,4 @@ self.channel = channel

self.logger = logger
if json is not None:
self.json = json

@@ -221,3 +235,3 @@ def initialize(self):

try:
data = json.loads(message)
data = self.json.loads(message)
except:

@@ -224,0 +238,0 @@ pass

@@ -19,3 +19,2 @@ import logging

from engineio import json
from .pubsub_manager import PubSubManager

@@ -76,3 +75,13 @@

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.
:param redis_options: additional keyword arguments to be passed to

@@ -84,4 +93,5 @@ ``Redis.from_url()`` or ``Sentinel()``.

def __init__(self, url='redis://localhost:6379/0', channel='socketio',
write_only=False, logger=None, redis_options=None):
super().__init__(channel=channel, write_only=write_only, logger=logger)
write_only=False, logger=None, json=None, redis_options=None):
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)
self.redis_url = url

@@ -159,3 +169,3 @@ self.redis_options = redis_options or {}

self._redis_connect()
return self.redis.publish(self.channel, json.dumps(data))
return self.redis.publish(self.channel, self.json.dumps(data))
except error as exc:

@@ -162,0 +172,0 @@ if retries_left > 0:

@@ -33,6 +33,7 @@ import logging

serializers.
-:param json: An alternative json module to use for encoding and decoding
+:param json: An alternative JSON module to use for encoding and decoding
 packets. Custom json modules must have ``dumps`` and ``loads``
 functions that are compatible with the standard library
-versions.
+versions. This is a process-wide setting, all instantiated
+servers and clients must use the same JSON module.
:param async_handlers: If set to ``True``, event handlers for a client are

@@ -39,0 +40,0 @@ executed in separate threads. To run handlers for a

import re
from engineio import json
from .pubsub_manager import PubSubManager

@@ -26,3 +25,13 @@

default of ``False`` initializes the class for emitting
and receiving.
and receiving. A write-only instance can be used
independently of the server to emit to clients from an
external process.
:param logger: a custom logger to log it. If not given, the server logger
is used.
:param json: An alternative JSON module to use for encoding and decoding
packets. Custom json modules must have ``dumps`` and ``loads``
functions that are compatible with the standard library
versions. This setting is only used when ``write_only`` is set
to ``True``. Otherwise the JSON module configured in the
server is used.

@@ -46,6 +55,4 @@ A zmq message broker must be running for the zmq_manager to work.

def __init__(self, url='zmq+tcp://localhost:5555+5556',
channel='socketio',
write_only=False,
logger=None):
def __init__(self, url='zmq+tcp://localhost:5555+5556', channel='socketio',
write_only=False, logger=None, json=None):
try:

@@ -62,3 +69,4 @@ from eventlet.green import zmq

super().__init__(channel=channel, write_only=write_only, logger=logger)
super().__init__(channel=channel, write_only=write_only, logger=logger,
json=json)
url = url.replace('zmq+', '')

@@ -81,3 +89,3 @@ (sink_url, sub_port) = url.split('+')

def _publish(self, data):
packed_data = json.dumps(
packed_data = self.json.dumps(
{

@@ -101,3 +109,3 @@ 'type': 'message',

try:
message = json.loads(message)
message = self.json.loads(message)
except Exception:

@@ -104,0 +112,0 @@ pass

@@ -8,4 +8,6 @@ import asyncio

from engineio.packet import Packet as EIOPacket
from socketio import async_manager
from socketio import async_pubsub_manager
from socketio import async_server
from socketio import packet

@@ -850,1 +852,12 @@

)
def test_custom_json(self):
saved_json = packet.Packet.json
cm = async_pubsub_manager.AsyncPubSubManager(json='foo')
assert cm.json == 'foo'
async_server.AsyncServer(json='bar', client_manager=cm)
assert cm.json == 'bar'
packet.Packet.json = saved_json
EIOPacket.json = saved_json

@@ -5,4 +5,6 @@ import pytest

from socketio import async_redis_manager
from engineio.packet import Packet as EIOPacket
from socketio import async_redis_manager, AsyncServer
from socketio.async_redis_manager import AsyncRedisManager
from socketio.packet import Packet

@@ -113,1 +115,12 @@

async_redis_manager.aioredis = saved_redis
def test_custom_json(self):
saved_json = Packet.json
cm = AsyncRedisManager('redis://', json='foo')
assert cm.json == 'foo'
AsyncServer(json='bar', client_manager=cm)
assert cm.json == 'bar'
Packet.json = saved_json
EIOPacket.json = saved_json

@@ -8,5 +8,7 @@ import functools

from engineio.packet import Packet as EIOPacket
from socketio import manager
from socketio import pubsub_manager
from socketio import packet
from socketio import server

@@ -826,1 +828,12 @@

)
def test_custom_json(self):
saved_json = packet.Packet.json
cm = pubsub_manager.PubSubManager(json='foo')
assert cm.json == 'foo'
server.Server(json='bar', client_manager=cm)
assert cm.json == 'bar'
packet.Packet.json = saved_json
EIOPacket.json = saved_json

@@ -5,3 +5,5 @@ import pytest

from socketio import redis_manager
from engineio.packet import Packet as EIOPacket
from socketio import redis_manager, Server
from socketio.packet import Packet
from socketio.redis_manager import RedisManager, parse_redis_sentinel_url

@@ -148,1 +150,12 @@

)
def test_custom_json(self):
saved_json = Packet.json
cm = RedisManager('redis://', json='foo')
assert cm.json == 'foo'
Server(json='bar', client_manager=cm)
assert cm.json == 'bar'
Packet.json = saved_json
EIOPacket.json = saved_json
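The ``test_custom_json`` additions above all check the same precedence rule: a manager-level ``json`` argument sticks while the manager is used standalone (the write-only case), but once the manager is attached to a server, the server's process-wide JSON module wins. A toy model of that rule, independent of the real classes (assumption: this mirrors the ``set_server`` change shown earlier in the diff):

```python
import json as stdlib_json

class ToyPubSubManager:
    """Not the real API; just models the JSON-module selection behavior."""

    def __init__(self, json=None):
        self.json = stdlib_json   # default, like the base manager
        if json is not None:
            self.json = json      # per-manager override (write-only use)

    def set_server(self, server_json):
        # Attaching to a server adopts the server's JSON module.
        self.json = server_json

cm = ToyPubSubManager(json="foo")
assert cm.json == "foo"   # standalone: constructor argument is honored
cm.set_server("bar")
assert cm.json == "bar"   # attached: the server's module wins
```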
tox.ini (+2, -14)

@@ -1,18 +0,5 @@

 [tox]
 envlist=flake8,py{310,311,312,313,314},docs
 skip_missing_interpreters=True
-[gh-actions]
-python =
-    3.10: py310
-    3.11: py311
-    3.12: py312
-    3.13: py313
-    3.14: py314
-    pypy-3: pypy3
[testenv]
 commands=
     pip install -e .
-    pytest -p no:logging --timeout=60 --cov=socketio --cov-branch --cov-report=term-missing --cov-report=xml
+    pytest -p no:logging --timeout=60 --cov=socketio --cov-branch --cov-report=term-missing --cov-report=xml {posargs}
 deps=
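The ``{posargs}`` placeholder appended to the pytest command forwards whatever follows ``--`` on the tox command line straight to pytest, which makes it possible to run a subset of the suite in one environment. A hypothetical invocation:

```shell
# everything after "--" is substituted for {posargs}
tox -e py312 -- -k custom_json -x
```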

@@ -42,2 +29,3 @@ simple-websocket

 sphinx
+furo
allowlist_externals=

@@ -44,0 +32,0 @@ make