proxy-relay
English | 简体中文
Proxy Relay is a pure-Python proxy relay/conversion tool that can convert upstream HTTP/HTTPS/SOCKS5/SOCKS5H proxies into local HTTP or SOCKS5 proxies. The local proxy itself does not require authentication.
Typical use cases: proxy relay for automation tools such as Playwright, Selenium and DrissionPage.
Install from PyPI:
pip install proxy-relay
To install the latest version from GitHub:
pip install "git+https://github.com/huazz233/proxy_relay.git"
Quick start:
from proxy_relay import create_proxy
import requests
# Create a local proxy from an upstream SOCKS5 proxy
url = create_proxy("socks5://user:pass@proxy.com:1080")
# Use the local proxy to access a test URL
resp = requests.get("https://api.ipify.org/", proxies={"http": url, "https": url})
print(resp.text)
Automatic cleanup when the process exits:
from proxy_relay import create_proxy
url = create_proxy("http://proxy.com:8080")
# Use the proxy...
# Resources will be cleaned up automatically when the process exits
Manual cleanup for long-running services:
from proxy_relay import create_proxy, cleanup
url = create_proxy("http://proxy.com:8080")
# Use the proxy...
cleanup() # Manually cleanup in long-running services
Async usage:
import asyncio
from proxy_relay import create_proxy_async, HttpProxy
async def main():
    # Simple: directly create a local proxy URL
    url = await create_proxy_async("http://proxy.com:8080")

    # Context manager: automatically start and stop the proxy
    async with HttpProxy("http://proxy.com:8080") as proxy:
        url = proxy.get_local_url()
        # Use the proxy...

asyncio.run(main())
Playwright example:
import asyncio
from proxy_relay import create_proxy_async
from playwright.async_api import ProxySettings, async_playwright
UPSTREAM_PROXY = "socks5://user:pass@proxy.com:1080"
TEST_URL = "https://api.ipify.org/"
async def main():
    # Convert the upstream proxy to a local HTTP proxy
    local_url = await create_proxy_async(UPSTREAM_PROXY, local_type="http")

    async with async_playwright() as p:
        proxy: ProxySettings = {"server": local_url}
        browser = await p.chromium.launch(proxy=proxy, headless=False)
        page = await browser.new_page()
        await page.goto(TEST_URL)
        print(await page.text_content("body"))
        await browser.close()

asyncio.run(main())
Selenium example:
from proxy_relay import create_proxy
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
UPSTREAM_PROXY = "socks5://user:pass@proxy.com:1080"
TEST_URL = "https://api.ipify.org/"
def main():
    # Create a local HTTP proxy
    local_url = create_proxy(UPSTREAM_PROXY, local_type="http")

    options = Options()
    options.add_argument(f"--proxy-server={local_url}")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(TEST_URL)
        print(driver.page_source)
    finally:
        driver.quit()

if __name__ == "__main__":
    main()
DrissionPage example:
from proxy_relay import create_proxy
from DrissionPage import ChromiumPage, ChromiumOptions
UPSTREAM_PROXY = "socks5://user:pass@proxy.com:1080"
TEST_URL = "https://api.ipify.org/"
def main():
    # Create a local HTTP proxy
    local_url = create_proxy(UPSTREAM_PROXY, local_type="http")

    options = ChromiumOptions()
    # DrissionPage accepts a full proxy URL, e.g. http://127.0.0.1:12345
    options.set_proxy(local_url)
    page = ChromiumPage(options)
    page.get(TEST_URL)
    print(page.html)
    page.quit()

if __name__ == "__main__":
    main()
For more detailed integration examples, see docs/integration-examples.md.
Sync API:
# Create a proxy
create_proxy(upstream_url, local_type="http", connect_timeout=30.0, idle_timeout=300.0, timeout=30.0)
create_http_proxy(upstream_url, ...) # Shortcut for HTTP proxies
create_socks5_proxy(upstream_url, ...) # Shortcut for SOCKS5 proxies
# Cleanup (optional, resources are also cleaned on process exit)
cleanup()
Async API:
# Create a proxy
await create_proxy_async(upstream_url, local_type="http", ...)
await create_http_proxy_async(upstream_url, ...)
await create_socks5_proxy_async(upstream_url, ...)
# Context managers
async with HttpProxy(upstream_url) as proxy:
    url = proxy.get_local_url()

async with Socks5Proxy(upstream_url) as proxy:
    url = proxy.get_local_url()
# ProxyManager - manage multiple proxies
import asyncio
from proxy_relay import ProxyManager
async def main():
    async with ProxyManager() as manager:
        url = await manager.create(upstream_url, local_type="http")
        await manager.stop(url)  # Stop a single proxy
        await manager.stop_all()  # Stop all proxies

asyncio.run(main())
Supported conversions:

| Upstream | Local | Example upstream URL |
|---|---|---|
| HTTP/HTTPS | HTTP/SOCKS5 | http://proxy.com:8080 |
| SOCKS5 | HTTP/SOCKS5 | socks5://user:pass@proxy.com:1080 |
| SOCKS5H | HTTP/SOCKS5 | socks5h://proxy.com:1080 |
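For example, a SOCKS5H upstream (DNS resolution handled by the upstream) can be exposed as a local SOCKS5 proxy via the create_socks5_proxy shortcut shown above; a minimal sketch with a placeholder upstream address:
from proxy_relay import create_socks5_proxy

# Relay a SOCKS5H upstream as a local, auth-free SOCKS5 proxy
url = create_socks5_proxy("socks5h://proxy.com:1080")
print(url)  # a local URL, e.g. socks5://127.0.0.1:<port>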
Parameters:
- upstream_url: upstream proxy URL
- local_type: local proxy type ("http" or "socks5")
- connect_timeout: connection timeout in seconds (default 30)
- idle_timeout: idle timeout in seconds (default 300)
- timeout: creation timeout in seconds (default 30)

Multiprocessing: works with spawn mode; in fork mode, it is recommended not to create proxies before forking.
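As a sketch, the timeouts can be tuned per proxy (the values and upstream address below are illustrative):
from proxy_relay import create_proxy

# Fail faster on an unreachable upstream, keep idle tunnels open longer
url = create_proxy(
    "http://proxy.com:8080",
    local_type="http",
    connect_timeout=10.0,  # default 30
    idle_timeout=600.0,    # default 300
)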
FAQ:

Q: Will the proxy keep running?
A: Yes, until the process exits or cleanup() is called.
Q: How to avoid resource leaks in long-running services?
A: Call cleanup() periodically or use ProxyManager to manage proxies.
Q: What's the difference between sync and async APIs?
A: Sync APIs are convenient for simple scripts; async APIs are better suited to async applications such as FastAPI, as sketched below.
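A minimal sketch of the async API inside a FastAPI service, assuming a ProxyManager held for the application's lifetime and a placeholder upstream URL:
from contextlib import asynccontextmanager

from fastapi import FastAPI
from proxy_relay import ProxyManager

@asynccontextmanager
async def lifespan(app: FastAPI):
    # One manager for the whole app; its proxies are released when the context exits
    async with ProxyManager() as manager:
        app.state.proxy_manager = manager
        yield

app = FastAPI(lifespan=lifespan)

@app.get("/local-proxy")
async def local_proxy():
    # Create a local HTTP proxy for a (placeholder) upstream on demand
    url = await app.state.proxy_manager.create(
        "socks5://user:pass@proxy.com:1080", local_type="http"
    )
    return {"local_proxy": url}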
MIT License