python-proxy-headers
Handle custom proxy headers for HTTP & HTTPS requests in various Python libraries.
Extensions for Python HTTP libraries to support sending and receiving custom proxy headers during HTTPS CONNECT tunneling.
When making HTTPS requests through a proxy, the connection is established via a CONNECT tunnel. During this process:
- **Sending headers to the proxy.** Most Python HTTP libraries don't provide an easy way to send custom headers (such as `X-ProxyMesh-Country`) to the proxy server during the CONNECT handshake.
- **Receiving headers from the proxy.** The headers in the proxy's response to the CONNECT request are typically discarded, making it impossible to read custom headers (such as `X-ProxyMesh-IP`) that the proxy sends back.
This library solves both problems for popular Python HTTP libraries.
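For context, here is a minimal sketch of the CONNECT exchange these extensions hook into, using plain sockets and no TLS; the proxy host and the country code are placeholders:

```python
import socket

PROXY_HOST, PROXY_PORT = 'proxy.example.com', 8080   # placeholder proxy
TARGET = 'httpbin.org:443'

# Custom header sent TO the proxy as part of the CONNECT handshake.
connect_request = (
    f'CONNECT {TARGET} HTTP/1.1\r\n'
    f'Host: {TARGET}\r\n'
    'X-ProxyMesh-Country: US\r\n'
    '\r\n'
)

with socket.create_connection((PROXY_HOST, PROXY_PORT)) as sock:
    sock.sendall(connect_request.encode('ascii'))
    # The proxy's reply (e.g. "HTTP/1.1 200 Connection established") can carry
    # headers such as X-ProxyMesh-IP. Most HTTP libraries discard this reply,
    # which is the gap this package fills.
    reply = sock.recv(4096).decode('latin-1')
    print(reply.split('\r\n\r\n')[0])
    # A real client would now wrap `sock` in TLS and speak HTTPS to the target
    # through the established tunnel.
```

Supported libraries and the corresponding modules: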
| Library | Module | Use Case |
|---|---|---|
| urllib3 | urllib3_proxy_manager | Low-level HTTP client |
| requests | requests_adapter | Simple HTTP requests |
| aiohttp | aiohttp_proxy | Async HTTP client |
| httpx | httpx_proxy | Modern HTTP client |
| pycurl | pycurl_proxy | libcurl bindings |
| cloudscraper | cloudscraper_proxy | Cloudflare bypass |
| autoscraper | autoscraper_proxy | Automatic web scraping |
```bash
pip install python-proxy-headers
```

Then install the HTTP library you want to use (e.g., `pip install requests`).

Note: This package has no dependencies by default; install only what you need.
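As a quick sanity check that the package and the integrations you installed are importable (module names taken from the table above; adjust the list to match what you installed):

```python
# Import the integration modules you plan to use.
import importlib

for module in ('python_proxy_headers.requests_adapter',
               'python_proxy_headers.httpx_proxy'):
    importlib.import_module(module)
    print(f'{module} OK')
```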
With `requests`:

```python
from python_proxy_headers.requests_adapter import ProxySession

with ProxySession(proxy_headers={'X-ProxyMesh-Country': 'US'}) as session:
    session.proxies = {'https': 'http://user:pass@proxy.example.com:8080'}
    response = session.get('https://httpbin.org/ip')
    # Proxy headers are merged into response.headers
    print(response.headers.get('X-ProxyMesh-IP'))
```
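Because `ProxySession` takes `proxy_headers` at construction time, one way to target different exit countries is to create a session per header set. A hypothetical sketch built only on the calls shown above (the proxy URL and country codes are placeholders, and header semantics depend on your proxy):

```python
from python_proxy_headers.requests_adapter import ProxySession

PROXY = 'http://user:pass@proxy.example.com:8080'   # placeholder

def fetch_exit_ip(country):
    # One session per country, since proxy_headers is fixed when the session is created.
    with ProxySession(proxy_headers={'X-ProxyMesh-Country': country}) as session:
        session.proxies = {'https': PROXY}
        response = session.get('https://httpbin.org/ip')
        return response.headers.get('X-ProxyMesh-IP')

for country in ('US', 'DE', 'JP'):
    print(country, fetch_exit_ip(country))
```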
With `httpx`:

```python
from python_proxy_headers.httpx_proxy import get

response = get(
    'https://httpbin.org/ip',
    proxy='http://user:pass@proxy.example.com:8080'
)
# Proxy CONNECT response headers are merged into response.headers
print(response.headers.get('X-ProxyMesh-IP'))
```
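A small hypothetical helper built on the `get` wrapper above, collecting the exit IPs the proxy reports across several requests (the function name and proxy URL are placeholders):

```python
from python_proxy_headers.httpx_proxy import get

PROXY = 'http://user:pass@proxy.example.com:8080'   # placeholder

def sample_exit_ips(n=3):
    """Fetch through the proxy n times and collect the reported exit IPs."""
    ips = set()
    for _ in range(n):
        response = get('https://httpbin.org/ip', proxy=PROXY)
        # X-ProxyMesh-IP comes from the proxy's CONNECT response headers.
        ips.add(response.headers.get('X-ProxyMesh-IP'))
    return ips

print(sample_exit_ips())
```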
With `aiohttp`:

```python
import asyncio

from python_proxy_headers.aiohttp_proxy import ProxyClientSession

async def main():
    async with ProxyClientSession() as session:
        async with session.get(
            'https://httpbin.org/ip',
            proxy='http://user:pass@proxy.example.com:8080'
        ) as response:
            # Proxy headers merged into response.headers
            print(response.headers.get('X-ProxyMesh-IP'))

asyncio.run(main())
```
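In the example above `ProxyClientSession` is used like a regular `aiohttp.ClientSession`, so concurrent requests through the proxy could be sketched with `asyncio.gather` (a hypothetical example; the proxy URL is a placeholder):

```python
import asyncio

from python_proxy_headers.aiohttp_proxy import ProxyClientSession

PROXY = 'http://user:pass@proxy.example.com:8080'   # placeholder

async def fetch_exit_ip(session):
    async with session.get('https://httpbin.org/ip', proxy=PROXY) as response:
        # Proxy CONNECT headers are merged into response.headers.
        return response.headers.get('X-ProxyMesh-IP')

async def main():
    async with ProxyClientSession() as session:
        ips = await asyncio.gather(*(fetch_exit_ip(session) for _ in range(3)))
        print(ips)

asyncio.run(main())
```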
With `pycurl`:

```python
import pycurl

from python_proxy_headers.pycurl_proxy import set_proxy_headers, HeaderCapture

c = pycurl.Curl()
c.setopt(pycurl.URL, 'https://httpbin.org/ip')
c.setopt(pycurl.PROXY, 'http://proxy.example.com:8080')

# Add these two lines to any existing pycurl code
set_proxy_headers(c, {'X-ProxyMesh-Country': 'US'})
capture = HeaderCapture(c)

c.perform()
print(capture.proxy_headers)  # Headers from proxy CONNECT response
c.close()
```
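To keep the response body out of stdout while still reading the captured proxy headers, the standard pycurl `WRITEDATA` option can be combined with the same two calls (a sketch; the proxy URL is a placeholder):

```python
from io import BytesIO

import pycurl

from python_proxy_headers.pycurl_proxy import set_proxy_headers, HeaderCapture

body = BytesIO()

c = pycurl.Curl()
c.setopt(pycurl.URL, 'https://httpbin.org/ip')
c.setopt(pycurl.PROXY, 'http://proxy.example.com:8080')
c.setopt(pycurl.WRITEDATA, body)          # collect the body instead of printing it

set_proxy_headers(c, {'X-ProxyMesh-Country': 'US'})
capture = HeaderCapture(c)

c.perform()
print(capture.proxy_headers)              # headers from the proxy CONNECT response
print(body.getvalue().decode('utf-8'))    # response body from the target
c.close()
```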
With `cloudscraper`:

```python
from python_proxy_headers.cloudscraper_proxy import create_scraper

# Drop-in replacement for cloudscraper.create_scraper()
scraper = create_scraper(proxy_headers={'X-ProxyMesh-Country': 'US'})
scraper.proxies = {'https': 'http://proxy.example.com:8080'}
response = scraper.get('https://example.com')
# All CloudScraper features (Cloudflare bypass) preserved
```
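As in the other examples, `proxy_headers` is passed when the scraper is created, so targeting a different country means creating another scraper. A hypothetical sketch using only the calls shown above (the proxy URL and country codes are placeholders):

```python
from python_proxy_headers.cloudscraper_proxy import create_scraper

PROXY = 'http://proxy.example.com:8080'   # placeholder

def scrape_from(country, url):
    # proxy_headers is fixed at creation time, so use one scraper per country.
    scraper = create_scraper(proxy_headers={'X-ProxyMesh-Country': country})
    scraper.proxies = {'https': PROXY}
    return scraper.get(url)

response = scrape_from('US', 'https://example.com')
print(response.status_code)
```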
A test harness is included to verify proxy header functionality:
```bash
# Set your proxy
export PROXY_URL='http://user:pass@proxy.example.com:8080'

# Test all modules
python test_proxy_headers.py

# Test specific modules
python test_proxy_headers.py requests httpx

# Verbose output (show header values)
python test_proxy_headers.py -v
```
Detailed documentation, an API reference, and more examples are available in the project documentation.
Created by ProxyMesh to help our customers use custom headers to control proxy behavior. Works with any proxy that supports custom headers.
MIT License