
NOTE: This repository has been archived and is no longer maintained after urllib3==1.26.2
The ProxyRequests class first scrapes proxies from the web. If the initial request through a proxy is unsuccessful, it recursively retries with the next proxy until one succeeds.
Either copy the code and place it where you want it, or install via pip:
pip install proxy-requests
(or pip3)
from proxy_requests import ProxyRequests
or if you need the Basic Auth subclass as well:
from proxy_requests import ProxyRequests, ProxyRequestsBasicAuth
If the above import statement is used, method calls will be identical to the ones shown below. Pass a fully qualified URL when initializing an instance.
System Requirements: Python 3 and the requests module.
Runs on Linux and Windows (and probably macOS). A request may take a moment to complete depending on the current proxy.
Each request with a proxy is set with a 3-second timeout; if a request takes too long, the next proxy socket in the queue is tried.
Proxies are randomly popped from the queue.
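The scrape-then-retry behavior described above can be pictured with a minimal sketch. This is a simplified illustration, not the library's actual implementation: `fetch_with_proxies` and its `fetch` callable are hypothetical stand-ins for the internal logic that wraps requests.get with a proxy and a timeout.

```python
import random

def fetch_with_proxies(url, proxies, fetch, timeout=3):
    """Try randomly popped proxies from the pool until one request succeeds.

    `fetch` is a stand-in for something like
    requests.get(url, proxies={...}, timeout=timeout); it should raise on
    failure. Returns (response, proxy_used).
    """
    pool = list(proxies)
    while pool:
        # Randomly pop a proxy from the queue, as the library does.
        proxy = pool.pop(random.randrange(len(pool)))
        try:
            return fetch(url, proxy, timeout), proxy
        except Exception:
            continue  # that proxy failed; fall through to the next one
    raise RuntimeError('all scraped proxies failed')
```

Retrying with a fresh random proxy on each failure is what makes the first successful response "cost" a variable amount of time, which is why a call may take a moment to return.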
The ProxyRequestsBasicAuth subclass has the methods get(), get_with_headers(), post(), post_with_headers(), post_file(), and post_file_with_headers(), which override the parent methods.
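The override relationship can be sketched as a minimal class hierarchy. This is a hypothetical simplification (the real classes also take a URL and manage the proxy queue); it only shows that the subclass keeps the same method names while adding credentials.

```python
class ProxyRequests:
    def get(self):
        # Parent: a plain GET through a scraped proxy (simplified).
        return 'GET without auth'

class ProxyRequestsBasicAuth(ProxyRequests):
    def __init__(self, username, password):
        self.auth = (username, password)

    def get(self):
        # Child override: identical call signature, but attaches Basic Auth.
        return 'GET with auth for ' + self.auth[0]
```

Because the signatures match, the method calls shown below are identical for both classes; only the constructor arguments differ.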
GET:
r = ProxyRequests('https://api.ipify.org')
r.get()
GET with headers:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequests('url here')
r.set_headers(h)
r.get_with_headers()
POST:
r = ProxyRequests('url here')
r.post({'key1': 'value1', 'key2': 'value2'})
POST with headers:
r = ProxyRequests('url here')
r.set_headers({'name': 'rootVIII', 'secret_message': '7Yufs9KIfj33d'})
r.post_with_headers({'key1': 'value1', 'key2': 'value2'})
POST FILE:
r = ProxyRequests('url here')
r.set_file('test.txt')
r.post_file()
POST FILE with headers:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequests('url here')
r.set_headers(h)
r.set_file('test.txt')
r.post_file_with_headers()
GET with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.get()
GET with headers & Basic Authentication:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers(h)
r.get_with_headers()
POST with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.post({'key1': 'value1', 'key2': 'value2'})
POST with headers & Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers({'header_key': 'header_value'})
r.post_with_headers({'key1': 'value1', 'key2': 'value2'})
POST FILE with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_file('test.txt')
r.post_file()
POST FILE with headers & Basic Authentication:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers(h)
r.set_file('test.txt')
r.post_file_with_headers()
Response Methods
Returns a string:
print(r)
Or if you want the raw content as bytes:
r.get_raw()
Get the response as JSON (if valid JSON):
r.get_json()
Get the response headers:
print(r.get_headers())
Get the status code:
print(r.get_status_code())
Get the URL that was requested:
print(r.get_url())
Get the proxy that was used to make the request:
print(r.get_proxy_used())
To write raw data to a file (including an image):
url = 'https://www.restwords.com/static/ICON.png'
r = ProxyRequests(url)
r.get()
with open('out.png', 'wb') as f:
    f.write(r.get_raw())
To write the JSON response to a file:
import json
with open('test.txt', 'w') as f:
    json.dump(r.get_json(), f)