
Proxy Reaper - Documentation


Proxy Reaper is a powerful tool for checking proxy servers for availability, speed and anonymity. It supports various protocols such as HTTP, HTTPS, SOCKS4, and SOCKS5, and offers enhanced features for managing and testing proxies efficiently.

Table of Contents

  1. Installation
  2. Installing OS-wide (Debian-based distributions)
  3. Basic Usage
  4. Command Line Arguments
  5. Configuration File
  6. Proxy Formats and Sources
  7. Speed Categories
  8. Anonymity Levels
  9. Filter Options
  10. Output Formats
  11. Advanced Features
  12. Troubleshooting
  13. Examples

Installation

Prerequisites

  • Python 3.6 or higher
  • Required Python packages:
    • requests
    • PySocks
    • colorama

Download and Installation

# Clone the repository (if using Git)
git clone https://github.com/rtulke/proxyreaper.git
cd proxyreaper

# Or download the script directly
wget https://raw.githubusercontent.com/rtulke/proxyreaper/main/proxyreaper.py
chmod +x proxyreaper.py

Installing Dependencies

pip install requests PySocks colorama

or use the requirements.txt file:

Linux / macOS

cd proxyreaper
chmod +x proxyreaper.py
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

Windows

cd proxyreaper
python -m venv venv
venv\Scripts\activate
pip install -r requirements.txt

Installing OS-wide

The system-wide installation works for most Debian-based distributions, such as Debian, Ubuntu, Mint, Raspberry Pi OS, and Kali Linux.

# start as root or try: "sudo su -" or "sudo -i"
su - root
cd ~

# create dev directory for development stuff if needed
mkdir dev
cd dev

# download via git
git clone https://github.com/rtulke/proxyreaper.git
cd proxyreaper
chmod +x proxyreaper.py

# install dependencies
apt install python3-requests python3-socks python3-colorama

# copy proxyreaper script to `/usr/local/bin`
cp proxyreaper.py /usr/local/bin/proxyreaper

# install the man page and update the mandb index
cp proxyreaper.1 /usr/local/share/man/man1/
mandb

# use the proxyreaper script from any directory
proxyreaper -h

# generate new config file
proxyreaper --config

# you can also try to edit the new generated configuration file
vim ~/.proxyreaper.conf

# try using the manual
man proxyreaper

Basic Usage

# Test a single proxy
python proxyreaper.py https://www.google.com -p 1.2.3.4:8080

# Test multiple proxies from a file
python proxyreaper.py https://www.google.com -p proxies.txt

# Test multiple files using glob patterns
python proxyreaper.py https://www.google.com -p "proxies/*.txt"

# Create a default configuration file
python proxyreaper.py --config

# Use automatic mode to download proxies from URLs defined in config [proxysources]
python proxyreaper.py https://www.google.com -A

# Filter and save only ultrafast proxies from Germany
python proxyreaper.py https://www.google.com -p proxies.txt --filter-status ultrafast --filter-country de

Command Line Arguments

Core Arguments

| Argument | Short | Type | Description |
|----------|-------|------|-------------|
| url | - | positional | URL to test the proxies against |
| --proxy | -p | string | Proxy, file with proxies, or glob pattern (e.g., *.txt, proxies[1-5].txt) |
| --timeout | -t | integer | Timeout in seconds (default from config) |
| --output | -o | choice | Save results format: json, csv, or sqlite (default: csv) |
| --response-time | -R | float | Filter for fast proxies (maximum response time in milliseconds) |
| --concurrent | -c | integer | Number of concurrent checks (default: 10) |
| --debug | -d | flag | Enable detailed debug output |
| --version | -v | flag | Display version information and exit |
| --automatic-mode | -A | flag | Download proxy lists from configured URLs |
| --config | -C | flag | Create default config file in ~/.proxyreaper.conf |
| --reverse-lookup | -l | flag | Enable reverse DNS lookup for proxy IPs (slower but shows hostnames) |

Filter Arguments

| Argument | Type | Values | Description |
|----------|------|--------|-------------|
| --filter-status | multi | ultrafast, fast, medium, slow | Filter by speed category (can combine multiple) |
| --filter-anonymity | multi | highanonymous, anonymous, headerleak, transparent | Filter by anonymity level (can combine multiple) |
| --filter-protocol | multi | http, https, socks4, socks5 | Filter by protocol type (can combine multiple) |
| --filter-country | multi | ISO codes | Filter by country code, e.g., de us uk fr (can combine multiple) |
| --filter-tld | multi | country codes | Filter by country TLD based on GeoIP, e.g., de us uk (can combine multiple) |

Configuration File

Proxy Reaper supports configuration files to store frequently used settings. The configuration files are searched in the following order:

  1. ~/.proxyreaper.conf (user-specific configuration)
  2. /etc/proxyreaper.conf (system-wide configuration)
  3. Default values (if no configuration file is found)
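
The lookup order above can be sketched in a few lines; `load_config` and the seeded defaults are illustrative, not Proxy Reaper's internal API:

```python
import configparser
from pathlib import Path

# Documented search order: user config first, then system-wide config.
CONFIG_PATHS = [Path.home() / ".proxyreaper.conf", Path("/etc/proxyreaper.conf")]

# Built-in fallbacks used when no config file exists (values from the README).
DEFAULTS = {"timeout": "5", "concurrent": "10"}

def load_config(paths=CONFIG_PATHS):
    parser = configparser.ConfigParser()
    parser["general"] = dict(DEFAULTS)  # seed defaults first
    for path in paths:
        if path.is_file():
            parser.read(path)           # first existing file wins
            break
    return parser
```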

Creating a Configuration File

You can create a default configuration file using:

python proxyreaper.py --config

This will create a file at ~/.proxyreaper.conf with default values.

Configuration File Format

The configuration file uses the INI format with the following sections:

[general]
timeout = 5
concurrent = 10
response_time_filter = 1000
test_url = https://www.google.com

[output]
format = csv
save_directory = results

[proxysources]
urls = https://raw.githubusercontent.com/username/proxy-list/main/proxies.txt, https://some-proxy-list.com/proxies.txt

[advanced]
debug = false
anonymity_check_url = https://httpbin.org/get
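
Reading this file with Python's standard `configparser` could look like the sketch below; the section and option names come from the README, while the sample URLs and variable names are placeholders:

```python
import configparser

# Abbreviated sample in the documented format; URLs are placeholders.
SAMPLE = """
[general]
timeout = 5
concurrent = 10

[proxysources]
urls = https://a.example/p1.txt, https://b.example/p2.txt
"""

parser = configparser.ConfigParser()
parser.read_string(SAMPLE)

timeout = parser.getint("general", "timeout", fallback=5)
concurrent = parser.getint("general", "concurrent", fallback=10)
# The comma-separated URL list becomes a Python list:
urls = [u.strip()
        for u in parser.get("proxysources", "urls", fallback="").split(",")
        if u.strip()]
```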

Configuration Sections Explained

[general]

  • timeout: Connection timeout in seconds (default: 5)
  • concurrent: Number of concurrent proxy checks (default: 10)
  • response_time_filter: Maximum response time in milliseconds for filtering (default: 1000)
  • test_url: URL to use for testing proxies (default: https://www.google.com)

[output]

  • format: Default output format - json, csv, or sqlite (default: csv)
  • save_directory: Directory to save results (default: results)

[proxysources]

  • urls: Comma-separated list of URLs to download proxy lists from when using automatic mode

[advanced]

  • debug: Enable detailed debug output by default (true/false)
  • anonymity_check_url: URL to use for anonymity checks (default: https://httpbin.org/get)
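
The README does not show how the anonymity check itself works. A common approach is to fetch the anonymity_check_url through the proxy and inspect what the echo service reports back; the sketch below classifies an echoed header set into three of the documented levels (`classify` and `PROXY_HEADERS` are illustrative, and the four README levels are collapsed to three here):

```python
# Headers that typically reveal a proxy is in the request path.
PROXY_HEADERS = {"Via", "X-Forwarded-For", "Forwarded", "X-Real-Ip"}

def classify(echoed_headers, real_ip):
    """Illustrative anonymity classification from headers echoed by
    an httpbin-style endpoint, given our real public IP."""
    values = " ".join(echoed_headers.values())
    if real_ip in values:
        return "transparent"    # our own address leaked through the proxy
    if PROXY_HEADERS & {h.title() for h in echoed_headers}:
        return "headerleak"     # proxy identifies itself in headers
    return "highanonymous"      # nothing reveals a proxy is in use
```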

Proxy Formats and Sources

Supported Proxy Formats

Proxy Reaper supports several proxy formats:

  • host:port (e.g., 127.0.0.1:8080) - Defaults to HTTP protocol
  • protocol://host:port (e.g., http://127.0.0.1:8080)
  • protocol://username:password@host:port (e.g., http://user:pass@127.0.0.1:8080)

Supported protocols:

  • HTTP
  • HTTPS
  • SOCKS4
  • SOCKS5
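
The formats above can be parsed with the standard library alone; `parse_proxy` is an illustrative helper, not part of Proxy Reaper's documented API:

```python
from urllib.parse import urlparse

def parse_proxy(spec, default_scheme="http"):
    """Split a proxy spec into its parts, defaulting bare host:port to HTTP."""
    if "://" not in spec:
        spec = f"{default_scheme}://{spec}"
    parsed = urlparse(spec)
    return {
        "protocol": parsed.scheme,
        "host": parsed.hostname,
        "port": parsed.port,
        "username": parsed.username,
        "password": parsed.password,
    }
```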

Proxy Input Methods

1. Single Proxy

Directly specify a proxy on the command line:

python proxyreaper.py https://www.google.com -p 127.0.0.1:8080

2. Multiple Proxies (Comma-separated)

Use comma-separated list:

python proxyreaper.py https://www.google.com -p "127.0.0.1:8080,192.168.1.1:3128"

3. Single Text File

Provide a file with one proxy per line:

python proxyreaper.py https://www.google.com -p proxies.txt

4. Multiple Files (Glob Patterns)

Use glob patterns to match multiple files:

# All .txt files in a directory
python proxyreaper.py https://www.google.com -p "proxies/*.txt"

# Files matching a specific pattern
python proxyreaper.py https://www.google.com -p "proxylist*.txt"

# Files with numbered ranges
python proxyreaper.py https://www.google.com -p "proxylist[1-5].txt"

# Complex patterns
python proxyreaper.py https://www.google.com -p "../sources/proxy_*.txt"

Supported glob patterns:

  • * - Matches any characters
  • ? - Matches single character
  • [1-5] - Matches range of characters
  • [abc] - Matches specific characters
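
Expanding such a pattern is standard-library territory; the sketch below shows one way the -p value could be resolved (`expand_sources` is an illustrative name, not Proxy Reaper's internal API):

```python
import glob
import os
import tempfile

def expand_sources(pattern):
    """Expand a glob pattern to file paths; fall back to the literal value
    so a plain proxy like 1.2.3.4:8080 still passes through unchanged."""
    matches = sorted(glob.glob(pattern))
    return matches if matches else [pattern]

# Demonstrate against a throwaway directory with two numbered list files.
workdir = tempfile.mkdtemp()
for name in ("proxylist1.txt", "proxylist2.txt"):
    open(os.path.join(workdir, name), "w").close()

numbered = expand_sources(os.path.join(workdir, "proxylist[1-2].txt"))
```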

5. Automatic Download

Use automatic mode to download proxies from URLs specified in the configuration:

python proxyreaper.py https://www.google.com -A

Example Proxy List File

A proxy list file (e.g., proxies.txt) should contain one proxy per line:

# HTTP proxies
http://192.168.1.1:8080
http://user:pass@192.168.1.2:8080

# HTTPS proxies
https://192.168.1.3:443

# SOCKS proxies
socks4://192.168.1.4:1080
socks5://192.168.1.5:1080

# Without protocol (defaults to HTTP)
192.168.1.6:8080

Lines starting with # are treated as comments and ignored.
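
A minimal loader matching this file format — one proxy per line, with "#" comments and blank lines ignored — could look like this (`load_proxies` is illustrative):

```python
def load_proxies(text):
    """Return the proxy entries from a proxy-list file's contents,
    skipping blank lines and '#' comment lines."""
    proxies = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            proxies.append(line)
    return proxies
```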

Speed Categories

Proxy Reaper categorizes proxies into four speed categories based on their response time:

| Category | Response Time | Description | Use Case |
|----------|---------------|-------------|----------|
| Ultrafast | < 100ms | Extremely fast proxies | Real-time applications, streaming, gaming |
| Fast | 100-500ms | Fast proxies | Web browsing, API calls, general use |
| Medium | 500-1000ms | Medium speed proxies | Background tasks, batch processing |
| Slow | > 1000ms | Slow proxies | Non-time-critical tasks |
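
The thresholds in the table map directly to a small classifier; `speed_category` below is an illustrative helper, not Proxy Reaper's internal name:

```python
def speed_category(response_ms):
    """Map a response time in milliseconds to the documented speed category."""
    if response_ms < 100:
        return "ultrafast"
    if response_ms <= 500:
        return "fast"
    if response_ms <= 1000:
        return "medium"
    return "slow"
```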

Speed Category Output

The speed category is included in all output formats:

CSV Output:

proxy,hostname,status,speed_category,response_time,country,city,anonymity,protocol,check_time
http://1.2.3.4:8080,1.2.3.4,working,ultrafast,87.5,Germany,Berlin,High Anonymous,http,2025-10-31 14:30:45

Console Output:

[1/100] ULTRAFAST - http://1.2.3.4:8080 (Germany, Berlin, High Anonymous) - 87 ms
