A powerful Python script for checking multiple endpoints across multiple websites with advanced filtering, detailed host information, and flexible output options.
- Batch Checking: Test multiple endpoints across multiple websites in one run
- Advanced Filtering: Filter results by HTTP status codes, keywords, or ranges (e.g., 2xx, 4xx)
- Host Information: Collect IP addresses, page titles, server headers, and redirect locations
- Flexible Input: Accept input from command-line arguments, files, or stdin
- Multiple Output Formats: Display results in console, save to text files, or export to CSV
- Customizable: Set timeouts, delays between requests, and custom User-Agent strings
- Keyword Search: Search for specific keywords in response content
- Python 3.6 or higher
- Dependencies listed in `requirements.txt`
- Clone or download this repository
- Install required packages:
```
pip install -r requirements.txt
```

Check endpoints with inline parameters:

```
python endpoint_checker.py -e "/health,/status" -W "site1.com,site2.com"
```

Check endpoints from files:

```
python endpoint_checker.py -ef endpoints.txt -w websites.txt
```

Check with detailed output:

```
python endpoint_checker.py -e "/health" -W "site1.com" --details --host-info
```

Filter by specific status codes:

```
python endpoint_checker.py -ef endpoints.txt -w websites.txt --status-codes "200,201,301"
```

Filter by status code ranges:

```
python endpoint_checker.py -ef endpoints.txt -w websites.txt --status-codes "2xx,3xx"
```

Filter by keyword in response:

```
python endpoint_checker.py -e "/admin" -w websites.txt -k "login" --status-codes "200"
```

Save URLs only to a text file:

```
python endpoint_checker.py -ef endpoints.txt -w websites.txt --save-txt urls.txt --urls-only
```

Save minimal details to CSV:

```
python endpoint_checker.py -e "/health" -w websites.txt --save-csv results.csv --minimal-save
```

Save full details with host info:

```
python endpoint_checker.py -e "/health" -w websites.txt --host-info --save-csv detailed.csv
```

Save a formatted report:

```
python endpoint_checker.py -e "/status" -w websites.txt --save-txt report.txt
```

Set timeout and delay:

```
python endpoint_checker.py -e "/api" -w websites.txt -t 15 -d 0.5
```

Custom User-Agent:

```
python endpoint_checker.py -e "/health" -w websites.txt --user-agent "MyBot/1.0"
```

| Option | Description |
|---|---|
| `-e, --endpoints` | Comma-separated endpoints (e.g., `"/health,/status"`) |
| `-ef, --endpoints-file` | File containing endpoints (one per line) |
| `-w, --websites-file` | File containing websites (one per line) |
| `-W, --websites` | Comma-separated websites (e.g., `"site1.com,site2.com"`) |
| Option | Description |
|---|---|
| `--status-codes` | Filter by HTTP status codes (e.g., `"200,301"` or `"2xx,4xx"`) |
| `-k, --keyword` | Keyword to search for in response content |
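The range syntax accepted by `--status-codes` amounts to simple prefix matching on the hundreds digit. A minimal sketch (the helper name `matches_status` is illustrative, not the script's actual function):

```python
def matches_status(code, patterns):
    """Return True if an HTTP status code matches any comma-separated
    pattern: exact codes ("200,301") or ranges written as "2xx", "4xx".
    """
    for pattern in patterns.split(","):
        pattern = pattern.strip().lower()
        if pattern.endswith("xx") and pattern[:-2].isdigit():
            # "2xx" matches 200-299, "4xx" matches 400-499, and so on
            if code // 100 == int(pattern[:-2]):
                return True
        elif pattern.isdigit() and code == int(pattern):
            return True
    return False
```

For example, `matches_status(204, "2xx,301")` is true while `matches_status(404, "2xx,301")` is false.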
| Option | Description |
|---|---|
| `--details` | Show detailed output with response time and size |
| `--host-info` | Include host information (IP, title, server, location) |
| `--save-txt FILENAME` | Save results to a text file |
| `--save-csv FILENAME` | Save results to a CSV file |
| `--urls-only` | Save only URLs to the text file (no other data) |
| `--minimal-save` | Save minimal data to files (excludes host info) |
| Option | Description |
|---|---|
| `-t, --timeout` | Request timeout in seconds (default: 10) |
| `-d, --delay` | Delay between requests in seconds (default: 0) |
| `--user-agent` | Custom User-Agent string |
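Internally, the timeout and delay options amount to a loop like the following (a minimal sketch; `run_checks` and its signature are illustrative, not the script's actual API):

```python
import time

def run_checks(urls, check, timeout=10.0, delay=0.0):
    """Run `check` over many URLs, honoring -t/--timeout and -d/--delay.

    `check` is any callable taking (url, timeout); the timeout is forwarded
    to each request, and the delay throttles the loop between requests.
    """
    results = []
    for url in urls:
        results.append(check(url, timeout=timeout))
        if delay:
            time.sleep(delay)  # be polite: space out consecutive requests
    return results
```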
Example endpoints file:

```
/health
/status
/api/v1
/admin
```

Example websites file:

```
example.com
sub.example.com
another-site.org
```
Lines starting with `#` are treated as comments and ignored. Empty lines are also ignored.
Default output:

```
https://example.com/health -> 200 OK
https://example.com/status -> 200 OK
```

With `--details`:

```
https://example.com/health [200 OK] (0.25s, 1024 bytes)
```

With `--host-info`:

```
URL: https://example.com/health
Status: 200 OK
Response Time: 0.25s
Content Length: 1024 bytes
IP Address: 93.184.216.34
Title: Health Check
Server: nginx/1.18.0
```
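The host-information fields could be gathered roughly like this (a sketch only; `collect_host_info` is an illustrative name, and the real script may read these values straight from its HTTP library):

```python
import re
import socket
from urllib.parse import urlparse

def collect_host_info(url, body, headers):
    """Derive the extra host fields from an already-fetched response.

    `body` is the response text and `headers` a plain dict here.
    """
    host = urlparse(url).hostname
    try:
        ip = socket.gethostbyname(host) if host else None
    except socket.gaierror:
        ip = None  # DNS resolution failed; report no IP rather than crash
    match = re.search(r"<title[^>]*>(.*?)</title>", body,
                      re.IGNORECASE | re.DOTALL)
    return {
        "ip_address": ip,
        "title": match.group(1).strip() if match else None,
        "server": headers.get("Server"),
        "location": headers.get("Location"),
    }
```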
The CSV file includes columns:
- website
- endpoint
- url
- status_code
- status_text
- response_time
- content_length
- ip_address
- title
- server
- location
- keyword_found
- Text files: Save full detailed output with all collected data (unless `--urls-only` is specified)
- CSV files: Save all available columns, including host info if collected
- Status codes: No filtering applied unless specified
- Timeout: 10 seconds per request
- Delay: No delay between requests (0 seconds)
- Health Monitoring: Check if critical endpoints are responding
- Migration Testing: Verify endpoints work across old and new domains
- Security Auditing: Find exposed admin panels or sensitive endpoints
- Load Testing Prep: Identify which endpoints are available before testing
- Documentation: Generate lists of available endpoints for API documentation
- Redirect Verification: Check if redirects are configured correctly
- Use delays (`-d`) when checking many endpoints to avoid overwhelming servers
- Start with a longer timeout (`-t`) for slower servers
- Use `--host-info` sparingly, as it adds overhead for DNS lookups and parsing
- Save results to files for later analysis
- Use status code ranges (2xx, 3xx) for broader filtering
The script handles various errors gracefully:
- Network timeouts
- DNS resolution failures
- Connection errors
- Invalid URLs
- File not found errors
Errors are reported in the summary but don't stop the entire check process.
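That catch-and-continue pattern can be sketched with the standard library alone (the script itself presumably uses `requests` per `requirements.txt`; names here are illustrative):

```python
import socket
import urllib.error
import urllib.request

def check_url(url, timeout=10):
    """Fetch a URL, returning (status_code, error_label).

    Exactly one element of the pair is None: errors never raise,
    they are recorded so the batch can keep going.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, None
    except urllib.error.HTTPError as exc:
        return exc.code, None            # 4xx/5xx still carry a status
    except urllib.error.URLError as exc:
        if isinstance(exc.reason, socket.timeout):
            return None, "timeout"
        return None, "connection error"  # DNS failure, refused, unreachable
    except TimeoutError:
        return None, "timeout"           # read timed out mid-response
    except ValueError:
        return None, "invalid URL"
```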
This script is provided as-is for educational and practical use.
Feel free to submit issues, fork the repository, and create pull requests for any improvements.
- v3.0: Added host information collection, CSV export, advanced filtering, and improved output options
- v2.0: Added file input support and keyword filtering
- v1.0: Initial release with basic endpoint checking