har-pilot is a powerful tool designed to run HAR (HTTP Archive) files and perform load testing using these files.
A HAR (HTTP Archive) file is a JSON-formatted archive file format that contains a record of web requests and responses. HAR files are typically generated by web browsers' developer tools and capture detailed information about network interactions.
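Since a HAR file is plain JSON, its structure is easy to inspect programmatically. The sketch below parses a deliberately abbreviated HAR document (real files carry many more fields, such as headers, cookies, and timings) and walks its `log.entries` array:

```python
import json

# Minimal, abbreviated HAR document: the top-level "log" object holds an
# "entries" array, and each entry records one request/response pair.
har_text = """
{
  "log": {
    "version": "1.2",
    "entries": [
      {
        "request": {"method": "GET", "url": "https://example.com/api"},
        "response": {"status": 200}
      }
    ]
  }
}
"""

har = json.loads(har_text)
for entry in har["log"]["entries"]:
    req, res = entry["request"], entry["response"]
    print(req["method"], req["url"], res["status"])
```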
The HAR file format was developed to facilitate the export and analysis of HTTP communication sessions. Initially, it was mainly used for debugging and performance analysis by web developers. Over time, it gained widespread adoption due to its ability to provide a comprehensive view of web transactions.
Although HAR does not have an official RFC, it has become a de facto standard for HTTP session recording. The format was primarily influenced by the Web Performance Working Group at the W3C, which sought to create a unified format for logging network requests and responses.
HAR files are widely supported by major web browsers, including Chrome, Firefox, and Edge. Developer tools in these browsers can export network activity as HAR files, making them accessible for various performance analysis and debugging tools.
- Debugging and Performance Analysis: HAR files help developers debug network issues and analyze the performance of web applications.
- Load Testing: By replaying HAR files, developers can simulate real user interactions and test how their application performs under load.
- Detailed Metrics: HAR files provide detailed metrics about each request and response, including headers, cookies, request timings, and more.
- Run HAR Files: Execute HTTP requests captured in HAR files.
- Load Testing: Perform load testing by running HAR files multiple times.
- Detailed Metrics: Collect and store metrics such as response time, status codes, and response bodies in a SQLite database.
- Concurrency: Handle multiple requests concurrently for efficient load testing.
- Progress Tracking: Track progress with a detailed progress bar.
- S3 Upload: Optionally upload the results to an S3 bucket for storage and further analysis.
har-pilot is designed to help developers test their applications under realistic conditions. By replaying HAR files, which capture actual user interactions with a web application, har-pilot can simulate real-world usage patterns. This allows developers to identify performance bottlenecks, assess the impact of concurrent requests, and ensure their applications can handle the expected load.
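The replay idea above can be modeled simply: expand the HAR's entries into a flat list of requests, repeated once per iteration. This is a simplified sketch of the concept, not har-pilot's actual implementation (the function name and in-memory HAR dict are illustrative):

```python
def build_workload(har: dict, itercount: int) -> list:
    """Expand a parsed HAR document into a flat list of (method, url)
    pairs, repeated once per iteration -- a simplified model of how a
    replay tool might schedule requests for load testing."""
    workload = []
    for _ in range(itercount):
        for entry in har["log"]["entries"]:
            req = entry["request"]
            workload.append((req["method"], req["url"]))
    return workload

# Two captured requests, replayed 5 times -> 10 scheduled requests.
har = {"log": {"entries": [
    {"request": {"method": "GET", "url": "https://example.com/a"}},
    {"request": {"method": "POST", "url": "https://example.com/b"}},
]}}
print(len(build_workload(har, 5)))
```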
To get started with har-pilot, clone the repository and build the project:

git clone https://github.com/copyleftdev/har-pilot.git
cd har-pilot
cargo build
To run a HAR file and execute the HTTP requests contained within it, use the following command:
cargo run -- <path-to-har-file> --itercount <number-of-iterations> [--s3-bucket <bucket-name>]
- `<path-to-har-file>`: The path to the HAR file you want to run.
- `--itercount <number-of-iterations>`: The number of times to run the HAR file for load testing.
- `--s3-bucket <bucket-name>`: (Optional) The name of the S3 bucket to upload the results to.
For example:
# Running without S3 upload
cargo run -- example.har --itercount 5
# Running with S3 upload
cargo run -- example.har --itercount 5 --s3-bucket your-s3-bucket-name
har-pilot stores detailed metrics for each request in a SQLite database. The database file is named with a unique identifier to avoid conflicts:
- URL: The URL of the request.
- Method: The HTTP method used (e.g., GET, POST).
- Response: The response body.
- Status: The HTTP status code.
- Timestamp: The timestamp when the request was made.
- Duration: The response time in milliseconds.
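A table holding these fields can be sketched with Python's built-in `sqlite3` module. The exact schema below is an assumption for illustration; the table name `metrics` and the `timestamp` and `duration_ms` columns follow the sample queries later in this README:

```python
import sqlite3

# Illustrative schema mirroring the fields listed above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        url TEXT NOT NULL,
        method TEXT NOT NULL,
        response TEXT,
        status INTEGER,
        timestamp TEXT NOT NULL,
        duration_ms REAL
    )
""")
conn.execute(
    "INSERT INTO metrics VALUES (?, ?, ?, ?, ?, ?)",
    ("https://example.com/api", "GET", "{}", 200, "2024-01-01 12:00:00", 42.5),
)
row = conn.execute("SELECT url, status, duration_ms FROM metrics").fetchone()
print(row)
```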
You can query the SQLite database to analyze the metrics collected during the load testing. Here are some useful queries:
Average response time per second:

SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    AVG(duration_ms) AS avg_response_time_ms
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
Requests per second (throughput):

SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    COUNT(*) AS total_requests
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
Requests per second together with the average response time:

SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    COUNT(*) AS total_requests,
    AVG(duration_ms) AS avg_response_time_ms
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
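These queries can also be run from a script. The example below seeds an in-memory database with a few rows (the two-column schema is a pared-down assumption for illustration) and runs the combined per-second aggregation:

```python
import sqlite3

# Seed a throwaway database with three requests across two seconds.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (timestamp TEXT, duration_ms REAL)")
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?)",
    [
        ("2024-01-01 12:00:00", 100.0),
        ("2024-01-01 12:00:00", 300.0),
        ("2024-01-01 12:00:01", 50.0),
    ],
)

# Requests per second plus average response time, grouped by second.
rows = conn.execute("""
    SELECT strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
           COUNT(*) AS total_requests,
           AVG(duration_ms) AS avg_response_time_ms
    FROM metrics
    GROUP BY second
    ORDER BY second
""").fetchall()
for row in rows:
    print(row)
```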