har-pilot

[har-pilot logo]

har-pilot is a tool for replaying HAR (HTTP Archive) files and using them to load-test web applications.

What is a HAR File?

A HAR (HTTP Archive) file is a JSON-formatted archive of web requests and responses. HAR files are typically generated by web browsers' developer tools and capture detailed information about each network interaction, such as headers, bodies, and timings.
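
For illustration, a heavily trimmed HAR 1.2 document looks roughly like this (real exports contain many more fields per entry, and the values here are made up):

{
  "log": {
    "version": "1.2",
    "creator": { "name": "browser devtools", "version": "1.0" },
    "entries": [
      {
        "startedDateTime": "2024-01-01T12:00:00.000Z",
        "time": 42.5,
        "request": {
          "method": "GET",
          "url": "https://example.com/api/items",
          "httpVersion": "HTTP/1.1",
          "headers": [{ "name": "Accept", "value": "application/json" }]
        },
        "response": {
          "status": 200,
          "statusText": "OK",
          "headers": [{ "name": "Content-Type", "value": "application/json" }]
        },
        "timings": { "send": 0.2, "wait": 35.1, "receive": 7.2 }
      }
    ]
  }
}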

Value and History

The HAR file format was developed to facilitate the export and analysis of HTTP communication sessions. Initially, it was mainly used for debugging and performance analysis by web developers. Over time, it gained widespread adoption due to its ability to provide a comprehensive view of web transactions.

RFC Details

Although HAR does not have an official RFC, it has become a de facto standard for HTTP session recording. The format was primarily influenced by the Web Performance Working Group at the W3C, which sought to create a unified format for logging network requests and responses.

Adoption

HAR files are widely supported by major web browsers, including Chrome, Firefox, and Edge. Developer tools in these browsers can export network activity as HAR files, making them accessible for various performance analysis and debugging tools.

Benefits of Using HAR Files

  • Debugging and Performance Analysis: HAR files help developers debug network issues and analyze the performance of web applications.
  • Load Testing: By replaying HAR files, developers can simulate real user interactions and test how their application performs under load.
  • Detailed Metrics: HAR files provide detailed metrics about each request and response, including headers, cookies, request timings, and more.

Features of har-pilot

  • Run HAR Files: Execute HTTP requests captured in HAR files.
  • Load Testing: Perform load testing by running HAR files multiple times.
  • Detailed Metrics: Collect and store metrics such as response time, status codes, and response bodies in a SQLite database.
  • Concurrency: Handle multiple requests concurrently for efficient load testing (see the sketch after this list).
  • Progress Tracking: Track progress with a detailed progress bar.
  • S3 Upload: Optionally upload the results to an S3 bucket for storage and further analysis.
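
As an illustration of the concurrency feature above, here is a minimal sketch in Rust, assuming the tokio and reqwest crates; it is not har-pilot's actual implementation, only an example of replaying HAR entries concurrently and timing each response:

// Illustrative only: assumes Cargo dependencies roughly like
//   tokio = { version = "1", features = ["full"] }
//   reqwest = "0.12"
use std::time::Instant;

#[derive(Clone)]
struct HarEntry {
    method: String,
    url: String,
}

// Send one request from a HAR entry and return its status code and duration in ms.
async fn replay_entry(client: reqwest::Client, entry: HarEntry) -> Result<(u16, u128), reqwest::Error> {
    let started = Instant::now();
    let method = entry.method.parse().unwrap_or(reqwest::Method::GET);
    let response = client.request(method, entry.url.as_str()).send().await?;
    Ok((response.status().as_u16(), started.elapsed().as_millis()))
}

#[tokio::main]
async fn main() {
    let client = reqwest::Client::new();
    // In har-pilot these entries would come from the parsed HAR file.
    let entries = vec![
        HarEntry { method: "GET".into(), url: "https://example.com/".into() },
        HarEntry { method: "GET".into(), url: "https://example.com/about".into() },
    ];

    // Spawn one task per entry so the requests run concurrently.
    let handles: Vec<_> = entries
        .into_iter()
        .map(|entry| tokio::spawn(replay_entry(client.clone(), entry)))
        .collect();

    for handle in handles {
        match handle.await {
            Ok(Ok((status, duration_ms))) => println!("status={status} duration_ms={duration_ms}"),
            Ok(Err(err)) => eprintln!("request failed: {err}"),
            Err(err) => eprintln!("task failed: {err}"),
        }
    }
}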

Testing Realistic User Workflows with har-pilot

har-pilot is designed to help developers test their applications under realistic conditions. By replaying HAR files, which capture actual user interactions with a web application, har-pilot can simulate real-world usage patterns. This allows developers to identify performance bottlenecks, assess the impact of concurrent requests, and ensure their applications can handle the expected load.

Installation

To get started with har-pilot, clone the repository and install the dependencies:

git clone https://github.com/copyleftdev/har-pilot.git
cd har-pilot
cargo build

Usage

Running HAR Files

To run a HAR file and execute the HTTP requests contained within it, use the following command:

cargo run -- <path-to-har-file> --itercount <number-of-iterations> [--s3-bucket <bucket-name>]

  • <path-to-har-file>: The path to the HAR file you want to run.
  • --itercount <number-of-iterations>: The number of times to run the HAR file for load testing.
  • --s3-bucket <bucket-name>: (Optional) The name of the S3 bucket to upload the results to.

For example:

# Running without S3 upload
cargo run -- example.har --itercount 5

# Running with S3 upload
cargo run -- example.har --itercount 5 --s3-bucket your-s3-bucket-name

Storing Metrics

har-pilot stores detailed metrics for each request in a SQLite database; the database file name includes a unique identifier to avoid conflicts. Each record includes:

  • URL: The URL of the request.
  • Method: The HTTP method used (e.g., GET, POST).
  • Response: The response body.
  • Status: The HTTP status code.
  • Timestamp: The timestamp when the request was made.
  • Duration: The response time in milliseconds.
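
Based on the fields above and on the column names used in the queries below, the metrics table has roughly the following shape; the exact schema is defined by har-pilot and may differ:

-- Approximate sketch only; column names inferred from the queries below.
CREATE TABLE metrics (
    url         TEXT,
    method      TEXT,
    response    TEXT,
    status      INTEGER,
    timestamp   TEXT,
    duration_ms INTEGER
);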

Querying Metrics

You can query the SQLite database to analyze the metrics collected during the load testing. Here are some useful queries:

Average Response Time Per Second

SELECT 
    strftime('%Y-%m-%d %H:%M:%S', timestamp) as second,
    AVG(duration_ms) as avg_response_time_ms
FROM 
    metrics
GROUP BY 
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY 
    second;

Total Requests Per Second

SELECT 
    strftime('%Y-%m-%d %H:%M:%S', timestamp) as second,
    COUNT(*) as total_requests
FROM 
    metrics
GROUP BY 
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY 
    second;

Combined Query for Both Average Response Time and Total Requests

SELECT 
    strftime('%Y-%m-%d %H:%M:%S', timestamp) as second,
    COUNT(*) as total_requests,
    AVG(duration_ms) as avg_response_time_ms
FROM 
    metrics
GROUP BY 
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY 
    second;
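
Any of these queries can be run with the sqlite3 command-line tool; the database file name includes a generated identifier, so substitute the file produced by your run:

# Replace <generated-database-file>.db with the file created by your run.
sqlite3 <generated-database-file>.db "SELECT COUNT(*) AS total_requests FROM metrics;"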
