This is a scalability test framework for the replicated data storage system (aka the "blue box") of the Human Cell Atlas.
- The scalability test framework is based on AWS Step Functions. The workflow definition resembles the DSS smoke test.
- The execution is triggered by sending SNS messages to the `dss-scalability-test-run-{STAGE}` topic.
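  For example, a trigger message could be published with `boto3` along these lines; the message fields (`run_id`, `execution_id`) are illustrative assumptions, not the framework's actual payload schema:

  ```python
  import json

  import boto3

  stage = "dev"  # illustrative deployment stage

  # Build the topic ARN for the current account/region; SNS publish needs the full ARN.
  region = boto3.session.Session().region_name
  account_id = boto3.client("sts").get_caller_identity()["Account"]
  topic_arn = f"arn:aws:sns:{region}:{account_id}:dss-scalability-test-run-{stage}"

  # Hypothetical message body; the real payload format is defined by the framework.
  message = {"run_id": "example-run", "execution_id": "example-execution"}
  boto3.client("sns").publish(TopicArn=topic_arn, Message=json.dumps(message))
  ```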
- The scalability test writes the results of individual test executions and aggregated run metrics into the following DynamoDB tables: `scalability_test_result` and `scalability_test`.
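  After a run, the aggregated metrics can be read back with `boto3`; the attribute names in the returned items depend on the tables' actual schema, which is not shown here:

  ```python
  import boto3

  dynamodb = boto3.resource("dynamodb")

  # Aggregated per-run metrics live in `scalability_test`; per-execution results
  # live in `scalability_test_result`. A scan is used here only for illustration.
  for item in dynamodb.Table("scalability_test").scan(Limit=10).get("Items", []):
      print(item)
  ```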
- Each SFN execution starts by entering a WAIT step. The wait is configured to end on 5-minute boundaries to accommodate the AWS limit on starting SFN executions and to enable the generation of bursts of load.
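  A sketch of how such a wait target might be computed, assuming the WAIT step is driven by a timestamp aligned to the next 5-minute boundary (e.g. via the Wait state's `TimestampPath`); the actual implementation lives in the step function definition:

  ```python
  from datetime import datetime, timedelta, timezone

  def next_five_minute_boundary(now=None):
      """Return the next 5-minute boundary (UTC) in the YYYY-MM-DDThh:mm:ssZ form a Wait state expects."""
      now = now or datetime.now(timezone.utc)
      seconds_past_boundary = (now.minute % 5) * 60 + now.second
      boundary = now.replace(microsecond=0) + timedelta(seconds=300 - seconds_past_boundary)
      return boundary.strftime("%Y-%m-%dT%H:%M:%SZ")
  ```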
- Once all parallel branches of execution are done, a summary of the run is written to DynamoDB.
- DynamoDB is configured to stream new records into a Lambda function, which aggregates the results and writes incremental metrics back into DynamoDB, as sketched below.
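  A minimal sketch of such a stream-triggered aggregator; the attribute names (`success`, `run_id`) and the update expression are illustrative assumptions, not the framework's actual schema:

  ```python
  import boto3

  dynamodb = boto3.resource("dynamodb")
  run_table = dynamodb.Table("scalability_test")

  def handler(event, context):
      """Aggregate newly inserted per-execution results from the DynamoDB stream."""
      succeeded = failed = 0
      for record in event.get("Records", []):
          if record.get("eventName") != "INSERT":
              continue
          # Stream images use DynamoDB's attribute-value format, e.g. {"BOOL": true}.
          new_image = record["dynamodb"].get("NewImage", {})
          if new_image.get("success", {}).get("BOOL"):
              succeeded += 1
          else:
              failed += 1
      # Write incremental counters back into the aggregated-run table.
      run_table.update_item(
          Key={"run_id": "example-run"},  # hypothetical partition key
          UpdateExpression="ADD succeeded :s, failed :f",
          ExpressionAttributeValues={":s": succeeded, ":f": failed},
      )
  ```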
- A CloudWatch dashboard is configured to display relevant execution metrics and is deployed automatically. The dashboard is named `Scalability{stage}`.
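  The deployed dashboard can be inspected with `boto3`, for example:

  ```python
  import boto3

  stage = "dev"  # illustrative deployment stage
  dashboard = boto3.client("cloudwatch").get_dashboard(DashboardName=f"Scalability{stage}")
  print(dashboard["DashboardBody"])  # JSON document describing the dashboard's widgets
  ```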
- Run with the default configuration: `make scaletest` in the top-level `data-store` directory.
- Run with a custom configuration: `./tests/scalability/scale_test_runner.py -r <rps> -d <duration_sec>` in the top-level `data-store` directory, where `<rps>` is the number of requests generated per second and `<duration_sec>` is the duration of the test in seconds.
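  For example, `./tests/scalability/scale_test_runner.py -r 10 -d 300` generates roughly 10 requests per second for 300 seconds (the values are illustrative).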
New tests can easily be added to the existing step function definition in `app.py`.
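As a rough, hypothetical illustration, a new step expressed in Amazon States Language terms might look like the following; the state names and Lambda ARN are placeholders, and the actual structure and helpers used in `app.py` may differ:

```python
# Hypothetical new test state; names and the Lambda ARN are placeholders.
NEW_TEST_STATE = {
    "ExampleNewTest": {
        "Type": "Task",
        "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:example-new-test",
        "ResultPath": "$.example_new_test_result",
        "Next": "ExampleNextState",
    }
}
```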