
# Added execution and updated installation section to Readme #175

Merged 7 commits on Dec 15, 2022.

Readme.md: 238 changes (157 additions, 81 deletions)

[![CI tests](https://github.com/redis/redis-benchmarks-specification/actions/workflows/tox.yml/badge.svg)](https://github.com/redis/redis-benchmarks-specification/actions/workflows/tox.yml)
[![PyPI version](https://badge.fury.io/py/redis-benchmarks-specification.svg)](https://pypi.org/project/redis-benchmarks-specification)


<!-- toc -->

- [Benchmark specifications goal](#benchmark-specifications-goal)
  - [Scope](#scope)
- [Installation and Execution](#installation-and-execution)
  - [Installing package requirements](#installing-package-requirements)
  - [Installing Redis benchmarks specification](#installing-redis-benchmarks-specification)
  - [Testing out the redis-benchmarks-spec-client-runner](#testing-out-the-redis-benchmarks-spec-client-runner)
  - [Testing out redis-benchmarks-spec-sc-coordinator](#testing-out-redis-benchmarks-spec-sc-coordinator)
- [Architecture diagram](#architecture-diagram)
- [Directory layout](#directory-layout)
  - [Specifications](#specifications)
  - [Spec tool implementations](#spec-tool-implementations)
- [Contributing guidelines](#contributing-guidelines)
  - [Joining the performance initiative and adding a continuous benchmark platform](#joining-the-performance-initiative-and-adding-a-continuous-benchmark-platform)
    - [Joining the performance initiative](#joining-the-performance-initiative)
    - [Adding a continuous benchmark platform](#adding-a-continuous-benchmark-platform)
      - [Adding redis-benchmarks-spec-sc-coordinator to supervisord](#adding-redis-benchmarks-spec-sc-coordinator-to-supervisord)
- [Development](#development)
  - [Running formatters](#running-formaters)
  - [Running linters](#running-linters)
  - [Running tests](#running-tests)
- [License](#license)

<!-- tocstop -->


## Benchmark specifications goal
Current supported benchmark tools:
- [SOON] [redis-benchmark-go](https://github.com/filipecosta90/redis-benchmark-go)


## Installation and Execution

The Redis benchmarks specification and implementations are developed for Unix and actively tested on it.
To have access to the latest SPEC and tooling implementation, you only need to install one Python package.
Before installing the package, please install its dependencies.

### Installing package requirements

```bash
# install pip installer for python3
sudo apt install python3-pip -y
sudo pip3 install --upgrade pip

# install docker
sudo apt install docker.io -y

# install supervisord
sudo apt install supervisor -y
```
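
If you want to sanity-check the environment first, the following version queries (standard pip, docker, and supervisord commands) should all succeed:

```bash
# optional sanity check: confirm the required tooling is available
python3 -m pip --version
docker --version
supervisord --version
```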

### Installing Redis benchmarks specification

Installation is done using pip, the package installer for Python, in the following manner:

```bash
python3 -m pip install redis-benchmarks-specification --ignore-installed PyYAML
```

To run a particular version, use its number, e.g. 0.1.57:
```bash
pip3 install redis-benchmarks-specification==0.1.57
```
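
To confirm which version ended up installed, you can query pip (an optional check using the standard `pip show` subcommand):

```bash
# optional: show the installed package metadata, including the version
pip3 show redis-benchmarks-specification
```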

### Testing out the redis-benchmarks-spec-client-runner

There is an option to run the "redis-benchmarks-spec" tests using the standalone runner approach. For this option, redis-benchmarks-specification should run together with a redis-server at the same time.

```bash
# Run redis server
./src/redis-server --port 6379 --dir logs --logfile server.log --save "" [--daemonize yes]

# Run benchmark
redis-benchmarks-spec-client-runner --db_server_host localhost --db_server_port 6379 --client_aggregated_results_folder ./test
```

The "--daemonize yes" option passed to the server run command runs redis-server in the background.<br />
The "--test X.yml" option passed to the benchmark command runs a particular test, where X is the test name.

The full list of options can be obtained with the "-h" option:
```
$ redis-benchmarks-spec-client-runner -h
usage: redis-benchmarks-spec-client-runner [-h]
[--platform-name PLATFORM_NAME]
[--triggering_env TRIGGERING_ENV]
[--setup_type SETUP_TYPE]
[--github_repo GITHUB_REPO]
[--github_org GITHUB_ORG]
[--github_version GITHUB_VERSION]
[--logname LOGNAME]
[--test-suites-folder TEST_SUITES_FOLDER]
[--test TEST]
[--db_server_host DB_SERVER_HOST]
[--db_server_port DB_SERVER_PORT]
[--cpuset_start_pos CPUSET_START_POS]
[--datasink_redistimeseries_host DATASINK_REDISTIMESERIES_HOST]
[--datasink_redistimeseries_port DATASINK_REDISTIMESERIES_PORT]
[--datasink_redistimeseries_pass DATASINK_REDISTIMESERIES_PASS]
[--datasink_redistimeseries_user DATASINK_REDISTIMESERIES_USER]
[--datasink_push_results_redistimeseries] [--profilers PROFILERS]
[--enable-profilers] [--flushall_on_every_test_start]
[--flushall_on_every_test_end]
[--preserve_temporary_client_dirs]
[--client_aggregated_results_folder CLIENT_AGGREGATED_RESULTS_FOLDER]
[--tls]
[--tls-skip-verify]
[--cert CERT]
[--key KEY]
[--cacert CACERT]
redis-benchmarks-spec-client-runner (solely client) 0.1.61
...
```
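
As a further example, the datasink flags listed in the help above can be combined to push results to a RedisTimeSeries endpoint (the host and port values below are placeholders):

```bash
# push results to a RedisTimeSeries datasink; host/port below are placeholders
redis-benchmarks-spec-client-runner --db_server_host localhost --db_server_port 6379 \
    --client_aggregated_results_folder ./test \
    --datasink_push_results_redistimeseries \
    --datasink_redistimeseries_host datasink.example.com \
    --datasink_redistimeseries_port 6379
```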

### Testing out redis-benchmarks-spec-sc-coordinator

An alternative way of running a redis-server for listening is via the redis-benchmarks coordinator.

You should now be able to print the following installed benchmark runner help:

```
$ poetry run redis-benchmarks-spec-sc-coordinator --platform-name example-platfo
```

You're now actively listening for benchmarks requests to Redis!



## Architecture diagram

![Architecture diagram](./arch-diagram.png)

In brief, upstream changes to github.com/redis/redis trigger an HTTP API call containing the
relevant git information.

The HTTP request is then converted into an event (tracked within redis) that triggers multiple build variant requests based upon the distinct platforms described in [`platforms`](redis_benchmarks_specification/setups/platforms/).

As soon as a new build variant request is received, the build agent ([`redis-benchmarks-spec-builder`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__builder__/))
prepares the artifact(s) and adds an artifact benchmark event, so that the benchmark coordinator ([`redis-benchmarks-spec-sc-coordinator`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__self_contained_coordinator__/)) can deploy/manage the required infrastructure and DB topologies, run the benchmark, and export the performance results.
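
Purely as an illustration (the stream key and field names below are hypothetical, not the spec's actual schema), such an event can be pictured as a Redis stream entry:

```bash
# hypothetical sketch of a build-trigger event as a Redis stream entry
redis-cli XADD builds:events '*' git_hash 1f2e3d4 git_branch unstable
```
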
## Directory layout

### Specifications

The following is a high-level status report for currently available specs.

* `redis_benchmarks_specification`
  * [`test-suites`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/test-suites/): contains the benchmark suite definitions, specifying the target Redis topology, the tested commands, the benchmark utility to use (the client), and, if required, the dataset preloading steps.

* `redis_benchmarks_specification/setups`
  * [`platforms`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/setups/platforms/): contains the standard platforms considered to provide steady, stable results and to represent common deployment targets.
  * [`topologies`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/setups/topologies/): contains the standard deployment topology definitions with the associated minimum specs required to enable each topology.
  * [`builders`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/setups/builders/): contains the build environment variations that enable building Redis with different compilers, compiler flags, libraries, etc.
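
Condensing the list above into a tree (only the directories named above are shown):

```
redis_benchmarks_specification/
├── test-suites/     # benchmark suite definitions
└── setups/
    ├── platforms/   # standard platform definitions
    ├── topologies/  # standard topology definitions
    └── builders/    # build environment variations
```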

### Spec tool implementations

The following is a high-level status report for currently available spec implementations.

* **STATUS: Experimental** [`redis-benchmarks-spec-api`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__api__/): contains the API that receives the HTTP POST request triggered by github.com/redis/redis upstream changes, fetches the relevant git/source info, and converts it into an event (tracked within redis).

* **STATUS: Experimental** [`redis-benchmarks-spec-builder`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__builder__/): contains the benchmark build agent utility that receives an event indicating a new build variant, generates the required redis binaries to test, and triggers the benchmark run on the listening agents.

* **STATUS: Experimental** [`redis-benchmarks-spec-sc-coordinator`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__self_contained_coordinator__/): contains the coordinator utility that listens for benchmark suite run requests, sets up the required benchmark topologies, and triggers the actual benchmarks.

* **STATUS: Experimental** [`redis-benchmarks-spec-client-runner`](https://github.com/filipecosta90/redis-benchmarks-specification/tree/main/redis_benchmarks_specification/__runner__/): contains the client utility that triggers the actual benchmarks against a provided endpoint. This tool is setup-agnostic and expects the DB to be properly spun up beforehand.

## Contributing guidelines

### Adding new test suites

TBD

### Adding new topologies

TBD

### Joining the performance initiative and adding a continuous benchmark platform

#### Joining the performance initiative
In order to join the performance initiative, the only requirements are that you provide a steady, stable infrastructure
platform to run the benchmark suites, and that you reach out to one of the Redis Performance Initiative members via
`performance <at> redis <dot> com` so that we can provide you with the required secrets to actively listen for benchmark events.

If you check the "Architecture diagram" above, this means you only need to run the last moving part of the architecture: you will have
one or more benchmark coordinator machines actively running benchmarks and pushing the results back to our datasink.

#### Adding a continuous benchmark platform

In order to be able to run the benchmarks on the platform, you need the pip installer for python3, and docker.
Apart from that, we recommend you manage the state of the `redis-benchmarks-spec-sc-coordinator` process(es) via a process monitoring tool like
supervisord, launchd, daemontools, or similar.

For this example we rely upon `supervisorctl` for process management.


##### Adding redis-benchmarks-spec-sc-coordinator to supervisord

Let's add a supervisord entry as follows:
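
The lines below are a minimal sketch of such an entry, assuming the coordinator is on the PATH and that `example-platform` is your registered platform name; adjust the command, platform name, and log paths to your setup:

```bash
# minimal sketch: register the coordinator with supervisord (Debian/Ubuntu conf.d layout assumed)
sudo tee /etc/supervisor/conf.d/redis-benchmarks-spec-sc-coordinator.conf > /dev/null <<'EOF'
[program:redis-benchmarks-spec-sc-coordinator]
command=redis-benchmarks-spec-sc-coordinator --platform-name example-platform
autostart=true
autorestart=true
stdout_logfile=/var/log/redis-benchmarks-spec-sc-coordinator.out.log
stderr_logfile=/var/log/redis-benchmarks-spec-sc-coordinator.err.log
EOF

# make supervisord pick up the new entry
sudo supervisorctl reread
sudo supervisorctl update
```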