Fix #743: [DX] VisualStudio linter complains about README formatting (#745)

Fixes #743.
- Fixed warnings in https://marketplace.visualstudio.com/items?itemName=DavidAnson.vscode-markdownlint
- In particular, fixed the relative links in the README
Pfed-prog authored Mar 7, 2024
1 parent 47b9249 commit 3919ac2
Showing 19 changed files with 266 additions and 164 deletions.
9 changes: 7 additions & 2 deletions README.md
@@ -10,12 +10,12 @@ SPDX-License-Identifier: Apache-2.0
- **[Run predictoor bot](READMEs/predictoor.md)** - make predictions, make $
- **[Run trader bot](READMEs/trader.md)** - consume predictions, trade, make $


(If you're a predictoor or trader, you can safely ignore the rest of this README.)

## Settings: PPSS

A "ppss" yaml file, like [`ppss.yaml`](ppss.yaml), holds parameters for all bots and simulation flows.

- We follow the idiom "pp" = problem setup (what to solve), "ss" = solution strategy (how to solve).
- `PRIVATE_KEY` is an exception; it's set as an envvar.
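
For orientation, here's an illustrative fragment of the pp/ss idiom. It's a sketch only; the section and key names shown are not exhaustive, and the repo's [`ppss.yaml`](ppss.yaml) is the source of truth:

```yaml
# Illustrative fragment only; see ppss.yaml in the repo for the real sections & keys.
predictoor_ss: # an "ss" section: solution strategy for the predictoor bot (how to solve)
  approach: 1
trader_ss: # an "ss" section: solution strategy for the trader bot
  approach: 1
web3_pp: # a "pp" section: problem setup (what to solve), eg per-network parameters
  sapphire-testnet:
    address_file: "~/.ocean/ocean-contracts/artifacts/address.json"
```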

@@ -33,11 +33,13 @@ cp logging.yaml my_logging.yaml
(First, [install pdr-backend](READMEs/predictoor.md#install-pdr-backend-repo).)

To see CLI options, in console:

```console
pdr
```

This will output something like:

```text
Usage: pdr sim|predictoor|trader|..
@@ -71,11 +73,13 @@ Main tools:
This repo implements all bots in the Predictoor ecosystem. Here are the sub-directories in the repo.

Main bots & user tools:

- `predictoor` - submit individual predictions
- `trader` - buy aggregated predictions, then trade
- `sim` - experiments / simulation flow

OPF-run bots & higher-level tools:

- `trueval` - report true values to contract
- `dfbuyer` - buy feeds on behalf of Predictoor DF
- `publisher` - publish pdr data feeds
@@ -85,14 +89,15 @@ OPF-run bots & higher-level tools:
- `accuracy` - calculates % correct, for display in predictoor.ai webapp

Mid-level building blocks:

- `cli` - implementation of CLI
- `ppss` - implements settings
- `aimodel` - AI/ML modeling engine
- `lake` - data lake and data pipeline
- `subgraph` - blockchain queries, complements lake

Lower-level utilities:

- `contract` - classes to wrap blockchain contracts
- `models` - simple widely-used data structures
- `util` - function-based tools

38 changes: 18 additions & 20 deletions READMEs/agent-deployer.md
@@ -5,50 +5,52 @@
## Usage

### Agent Configurations

Firstly, you need to set up your agents' configuration. This is done by creating a config entry under `deployment_configs` in the `ppss.yaml` file.

Here is an example structure for your reference:

```yaml
deployment_configs:
  testnet_predictoor_deployment:
    cpu: "1"
    memory: "512Mi"
    source: "binance"
    type: "predictoor"
    approach: 3
    network: "sapphire-testnet"
    s_until_epoch_end: 20
    pdr_backend_image_source: "oceanprotocol/pdr-backend:latest"
    agents:
      - pair: "BTC/USDT"
        stake_amt: 15
        timeframe: 5m
        approach: 1
      - pair: "ETH/USDT"
        stake_amt: 20
        timeframe: 1h
        s_until_epoch_end: 60
```
_Tip: Specific agent settings (like source, timeframe) will override general settings if provided._

### Private Keys

Create a `.keys.json` file and format it as follows:

```json
{
"config_name": ["pk1", "pk2"...]
}
```

_Note: If you have fewer private keys than the number of agents, the tool will create new wallets and update the .keys.json file._

### Generate Templates

The `generate` command is used to create deployment template files based on a configuration file.

```console
pdr deployer generate <config_path> <config_name> <deployment_method> <output_dir>
```
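
For example, a hypothetical invocation (the `k8s` method matches the K8S example later in this README; the output directory is a placeholder, so adjust paths and names to your setup):

```console
pdr deployer generate ppss.yaml testnet_predictoor_deployment k8s ./predictoor-deploy
```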

@@ -63,7 +65,7 @@ Take a note of the `config_name`, you will need it later!

The `deploy` command is used to deploy agents based on a specified config name.

```console
pdr deployer deploy <config_name> [-p PROVIDER] [-r REGION] [--project_id PROJECT_ID] [--resource_group RESOURCE_GROUP] [--subscription_id SUBSCRIPTION_ID]
```
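
For example, a hypothetical deploy of the config above to Google Cloud (the region and project id values here are placeholders):

```console
pdr deployer deploy testnet_predictoor_deployment -p gcp -r us-central1 --project_id my-gcp-project
```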

@@ -78,7 +80,7 @@ pdr deployer deploy <config_name> [-p PROVIDER] [-r REGION] [--project_id PROJEC

The `destroy` command is used to destroy agents deployed based on a specified configuration.

```console
pdr deployer destroy <config_name> [-p PROVIDER]
```

@@ -89,19 +91,18 @@ pdr deployer destroy <config_name> [-p PROVIDER]

The `logs` command is used to retrieve logs from deployed agents.

```console
pdr deployer logs <config_name> [-p PROVIDER]
```

- `<config_name>`: Name of the config.
- -p, --provider: Cloud provider (optional; choices: "aws", "azure", "gcp").
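
For example, to retrieve logs from the agents deployed above on GCP (a hypothetical invocation):

```console
pdr deployer logs testnet_predictoor_deployment -p gcp
```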

### Remote Container Registry

The `registry` command is used to manage remote registries for agent deployment.

```console
pdr deployer registry <action> <registry_name> [-p PROVIDER] [-r REGION] [--project_id PROJECT_ID] [--resource_group RESOURCE_GROUP]
```

@@ -112,32 +113,29 @@ pdr deployer registry <action> <registry_name> [-p PROVIDER] [-r REGION] [--proj
- --project_id: Google Cloud project id (optional).
- --resource_group: Azure resource group (optional).
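
For example, a hypothetical call to create a registry on GCP (this assumes `create` is one of the supported `<action>` values; check the CLI help for the exact choices, and note the registry name, region, and project id here are placeholders):

```console
pdr deployer registry create my-pdr-registry -p gcp -r us-central1 --project_id my-gcp-project
```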

### Build

The `build` command is used to build a container image.

```console
pdr deployer build <image_name> <image_tag>
```

- `<image_name>`: Image name (default: "pdr_backend").
- `<image_tag>`: Image tag (default: "deployer").
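
For example, building an image with the default name and tag listed above, passed explicitly:

```console
pdr deployer build pdr_backend deployer
```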

#### Push

The `push` command is used to push container images to a remote registry.

```console
pdr deployer push <registry_name> [<image_name>] [<image_tag>]
```

- `<registry_name>`: Registry name.
- `<image_name>`: Image name (default: "pdr_backend").
- `<image_tag>`: Image tag (default: "deployer").
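
For example, pushing the image built above to a previously created registry (the registry name here is a placeholder):

```console
pdr deployer push my-pdr-registry pdr_backend deployer
```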

## Examples

### K8S with GCP
@@ -177,4 +175,4 @@ pdr-predictoor2-3-eth-usdt-5m-binance-21dfcf3bc4-b6nnk 1/1 Running 0
====================================================================================================================================================================================
cur_epoch=5688716, cur_block_number=4658908, cur_timestamp=1706615099, next_slot=1706615100, target_slot=1706615400. 295 s left in epoch (predict if <= 30 s left). s_per_epoch=300
...
```
32 changes: 15 additions & 17 deletions READMEs/barge-calls.md
@@ -1,35 +1,33 @@
# Barge flow of calls

From getting barge going, here's how it calls specific pdr-backend components and passes arguments.

- user calls `/barge/start_ocean.sh` to get barge going
- then, `start_ocean.sh` fills `COMPOSE_FILES` incrementally. Eg `COMPOSE_FILES+=" -f ${COMPOSE_DIR}/pdr-publisher.yml"`
  - `barge/compose-files/pdr-publisher.yml` sets:
    - `pdr-publisher: image: oceanprotocol/pdr-backend:${PDR_BACKEND_VERSION:-latest}`
    - `pdr-publisher: command: publisher`
    - `pdr-publisher: networks: backend: ipv4_address: 172.15.0.43`
    - `pdr-publisher: environment:`
      - `RPC_URL: ${NETWORK_RPC_URL}` (= `http://localhost:8545` via `start_ocean.sh`)
      - `ADDRESS_FILE: /root/.ocean/ocean-contracts/artifacts/address.json`
      - (many `PRIVATE_KEY_*`)
- then, `start_ocean.sh` pulls the `$COMPOSE_FILES` as needed:
  - `[ ${FORCEPULL} = "true" ] && eval docker-compose "$DOCKER_COMPOSE_EXTRA_OPTS" --project-name=$PROJECT_NAME "$COMPOSE_FILES" pull`

- then, `start_ocean.sh` runs docker-compose including all `$COMPOSE_FILES`:
  - `eval docker-compose "$DOCKER_COMPOSE_EXTRA_OPTS" --project-name=$PROJECT_NAME "$COMPOSE_FILES" up --remove-orphans`
  - it executes each of the `"command"` entries in compose files.
    - (Eg for pdr-publisher.yml, `"command" = "publisher ppss.yaml development"`)
    - Which then goes to `pdr-backend/entrypoint.sh` via `"python /app/pdr_backend/pdr $@"`
      - (where `@` is unpacked as eg `publisher ppss.yaml development`) [Ref](https://superuser.com/questions/1586997/what-does-symbol-mean-in-the-context-of#:).
    - Then it goes through the usual CLI at `pdr-backend/pdr_backend/util/cli_module.py`
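
For illustration, a compose-file entry consistent with the bullets above could look roughly like this (a sketch, not a verbatim copy of barge's `pdr-publisher.yml`):

```yaml
# Sketch of barge/compose-files/pdr-publisher.yml (illustrative, not verbatim)
pdr-publisher:
  image: oceanprotocol/pdr-backend:${PDR_BACKEND_VERSION:-latest}
  command: publisher ppss.yaml development
  networks:
    backend:
      ipv4_address: 172.15.0.43
  environment:
    RPC_URL: ${NETWORK_RPC_URL}
    ADDRESS_FILE: /root/.ocean/ocean-contracts/artifacts/address.json
    # ...plus the various PRIVATE_KEY_* entries
```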


## How to make changes to calls

If you made a change to the pdr-backend CLI interface, then barge must call using the updated CLI command.

How:

- change the relevant compose file's `"command"`. Eg change `barge/compose-files/pdr-publisher.yml`'s "command" value to `publisher ppss.yaml development`
- also, change envvar setup as needed. Eg in compose file, remove `RPC_URL` and `ADDRESS_FILE` entry.
- ultimately, ask: "does Docker have everything it needs to successfully run the component?"
11 changes: 9 additions & 2 deletions READMEs/barge.md
@@ -12,16 +12,18 @@ Barge is a Docker container to run a local Ganache network having Predictoor con
## Contents

Main:

- [Install Barge](#install-barge)

Reference: how to run barge with...

- [No agents](#barge-basic) - just ganache network & Predictoor contracts
- [One agent](#barge-one-agent) - eg trueval bot
- [All agents](#barge-all-agents) - predictoor, trader, trueval, dfbuyer bots

Finally:

- [Change Barge Itself](#change-barge-itself)

## Install Barge

@@ -34,13 +36,14 @@
git clone https://github.com/oceanprotocol/barge
cd barge

# Clean up previous Ocean-related dirs & containers (optional but recommended)
rm -rf ~/.ocean
./cleanup.sh
docker system prune -a --volumes
```

**Then, get Docker running.** To run barge, you need the Docker engine running. Here's how:

- If you're on Linux: you're good, there's nothing extra to do

Congrats! Barge is installed and ready to be run.
@@ -50,10 +53,12 @@ The sections below describe different ways to run barge. They're for reference o
## Barge Basic

Barge with basic Predictoor components is:

- local chain (ganache)
- predictoor-related smart contracts deployed to chain

To run this, go to the barge console and:

```console
./start_ocean.sh --no-provider --no-dashboard --predictoor --with-thegraph
```
@@ -65,6 +70,7 @@ When barge runs, it will auto-publish DT3 tokens. Currently this is {`BTC/USDT`,
Barge can run with any subset of the agents.

For example, to run barge with just trueval agent:

```console
./start_ocean.sh --no-provider --no-dashboard --predictoor --with-thegraph --with-pdr-trueval
```
@@ -78,6 +84,7 @@ To run with all agents:
```

This will run all of the following at once:

- local chain (ganache)
- predictoor-related smart contracts deployed to chain
- trueval agent
25 changes: 15 additions & 10 deletions READMEs/clean-code.md
@@ -7,15 +7,16 @@ SPDX-License-Identifier: Apache-2.0

Guidelines for core devs to write clean code.

Main policy on PRs:

> **To merge a PR, it must be clean.** ✨ (Rather than: "merge the PR, then clean up")

Clean code enables us to proceed with maximum velocity.

## Summary

Clean code means:

- No DRY violations
- Great labels
- Dynamic type-checking
@@ -31,6 +32,7 @@ This ensures minimal tech debt, so we can proceed at maximum velocity. Senior en
Everyone can "up their game" by being diligent about this until it becomes second nature; and by reading books on it.

The following sections elaborate:

- [What does clean code look like?](#what-does-clean-code-look-like)
- [Benefits of clean code](#benefits-of-clean-code)
- [Reading list](#reading-list)
@@ -39,16 +41,19 @@ The following sections elaborate:

**No DRY violations.** This alone avoids many complexity issues.

**Great labels.** This makes a huge difference to complexity and DX too.

- Rule of thumb: be as specific as possible, while keeping character count sane. Eg "ohlcv_data_factory" vs "parquet_data_factory".
- In functions of 2-5 lines you can get away with super-short labels like "c", as long as it's local and the context makes it obvious.
- Generally, ask: "how easily can a developer understand the code?" Think of yourself as writing for readers, where the readers are developers (including yourself).

**Dynamic type-checking**, via @enforce_typing + type hints on variables.

- Given that bugs often show up as type violations, think of dynamic type-checking as a robot that automatically hunts down those bugs on your behalf. Doing dynamic type-checking will save you a ton of time.
- Small exception: in 2% of cases it's overkill, eg if your type is complex or if you're fighting with mock or mypy; then skip it there.
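
A minimal sketch of the idea, assuming the `enforce_typing` package (its `@enforce_types` decorator raises `TypeError` at call time when an argument or return value doesn't match its hint); the function below is a toy example, not from the codebase:

```python
from enforce_typing import enforce_types

@enforce_types
def stake_per_epoch(total_stake: float, n_epochs: int) -> float:
    # type hints + the decorator = runtime type-checking for free
    return total_stake / n_epochs

stake_per_epoch(100.0, 10)      # ok: returns 10.0
# stake_per_epoch(100.0, "10")  # raises TypeError: argument doesn't match its hint
```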

**Tik-tok, refactor-add.** That is: for many features, it's best to spend 90% of the effort to refactor first (and merge that PR). Then the actual change for the feature itself is near-trivial, eg 10% of the effort.

- It's "tik tok", where "tik" = refactor (unit tests typically don't change), and "tok" = make change.
- Inspiration: this is Intel's approach to CPU development, where "tik" = change manufacturing process, "tok" = change CPU design.

@@ -57,6 +62,7 @@
**Does it pass the smell test?** If you feel like your code smells, you have work to do.

**Have tests always. Use TDD.**

- Coverage should be >90%.
- You should be using test-driven development (TDD), ie write the tests at the same time as the code itself in very rapid cycles of 30 s - 5 min. The outcome: module & test go together like a hand & glove.
- Anti-pattern outcome: tests are very awkward, having to jump through hoops to do tests.
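
To make "hand & glove" concrete, here is a toy sketch (hypothetical module and test, in pytest style):

```python
# pdr_backend/util/mathutil.py (hypothetical module)
def weighted_mean(values: list, weights: list) -> float:
    """Weighted average of values, weighted by weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# pdr_backend/util/test/test_mathutil.py (written alongside the module, not after)
def test_weighted_mean():
    assert weighted_mean([1.0, 3.0], [1.0, 1.0]) == 2.0
    assert weighted_mean([1.0, 3.0], [3.0, 1.0]) == 1.5
```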
@@ -84,14 +90,13 @@ To "up your game", here are great books on software engineering, in order to rea
- Code Complete 2, by Steve McConnell. Classic book on code construction, filled to the brim with practical tips. [Link](https://www.goodreads.com/book/show/4845.Code_Complete)
- Clean Code: A Handbook of Agile Software Craftsmanship, by Robert C. Martin. [Link](https://www.goodreads.com/book/show/3735293-clean-code)
- A Philosophy of Software Design, by John Ousterhout. Best book on managing complexity. Emphasizes DRY. If you've been in the coding trenches for a while, this feels like a breath of fresh air and helps you to up your game further. [Link](https://www.goodreads.com/book/show/39996759-a-philosophy-of-software-design).
- Refactoring: Improving the Design of Existing Code, by Martin Fowler. This book is a big "unlock" on how to apply refactoring everywhere like a ninja. [Link](https://www.goodreads.com/book/show/44936.Refactoring).
- Head First Design Patterns, by Eric Freeman et al. Every good SW engineer should have design patterns in their toolbox. This is a good first book on design patterns. [Link](https://www.goodreads.com/book/show/58128.Head_First_Design_Patterns)
- Design Patterns: Elements of Reusable Object-Oriented Software, by GOF. This is "the bible" on design patterns. It's only so-so on approachability, but nonetheless the content makes it worth it. But start with "Head First Design Patterns". [Link](https://www.goodreads.com/book/show/85009.Design_Patterns)
- The Pragmatic Programmer: From Journeyman to Master, by Andy Hunt and Dave Thomas. Some people really love this; I found it so-so. But definitely worth a read to round out your SW engineering. [Link](https://www.goodreads.com/book/show/4099.The_Pragmatic_Programmer)

A final one. In general, _when you're coding, you're writing_. Therefore, books on crisp writing are also books about coding (!). The very top of this list is [Strunk & White Elements of Style](https://www.goodreads.com/book/show/33514.The_Elements_of_Style). It's sharper than a razor blade.


## Recap

Each PR should always be both "make it work" _and_ "make it good (clean)". ✨