Commit c1b79e9

Fix typos and grammar in md files
1 parent 5730b46 commit c1b79e9

12 files changed: 64 additions & 61 deletions

docs/app/README.md

Lines changed: 4 additions & 4 deletions
@@ -60,7 +60,7 @@ You can switch which way many of these components are run by setting the `PY_RUN
 * `export PY_RUN_APPROACH=local` will run these components natively
 * `export PY_RUN_APPROACH=docker` will run these within Docker

-Note that even with the native mode, many components like the DB and API will only ever run in Docker, and you should always make sure that any implementations work within docker.
+Note that even with the native mode, many components like the DB and API will only ever run in Docker, and you should always make sure that any implementations work within Docker.

 Running in the native/local approach may require additional packages to be installed on your machine to get working.
@@ -71,8 +71,8 @@ Running in the native/local approach may require additional packages to be insta
 * Run `poetry install --all-extras --with dev` to keep your Poetry packages up to date
 * Load environment variables from the local.env file, see below for one option.

-One option for loading all of your local.env variables is to install direnv: https://direnv.net/
-You can configure direnv to then load the local.env file by creating an `.envrc` file in the /app directory that looks like:
+One option for loading all of your local.env variables is to install `direnv`: https://direnv.net/
+You can configure `direnv` to then load the local.env file by creating an `.envrc` file in the /app directory that looks like:

 ```sh
 #!/bin/bash
@@ -98,7 +98,7 @@ direnv: export +API_AUTH_TOKEN +AWS_ACCESS_KEY_ID +AWS_DEFAULT_REGION +AWS_SECRE

 Most configuration options are managed by environment variables.

-Environment variables for local development are stored in the [local.env](/app/local.env) file. This file is automatically loaded when running. If running within Docker, this file is specified as an `env_file` in the [docker-compose](/docker-compose.yml) file, and loaded [by a script](/app/src/util/local.py) automatically when running unit tests (see running natively above for other cases).
+Environment variables for local development are stored in the [local.env](/backend/local.env) file. This file is automatically loaded when running. If running within Docker, this file is specified as an `env_file` in the [docker-compose](/backend/docker-compose.yml) file, and loaded [by a script](/backend/src/util/local.py) automatically when running unit tests (see running natively above for other cases).

 Any environment variables specified directly in the [docker-compose](/docker-compose.yml) file will take precedent over those specified in the [local.env](/app/local.env) file.

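The `.envrc` body is elided by the hunk above. Purely as an illustrative sketch (not the repository's actual file), a bash `.envrc` that exports everything defined in local.env could look like the following; `direnv`'s built-in `dotenv local.env` stdlib helper is an equivalent one-liner.

```sh
#!/bin/bash
# Illustrative sketch only; the real .envrc is not shown in this diff.
# Export every variable assigned in local.env into the direnv environment.
set -o allexport
source local.env
set +o allexport
```
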
docs/app/database/database-management.md

Lines changed: 3 additions & 3 deletions
@@ -33,8 +33,8 @@ To clean the database, use the following command:
 make db-recreate
 ```

-This will remove _all_ docker project volumes, rebuild the database volume, and
-run all pending migrations. Once completed, only the database container will be
+This will remove _all_ docker project volumes, rebuild the database volume, and
+run all pending migrations. Once completed, only the database container will be
 running. Simply run `make start` to bring up all other project containers.

 ## Running migrations
@@ -100,7 +100,7 @@ make db-migrate-history

 When multiple migrations are created that point to the same `down_revision` a
 branch is created, with the tip of each branch being a "head". The above history
-command will show this, but a list of just the heads can been retrieved with:
+command will show this, but a list of just the heads can be retrieved with:

 ```sh
 make db-migrate-heads
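
Taken together, the workflow these hunks describe boils down to the following (command names are taken directly from the hunks above; the comments paraphrase the surrounding prose):

```sh
make db-recreate          # remove all project volumes, rebuild the DB volume, run migrations
make start                # bring the other project containers back up

make db-migrate-history   # full migration history, including any branches
make db-migrate-heads     # list just the heads (branch tips)
```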

docs/app/database/database-testing.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ This document describes how the database is managed in the test suite.

 ## Test Schema

-The test suite creates a new PostgreSQL database schema separate from the `public` schema that is used by the application outside of testing. This schema persists throughout the testing session is dropped at the end of the test run. The schema is created by the `db` fixture in [conftest.py](../../../app/tests/conftest.py). The fixture also creates and returns an initialized instance of the [db.DBClient](../../../app/src/db/__init__.py) that can be used to connect to the created schema.
+The test suite creates a new PostgreSQL database schema separate from the `public` schema that is used by the application outside of testing. This schema persists throughout the testing session and is dropped at the end of the test run. The schema is created by the `db` fixture in [conftest.py](../../../app/tests/conftest.py). The fixture also creates and returns an initialized instance of the [db.DBClient](../../../app/src/db/__init__.py) that can be used to connect to the created schema.

 Note that [PostgreSQL schemas](https://www.postgresql.org/docs/current/ddl-schemas.html) are entirely different concepts from [Schema objects in OpenAPI specification](https://swagger.io/docs/specification/data-models/).

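The fixture itself is not part of this commit. As a rough sketch only, assuming SQLAlchemy and treating the import path, constructor signature, and connection URL as placeholders, the pattern the corrected paragraph describes looks something like:

```python
import uuid

import pytest
import sqlalchemy

from src.db import DBClient  # import path assumed from the doc's link


@pytest.fixture(scope="session")
def db():
    """Create a session-scoped test schema, yield a client, then drop the schema."""
    schema_name = f"test_schema_{uuid.uuid4().hex}"
    engine = sqlalchemy.create_engine("postgresql://localhost/app")  # placeholder URL
    with engine.begin() as conn:
        conn.execute(sqlalchemy.text(f'CREATE SCHEMA "{schema_name}"'))
    try:
        yield DBClient(schema=schema_name)  # constructor signature is an assumption
    finally:
        with engine.begin() as conn:
            conn.execute(sqlalchemy.text(f'DROP SCHEMA "{schema_name}" CASCADE'))
```
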
docs/app/getting-started.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ A very simple [docker-compose.yml](/docker-compose.yml) has been included to sup
 curl -sSL https://install.python-poetry.org | python3 -
 ```

-3. If you are using an M1 mac, you will need to install postgres as well: `brew install postgresql` (The psycopg2-binary is built from source on M1 macs which requires the postgres executable to be present)
+3. If you are using an M1 Mac, you will need to install Postgres as well: `brew install postgresql` (The psycopg2-binary is built from source on M1 Macs which requires the Postgres executable to be present)

 4. You'll also need [Docker Desktop](https://www.docker.com/products/docker-desktop/)

docs/app/monitoring-and-observability/logging-configuration.md

Lines changed: 4 additions & 4 deletions
@@ -8,7 +8,7 @@ This document describes how logging is configured in the application. The loggin

 We have two separate ways of formatting the logs which are controlled by the `LOG_FORMAT` environment variable.

-`json` (default) -> Produces JSON formatted logs which are machine-readable.
+`json` (default) -> Produces JSON formatted logs, which are machine-readable.

 ```json
 {
@@ -27,7 +27,7 @@ We have two separate ways of formatting the logs which are controlled by the `LO
 }
 ```

-`human-readable` (set by default in `local.env`) -> Produces color coded logs for local development or for troubleshooting.
+`human-readable` (set by default in `local.env`) -> Produces color-coded logs for local development or troubleshooting.

 ![Human readable logs](human-readable-logs.png)

@@ -37,11 +37,11 @@ The [src.logging.flask_logger](../../../app/src/logging/flask_logger.py) module

 ## PII Masking

-The [src.logging.pii](../../../app/src/logging/pii.py) module defines a filter that applies to all logs that automatically masks data fields that look like social security numbers.
+The [src.logging.pii](../../../app/src/logging/pii.py) module defines a filter that applies to all logs and automatically masks data fields that look like social security numbers.

 ## Audit Logging

-* The [src.logging.audit](../../../app/src/logging/audit.py) module defines a low level audit hook that logs events that may be of interest from a security point of view, such as dynamic code execution and network requests.
+* The [src.logging.audit](../../../app/src/logging/audit.py) module defines a low-level audit hook that logs events that may be of interest from a security point of view, such as dynamic code execution and network requests.

 ## Additional Reading

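Neither module's source appears in this commit. For flavor only: low-level audit hooks in Python are typically installed with the standard `sys.addaudithook`; the hook name, event subset, and logger wiring below are assumptions, not the contents of src/logging/audit.py.

```python
import logging
import sys

logger = logging.getLogger("audit")

# Illustrative subset of security-relevant audit events (names are real CPython events).
AUDITED_EVENTS = {"exec", "compile", "socket.connect", "subprocess.Popen"}


def handle_audit_event(event: str, args: tuple) -> None:
    if event in AUDITED_EVENTS:
        # Static log message; event details go in structured attributes.
        logger.info("audit event", extra={"audit.event": event, "audit.args": repr(args)})


sys.addaudithook(handle_audit_event)
```
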
docs/app/monitoring-and-observability/logging-conventions.md

Lines changed: 6 additions & 6 deletions
@@ -8,7 +8,7 @@ Logging is a valuable tool for engineering teams to support products in producti

 ### Make code observability a primary tool for debugging and reasoning about production code

-When a user runs into an issue in production, logs offer one of the primary ways of understanding what happened. This is especially important for situations where we can’t or don’t know how to reproduce the issue. In general it is not feasible to attach a debugger to production systems, or to set breakpoints and inspect the state of the application in production, so logs offer a way to debug through “print statements”.
+When a user runs into an issue in production, logs offer one of the primary ways of understanding what happened. This is especially important for situations where we can’t or don’t know how to reproduce the issue. In general, it is not feasible to attach a debugger to production systems, or to set breakpoints and inspect the state of the application in production, so logs offer a way to debug through “print statements”.

 ### Make it easy for on-call engineers to search for logs in the codebase

@@ -30,21 +30,21 @@ Log querying systems are often limited in their querying abilities. Most log dat

 ### Log event type

-- **INFO** – Use INFO events to log something informational. This can be information that's useful for investigations, debugging, or tracking metrics. Note that events such as a user or client error (such as validation errors or 4XX bad request errors) should use INFO, since those are expected to occur as part of normal operation and do not necessarily indicate anything wrong with the system. Do not use ERROR or WARNING for user or client errors to avoid cluttering error logs.
-- **ERROR** – Use ERROR events if the the system is failed to complete some business operation. This can happen if there is an unexpected exception or failed assertion. Error logs can be used to trigger an alert to on-call engineers to look into a potential issue.
-- **WARNING** – Use WARNING to indicate that there *may* be something wrong with the system but that we have not yet detected any immediate impact on the system's ability to successfully complete the business operation. For example, you can warn on failed soft assumptions and soft constraints. Warning logs can be used to trigger notifications that engineers need to look into during business hours.
+- **INFO** – Use `INFO` events to log something informational. This can be information that's useful for investigations, debugging, or tracking metrics. Note that events such as a user or client error (such as validation errors or 4XX bad request errors) should use `INFO`, since those are expected to occur as part of normal operation and do not necessarily indicate anything wrong with the system. Do not use `ERROR` or `WARNING` for user or client errors to avoid cluttering error logs.
+- **ERROR** – Use `ERROR` events if the system fails to complete some business operation. This can happen if there is an unexpected exception or failed assertion. Error logs can be used to trigger an alert to on-call engineers to look into a potential issue.
+- **WARNING** – Use `WARNING` to indicate that there *may* be something wrong with the system but that we have not yet detected any immediate impact on the system's ability to successfully complete the business operation. For example, you can warn on failed soft assumptions and soft constraints. Warning logs can be used to trigger notifications that engineers need to look into during business hours.

 ### Log messages

-- **Standardized log messages** – Consistently formatted and worded log messages easier to read when viewing many logs at a time, which reduces the chance for human error when interpreting logs. It also makes it easier to write queries by enabling engineers to guess queries and allow New Relic autocomplete to show available log message options to filter by.
+- **Standardized log messages** – Consistently formatted and worded log messages are easier to read when viewing many logs at a time, which reduces the chance of human error when interpreting logs. It also makes it easier to write queries by enabling engineers to guess queries and allowing New Relic autocomplete to show available log message options to filter by.
 - **Statically defined log messages** – Avoid putting dynamic data in log messages. Static messages are easier to search for in the codebase. Static messages are also easier to query for those specific log events without needing to resort to RLIKE queries with regular expressions or LIKE queries.

 ### Attributes

 - **Log primitives not objects** – Explicitly list which attributes you are logging to avoid unintentionally logging PII. This also makes it easier for engineers to know what attributes are available for querying, or for engineers to search for parts of the codebase that logs these attributes.
 - **Structured metadata in custom attributes** – Put metadata in custom attributes (not in the log message) so that it can be used in queries more easily. This is especially helpful when the attributes are used in "group by" clauses to avoid needing to use more complicated queries.
 - **system identifiers** – Log all relevant system identifiers (uuids, foreign keys)
-- **correlation ids** – Log ids that can be shared between frontend events, backend logs, and ideally even sent to external services
+- **correlation ids** – Log ids that can be shared between front-end events, backend logs, and ideally even sent to external services
 - **discrete or discretized attributes** – Log all useful non-PII discrete attributes (enums, flags) and discretized versions of continuous attributes (e.g. comment → has_comment, household → is_married, has_dependents)
 - **Denormalized data** – Include relevant metadata from related entities. Including denormalized (i.e. redundant) data makes queries easier and faster, and removes the need to join or self-join between datasets, which is not always feasible.
 - **Fully-qualified globally consistent attribute names** – Using consistent attribute names everywhere. Use fully qualified attribute names (e.g. application.application_id instead of application_id) to avoid naming conflicts.
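
As a concrete, invented illustration of the conventions above (static message, fully qualified attribute names, a discretized flag), using Python's stdlib logging; the function and attribute values are hypothetical:

```python
import logging

logger = logging.getLogger(__name__)


def log_application_submitted(application):
    # Static, standardized message; dynamic data lives in custom attributes.
    logger.info(
        "application submitted",
        extra={
            "application.application_id": application.application_id,  # fully qualified id
            "application.has_dependents": bool(application.dependents),  # discretized flag
        },
    )
```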

docs/decisions/0000-use-markdown-architectural-decision-records.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ Which format and structure should these records follow?
 Chosen option: "MADR 2.1.2", because

 * Implicit assumptions should be made explicit.
-Design documentation is important to enable people understanding the decisions later on.
+Design documentation is important to enable people to understand the decisions later on.
 See also [A rational design process: How and why to fake it](https://doi.org/10.1109/TSE.1986.6312940).
 * The MADR format is lean and fits our development style.
 * The MADR structure is comprehensible and facilitates usage & maintenance.
