Merge main into staging #451

Merged: 2 commits merged into staging from main on Sep 26, 2024
2 changes: 2 additions & 0 deletions site/_quarto.yml
@@ -249,6 +249,8 @@ website:
contents:
- guide/monitoring/enable-monitoring.qmd
- guide/monitoring/review-monitoring-results.qmd
- text: "Metrics over time"
file: guide/monitoring/work-with-metrics-over-time.qmd

- title: "Developer Framework"
contents:
20 changes: 10 additions & 10 deletions site/developer/model-documentation/work-with-test-results.qmd
@@ -20,28 +20,28 @@ Once generated via the {{< var validmind.developer >}}, view and add the test results

{{< include /guide/model-documentation/_add-content-block.qmd >}}

5. Click {{< fa square-plus >}} and then select **Test-Driven Block**.[^3]
5. Click {{< fa square-plus >}} and then select **Test-Driven**.[^3]

- By default, the `Developer` role can add test-driven blocks within model documentation.
- By default, the `Developer` role can add test-driven blocks within model documentation or ongoing-monitoring plans.
- By default, the `Validator` role can add test-driven blocks within validation reports.

6. Select test results using one of these methods:
6. Select test results:

- Select the tests to insert into the model documentation from the list of available tests.
- Search by name using **{{<fa magnifying-glass >}} Search** on the top-left.
- Search by name using **{{<fa magnifying-glass >}} Search** on the top-left to locate specific test results.

![](test-driven-block-menu.png){width=85% fig-alt="A screenshot showing several test-driven blocks that have been selected for insertion into the model documentation" class="screenshot"}
![Test-driven blocks that have been selected for insertion](test-driven-block-menu.png){width=85% fig-alt="A screenshot showing several test-driven blocks that have been selected for insertion" class="screenshot"}

To preview what is included in a test, select it. By default, selected tests are previewed.
To preview what is included in a test, click on it. By default, the actively selected test is previewed.

7. Click **Insert # Test Results to Document** when you are ready.

8. After inserting the results into your documentation, click on the text to make changes or add comments.[^4]
8. After inserting the results into your document, click on the text to make changes or add comments.[^4]


## View test result metadata

After you have added a test result to your documentation, you can view the following information attached to the result:
After you have added a test result to your document, you can view the following information attached to the result:

- History of values for the result
- Which users wrote those results
@@ -51,7 +51,7 @@ After you have added a test result to your documentation, you can view the following

2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^5]

3. In the left sidebar that appears for your model, click **Documentation** or **Validation Report**.
3. In the left sidebar that appears for your model, click **Documentation**, **Validation Report**, or **Ongoing Monitoring**.

4. Locate the test result whose metadata you want to view.

@@ -60,7 +60,7 @@ After you have added a test result to your documentation
- On the test result timeline, click on the **{{< fa chevron-down >}}** associated with a test run to expand for details.
- When you are done, you can either click **Cancel** or **{{< fa x >}}** to close the metadata menu.

![](test-run-details.gif){width=85% fig-alt="A gif showcasing detail expansion of test runs on the test result timeline" class="screenshot"}
![Detail expansion of test runs on the test result timeline](test-run-details.gif){width=85% fig-alt="A gif showcasing detail expansion of test runs on the test result timeline" class="screenshot"}


## What's next
2 changes: 1 addition & 1 deletion site/guide/model-documentation/_add-content-block.qmd
@@ -4,7 +4,7 @@

1. In the left sidebar that appears for your model, click **Documentation**, **Validation Report**, or **Ongoing Monitoring**.

You can now jump to any section of the model documentation or validation report by expanding the table of contents on the left and selecting the relevant section you would like to add content to, such as **1.1 Model Overview**.
You can now jump to any section of the model documentation, validation report, or ongoing monitoring plan by expanding the table of contents on the left and selecting the relevant section you would like to add content to, such as **1.1 Model Overview**.

1. Hover your mouse over the space where you want your new block to go until a horizontal dashed line with a {{< fa square-plus >}} sign appears that indicates you can insert a new block:

28 changes: 15 additions & 13 deletions site/guide/model-documentation/work-with-content-blocks.qmd
@@ -53,11 +53,11 @@ Use ValidMind to assist you with generating content via AI!

5. Click {{< fa square-plus >}} and then select one of the available options:

- **Text Block** — Adds a new section with a blank content block. After the new content block has been added, click {{< fa pencil >}} to edit the contents of the section like any other.
- **Test-Driven Block**[^5] — Adds a new section with test results.
- **Metric Over Time** — Adds a new section with metric over time results.
- **Text** — Adds a new section with a blank content block. After the new content block has been added, click {{< fa pencil >}} to edit the contents of the section like any other.
- **Test-Driven**[^5] — Adds a new section with test results.
- **Metric Over Time**[^6] — Adds a new section with metric over time results.

6. After adding the block to your documentation, click on the text to make changes or add comments.[^6]
6. After adding the block to your documentation, click on the text to make changes or add comments.[^7]

### Add mathematical formulas

@@ -88,19 +88,19 @@ While editing a simple text block, you can have ValidMind assist you with generating
4. After you insert the AI-generated draft, click on the text box to make the necessary edits and adjustments to your copy:

- Ensure that content is in compliance with the quality guidelines outlined by your organization.
- Use the content editing toolbar[^7] just as you would with any other text block.
- Use the content editing toolbar[^8] just as you would with any other text block.

![Generating content with AI within a simple text block](generate-with-ai.gif){width=100% fig-alt="An animation that showcases the Generate with AI feature within a simple text block" class="screenshot"}

::: {.callout}
When generating content drafts with AI, accepted versions and edits are retained in your Model Activity[^8] just like other updates to your documentation, reports, or plans.
When generating content drafts with AI, accepted versions and edits are retained in your Model Activity[^9] just like other updates to your documentation, reports, or plans.
:::

## Remove content blocks

1. In the left sidebar, click **Model Inventory**.

2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^9]
2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^10]

3. In the left sidebar that appears for your model, click **Documentation**, **Validation Report**, or **Ongoing Monitoring**.

@@ -114,15 +114,15 @@

![Deleting a content block via the text toolbar](remove-text-block.gif){fig-alt="A gif showing the process of deleting a content block via the text toolbar" class="screenshot"}

- In the single-button toolbar for the test-driven block
- In the single-button toolbar for test-driven or metric over time blocks

![Deleting a content block via the single-button toolbar for test-driven blocks](remove-test-driven-block.gif){fig-alt="A gif showing the process of deleting a content block via the single-button toolbar for test-driven blocks" class="screenshot"}


6. After you confirm, the content block is removed.

::: {.callout-important}
Test-driven blocks can be re-added later on but **text blocks are currently deleted permanently**.
Test-driven or metric over time blocks can be re-added later on but **text blocks are currently deleted permanently**.
:::

## What's next
@@ -145,10 +145,12 @@

[^5]: [Work with test results](/developer/model-documentation/work-with-test-results.qmd)

[^6]: [Collaborate with others](/guide/model-documentation/collaborate-with-others.qmd)
[^6]: [Work with metrics over time](/guide/monitoring/work-with-metrics-over-time.qmd)

[^7]: [Content editing toolbar](#content-editing-toolbar)
[^7]: [Collaborate with others](/guide/model-documentation/collaborate-with-others.qmd)

[^8]: [View model activity](/guide/model-inventory/view-model-activity.qmd)
[^8]: [Content editing toolbar](#content-editing-toolbar)

[^9]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)
[^9]: [View model activity](/guide/model-inventory/view-model-activity.qmd)

[^10]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)
19 changes: 12 additions & 7 deletions site/guide/monitoring/enable-monitoring.qmd
@@ -21,7 +21,7 @@ Enable monitoring with two steps:
## Prerequisites

- [x] {{< var link.login >}}
- [x] You are a `Developer` or `Validator`, or assigned another role with sufficient permissions to perform the tasks in this guide.[^1]
- [x] You are a `Developer` or assigned another role with sufficient permissions to perform the tasks in this guide.[^1]
- [x] The model is in the pre-approval stage for performance testing or the model has been approved and is in production.

:::
@@ -40,9 +40,12 @@ To enable ongoing monitoring for a model, add `monitoring=True` to your code snippet
2. Copy the code snippet for the model:

- In the left sidebar that appears for your model, click **{{< fa rocket >}} Getting Started**.
- Locate the code snippet and click **{{< fa regular copy >}} Copy snippet to clipboard**.
- Locate the code snippet and set **[enable ongoing monitoring]{.smallcaps}** to true.
- Click **{{< fa regular copy >}} Copy snippet to clipboard**.

3. Paste the code snippet into your code and add `monitoring=True` to the `vm.init` method, similar to this example:
3. Paste the snippet into your development source code.[^4]

Confirm that `monitoring=True` is present in the `vm.init` method, similar to this example:

```python
import validmind as vm

# Placeholder values shown for illustration; use the credentials from
# your own code snippet on the Getting Started page.
vm.init(
    api_host="...",
    api_key="...",
    api_secret="...",
    model="...",
    monitoring=True,
)
```

@@ -62,7 +65,7 @@ Before you can start sending ongoing monitoring data from your developer environment

1. In the {{< var validmind.platform >}}, click **Model Inventory** in the left sidebar.

2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^4]
2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^5]

3. In the left sidebar, click **{{< fa desktop >}} Ongoing Monitoring**.

@@ -87,7 +90,7 @@ APIRequestError: Please select an ongoing monitoring template on the ValidMind platform

## What's next

After you have enabled ongoing monitoring and run your code to generate some output, you can start reviewing the monitoring results.[^5]
After you have enabled ongoing monitoring and run your code to generate some output, you can start reviewing the monitoring results.[^6]


<!-- FOOTNOTES -->
@@ -98,6 +101,8 @@ After you have enabled ongoing monitoring and run your code to generate some output

[^3]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)

[^4]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)
[^4]: [Install and initialize the client library](/developer/model-documentation/install-and-initialize-client-library.qmd)

[^5]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)

[^5]: [Review monitoring results](review-monitoring-results.qmd)
[^6]: [Review monitoring results](review-monitoring-results.qmd)
Binary file added site/guide/monitoring/metric-over-time-data.png
Binary file added site/guide/monitoring/metrics-over-time-menu.png
4 changes: 3 additions & 1 deletion site/guide/monitoring/ongoing-monitoring.qmd
@@ -4,15 +4,17 @@ date: last-modified
listing:
- id: ongoing-monitoring-listing
type: grid
grid-columns: 2
max-description-length: 250
sort: false
fields: [title, description]
contents:
- enable-monitoring.qmd
- review-monitoring-results.qmd
- work-with-metrics-over-time.qmd
- ../../notebooks/code_samples/ongoing_monitoring/quickstart_customer_churn_ongoing_monitoring.ipynb
- id: ongoing-monitoring
contents: "../../tests/ongoing_monitoring/**"
contents: "../../tests/ongoing_monitoring/*.md"
type: grid
grid-columns: 2
max-description-length: 250
2 changes: 1 addition & 1 deletion site/guide/monitoring/review-monitoring-results.qmd
@@ -20,7 +20,7 @@ To ensure your model continues to perform as expected
## Prerequisites

- [x] {{< var link.login >}}
- [x] You are a `Developer` or `Validator`, or assigned another role with sufficient permissions to perform the tasks in this guide.[^1]
- [x] You are a `Developer` or assigned another role with sufficient permissions to perform the tasks in this guide.[^1]
- [x] The model is in the pre-approval stage for performance testing or the model has been approved and is in production.

:::
96 changes: 96 additions & 0 deletions site/guide/monitoring/work-with-metrics-over-time.qmd
@@ -0,0 +1,96 @@
---
title: "Work with metrics over time"
date: last-modified
---

Once generated via the {{< var validmind.developer >}}, view and add metrics over time to your ongoing monitoring plans in the {{< var validmind.platform >}}.

Metrics over time refer to the continued monitoring of a model's performance once it is deployed. Tracking how a model performs as new data is introduced or conditions change ensures that it remains accurate and reliable in real-world environments where data distributions or market conditions shift.

- Model performance is determined by continuously measuring metrics and comparing them over time to detect degradation, bias, or shifts in the model's output.
- Performance data is collected and tracked over time, often using a rolling window approach or real-time monitoring tools with the same metrics used in testing, but observed across different periods.
- Continuous tracking helps to identify if and when a model needs to be recalibrated, retrained, or even replaced due to performance deterioration or changing conditions.
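The rolling-window comparison described above can be sketched in plain Python. This is an illustrative sketch only: the window size, the degradation threshold, and the use of accuracy as the tracked metric are assumptions for demonstration, not part of the {{< var validmind.developer >}}.

```python
def rolling_accuracy(outcomes, window=3):
    """Compute accuracy over consecutive windows of (prediction, actual) pairs."""
    scores = []
    for start in range(0, len(outcomes) - window + 1, window):
        batch = outcomes[start:start + window]
        correct = sum(1 for pred, actual in batch if pred == actual)
        scores.append(correct / len(batch))
    return scores

# Hypothetical (prediction, actual) pairs collected in production over time
outcomes = [(1, 1), (0, 0), (1, 1), (1, 0), (0, 1), (1, 1), (0, 1), (0, 1), (1, 0)]
per_window = rolling_accuracy(outcomes, window=3)

# Flag windows that fall well below the first (baseline) window, signaling
# that the model may need to be recalibrated or retrained
baseline = per_window[0]
alerts = [i for i, acc in enumerate(per_window) if acc < baseline - 0.25]
```

In practice the same idea applies to whichever metrics you already log for the model; the {{< var validmind.platform >}} renders the per-window values as a timeline.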

::: {.prereq}

## Prerequisites

- [x] {{< var link.login >}}
- [x] Metrics over time have already been logged via the {{< var validmind.developer >}} for your model.[^1]
- [x] You are a `Developer` or assigned another role with sufficient permissions to perform the tasks in this guide.[^2]

:::

## Add metrics over time

1. In the left sidebar, click **Model Inventory**.

2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^3]

3. In the left sidebar that appears for your model, click **Documentation** or **Ongoing Monitoring**.

You can now jump to any section of the model documentation or ongoing monitoring plan by expanding the table of contents on the left and selecting the relevant section you would like to add content to, such as **1.1 Model Monitoring Performance History**.

4. Hover your mouse over the space where you want your new block to go until a horizontal dashed line with a {{< fa square-plus >}} sign appears that indicates you can insert a new block:

![Adding a content block in the UI](/guide/model-documentation/add-content-block.gif){width=90% fig-alt="A gif showing the process of adding a content block in the UI" class="screenshot"}

5. Click {{< fa square-plus >}} and then select **Metric Over Time**.[^4]

By default, only the `Developer` role can add metrics over time within model documentation or ongoing monitoring plans.

6. Select metric over time results:

- Select the metric over time to insert into the model documentation from the list of available metrics.
- Search by name using **{{<fa magnifying-glass >}} Search** on the top-left to locate specific metrics.

![Metric over time blocks that have been selected for insertion](metrics-over-time-menu.png){width=85% fig-alt="A screenshot showing several metric over time blocks that have been selected for insertion" class="screenshot"}

To preview what is included in a metric, click on it. By default, the actively selected metric is previewed.

7. Click **Insert # Metric(s) Over Time to Document** when you are ready.

8. After inserting the metrics into your document, review the data to confirm that it is accurate and relevant.


## View metric over time metadata

After you have added metrics over time to your document, you can view the following information attached to the result:

- Date and time the metric was recorded
- Who updated the metric
- The numeric value of the metric
- Any additional parameters
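Conceptually, each entry on a metric's **Data** tab corresponds to a record like the one below. This is a hypothetical shape for illustration; the field names and the example values are assumptions, not the actual schema used by the {{< var validmind.platform >}}.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MetricRecord:
    """Illustrative shape of one metric-over-time entry."""
    key: str               # metric name, e.g. "AUC Score"
    value: float           # the numeric value of the metric
    recorded_at: datetime  # date and time the metric was recorded
    updated_by: str        # who updated the metric
    params: dict = field(default_factory=dict)  # any additional parameters

# A hypothetical logged entry
record = MetricRecord(
    key="AUC Score",
    value=0.87,
    recorded_at=datetime(2024, 9, 26, tzinfo=timezone.utc),
    updated_by="jane.doe",
    params={"dataset": "production_2024_q3"},
)
```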

1. In the left sidebar, click **Model Inventory**.

2. Select a model by clicking on it or find your model by applying a filter or searching for it.[^5]

3. In the left sidebar that appears for your model, click **Documentation** or **Ongoing Monitoring**.

4. Locate the metric whose metadata you want to view.

5. Under the metric's name, click on the **Data** tab.

![The Data tab within a metric over time](metric-over-time-data.png){width=85% fig-alt="A screenshot showing the Data tab within a metric over time" class="screenshot"}


## What's next

- [Working with model documentation](/guide/model-documentation/working-with-model-documentation.qmd)
- [Work with content blocks](/guide/model-documentation/work-with-content-blocks.qmd)
- [Collaborate with others](/guide/model-documentation/collaborate-with-others.qmd)
- [View model activity](/guide/model-inventory/view-model-activity.qmd)


<!-- FOOTNOTES -->

[^1]: [Intro to Unit Metrics](/notebooks/how_to/run_unit_metrics.ipynb)

[^2]: [Manage permissions](/guide/configuration/manage-permissions.qmd)

[^3]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)

[^4]: [Work with content blocks](/guide/model-documentation/work-with-content-blocks.qmd)

[^5]: [Working with the model inventory](/guide/model-inventory/working-with-model-inventory.qmd#search-filter-and-sort-models)