
Part 1 — +"Breaking changes and deprecation" page under releases to support the new processes #650


Merged

Conversation


@validbeck validbeck commented Feb 6, 2025

Internal Notes for Reviewers

sc-8307

To start the journey for our new refined processes, I have added a page under "Releases":

Breaking changes & deprecations

LIVE PREVIEW

  • The rationale: as a user, I want to be able to easily locate breaking changes and deprecations without skimming through what might end up being hundreds of (sometimes very long!) releases for the right topic.
  • This page provides a streamlined timeline as well, for both external and internal reference.
  • The interactive history tables provide additional context in the form of links to the actual release announcement, any supplementary blog posts, and a clearly defined feature removal date.

How it works

An R script (site/releases/breaking-changes/breaking_changes.R), with some additional JavaScript for filtering, turns the .csv files into interactive tables. The .qmd page simply calls the reusable functions.

Why did I choose this path?

  • Being able to easily filter by product area, version, and type is something users would definitely appreciate, especially as we start to scale and this page actually becomes more than two entries.
  • Authoring experience: Markdown tables are a pain in the behind. I thought about having to manually add an entry each time and my soul died inside. CSVs are way easier to edit manually, and we can contain the entries by year. Also this way, we can further automate the process: Refer to PART 2
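
For illustration, a row in one of those yearly history CSVs might look something like this. The column names and the entry are invented for this sketch (loosely based on the fields described above: product area, version, type of change, announcement, and removal date), not the actual schema:

```
Title,Product area,Version,Type,Announcement,Removal date
Legacy metrics endpoint deprecated,ValidMind Library,v2.1.0,Deprecation,2025-02-06 release notes,2025-08-01
```

Adding an entry is then a one-line append rather than realigning an entire markdown table.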

🎉 Documented here: Breaking changes and deprecation guide

I also added this note in the main repo README:

(Screenshot of the README note, Feb 7, 2025.)

You can take a look at this demo PR here to see some filtering in action for a fake 2025 table with more data: #652

n.b.
  • I tested this with the Docker build and it works flawlessly there too.
  • The R environment is also added to the workflows so the online render runs, thanks to the new composite action setup-r. The workflow step is just (the action itself is defined in .github/actions/setup-r/action.yml):
    - name: Setup R environment
      uses: ./.github/actions/setup-r
  • But Beck, I can already hear you say, I'm sure this complexity adds runtime! I thought of that — the action caches the environment, so the "high cost" is only paid once. Proof it works:

(Screenshots: first workflow run vs. subsequent cached workflow runs.)

Part 2

I can also already hear you telling me that adding a row to a CSV is still too much work. I agree. This is just the setup for stuff to come.

Part two (and maybe three, depending on how far I get?) will include:

  • Wrapping a new breaking-changes tag into the current release script/notebook
  • (If I can figure it out) A script/notebook that automatically creates the yearly history CSVs from the template file, plunks the associated code cell onto the history page if that year doesn't already exist, and inserts a new row with your deprecation/breaking change at the top of the yearly CSV, as long as you provide the required information when the input dialogue prompts you.
    • I've already set up the .qmd file to include a marker with this in mind.

To me, being able to edit a row in an isolated CSV file is already greatly superior to having to manually type out a ^*&(ing markdown table row, because you can never count the columns properly, not to mention this has way more functionality than a normal table.

There's much more involvement than just "spit out a list" as well — including the collateral, identifying the associated release version, etc. This is just all part of the "release storytelling" clean-up to me, which is where the human SHOULD be in the loop to some extent.

External Release Notes

We've introduced an interactive historical record of breaking changes and deprecations to the ValidMind AI risk platform to better inform users about important updates to our product.

In addition to announcement and planned obsolescence dates, this history provides easy access to any relevant messaging associated with the change, such as version release notes and blog posts. These histories, grouped by calendar year, are searchable and can be filtered by product area (ValidMind Library, ValidMind Platform), associated versioning, and the type of change.

@validbeck validbeck added the internal Not to be externalized in the release notes label Feb 6, 2025
@validbeck validbeck self-assigned this Feb 6, 2025
@validbeck validbeck marked this pull request as draft February 6, 2025 20:10

github-actions bot commented Feb 6, 2025

A PR preview is available: Preview URL


@validbeck validbeck marked this pull request as ready for review February 6, 2025 22:31
@nrichers
Collaborator

nrichers commented Feb 6, 2025

  • The rationale: as a user, I want to be able to easily locate breaking changes and deprecations without skimming through what might end up being hundreds of (sometimes very long!) releases for the right topic.

I would automate this: breaking changes need to be in the release notes, but we should be able to generate a list of breaking changes in one location that you can refer to.

Also note, there are both breaking changes and deprecations we need to track — the process needs to be defined for both.

In GitHub this starts with the labels:

  • deprecation — Already exists and has been used
  • breaking-change — Doesn't exist yet, needs to be scooped up by the release notes

@validbeck
Collaborator Author

I would automate this: breaking changes need to be in the release notes, but we should be able to generate a list of breaking changes in one location that you can refer to.

I'm still working on this why are you here reviewing when it wasn't requested yet! 😆

@validbeck validbeck marked this pull request as draft February 6, 2025 23:38

@validbeck validbeck changed the title Added a "Breaking changes" page under releases to support the new deprecation process Part ` — +"Breaking changes and deprecation" page under releases to support the new processes Feb 7, 2025
@validbeck validbeck changed the title Part ` — +"Breaking changes and deprecation" page under releases to support the new processes Part 1 — +"Breaking changes and deprecation" page under releases to support the new processes Feb 7, 2025

@validbeck validbeck requested a review from nrichers February 7, 2025 23:18

github-actions bot commented Feb 7, 2025

PR Summary

This pull request introduces a new GitHub Action to set up an R environment and install necessary R packages for rendering interactive tables in the documentation. The action is defined in .github/actions/setup-r/action.yml and includes steps to set up R, cache R packages, install required R packages (DT, readr, stringr, lubridate), and verify the R installation.

The PR also updates several GitHub workflows (deploy-docs-prod.yaml, deploy-docs-staging.yaml, validate-docs-site.yaml) to include the new R setup step, ensuring that the R environment is prepared before rendering documentation.

Additionally, the documentation is enhanced with a new section on breaking changes and deprecations. This includes a guide (site/releases/breaking-changes/README.md) on how to add breaking changes or deprecations to the history, and a new Quarto document (site/releases/breaking-changes/breaking-changes.qmd) to display these changes as interactive tables. Supporting R scripts (site/releases/breaking-changes/breaking_changes.R) and example CSV files are also added to facilitate this functionality.

Test Suggestions

  • Verify that the R setup action correctly installs R and the specified packages on different operating systems.
  • Test the caching mechanism to ensure R packages are cached and restored correctly across workflow runs.
  • Check that the interactive tables render correctly in the documentation site using the installed R packages.
  • Validate that the breaking changes and deprecations guide is clear and accurate by following the steps to add a new entry.
  • Ensure that the Quarto document correctly displays the interactive tables with the expected data from CSV files.


@nrichers nrichers left a comment


LGTM! 🚀 Let's add this Part 1 for now as a great step towards having a dedicated page for breaking changes and deprecations.

Assorted comments

How closely related is your solution to https://quarto.org/docs/dashboards/data-display.html#dt? Both use DT, with some added packages to parse the CSV and display the table with filtering, but I wonder if a render_table function in the breaking_changes.R script is actually necessary if you can just use DT::datatable out of the box?

A script/notebook to automatically create the yearly history CSVs from the template file

The easiest approach might be to add a step to our existing release notes notebook: Grab the info out of the PRs and add it to a new CSV if it doesn't exist already whenever the first deprecation/breaking change gets announced, with a message to update the details. Reviewing that file then becomes part of the release notes process and should include our PM.

You'll need the links for the release notes and any blog posts associated with the breaking change or deprecation to include in the table, so make sure those are published first unless you want to fill them in at a later date.

For automation, I think we should link to the release notes only. A blog post link will always be manual — if we have the link, it can go into the release notes, where the context is also explained.

Nice work!

@validbeck
Collaborator Author

validbeck commented Feb 10, 2025

How closely related is your solution to https://quarto.org/docs/dashboards/data-display.html#dt? Both use DT, with some added packages to parse the CSV and display the table with filtering, but I wonder if a render_table function in the breaking_changes.R script is actually necessary if you can just use DT::datatable out of the box?

This is where I started, but the filtering options out of the box didn't quite work:

... I THOUGHT I had sent a commit up and could revert to show you the screencap but I didn't, apparently I only sent it up after it started to work the way I expected. You'll just have to believe me. 💀

Stuff was greyed out and the filters that I wanted weren't usable (I assume because we're not actually feeding it "data" in the way that it expected).

The easiest approach might be to add a step to our existing release notes notebook: Grab the info out of the PRs and add it to a new CSV if it doesn't exist already whenever the first deprecation/breaking change gets announced, with a message to update the details. Reviewing that file then becomes part of the release notes process and should include our PM.

I'm going to disagree here: you need the finalised title of the feature in the releases and the release notes to be published, so adding stuff to this table should be part of the wrap-up of the storytelling and not generated at the beginning. It's also WAY easier to troubleshoot an automation if the parts are granular; for example, it'll be easier for me to write the script/notebook if I don't have to worry about the complexity of the existing setup.

For automation, I think we should link to the release notes only. A blog post link will always be manual — if we have the link, it can go into the release notes, where the context is also explained.

Cool, I will remove this from the table. I wasn't sure on this one myself so I'm fine with taking it out and just making sure it goes in the release notes:

(Screenshot of the updated guidance, Feb 10, 2025.)

(Also clarified the definition for breaking changes.)


@validbeck validbeck merged commit b1a8b3c into main Feb 10, 2025
3 checks passed
@validbeck validbeck deleted the beck/sc-8307/draft-a-breaking-changes-deprecation-process branch February 10, 2025 18:25
@nrichers
Collaborator

Stuff was greyed out and the filters that I wanted weren't usable (I assume because we're not actually feeding it "data" in the way that it expected).

Your comment reminded me of something I pondered while reviewing your PR: how often do we expect to publish deprecations or breaking changes, such that they will actually require filtering? So far, we've added exactly one thing in two years, though there were other early deprecations and removals that went unmentioned.

Put differently, having this filtering ability now is great — it prepares us for the day when we need it — but it’s currently a solution for which we might not have a problem for several years to come (it’s still great to have, all the same!). This contrasts sharply with other content sets I’ve managed, where multiple related products had their own release notes, each with breaking changes and deprecations.

I'm going to disagree here: you need the finalised title of the feature in the releases and the release notes to be published, so adding stuff to this table should be part of the wrap-up of the storytelling and not generated at the beginning.

Fair enough, but let’s keep the manual work as low as possible. This process will require regular human input. If we can integrate it with something else—and release notes are a direct touchpoint for deprecations and breaking changes—then we’re better off. Editing a generated piece of text or a row in a CSV is no different from editing the release notes we already generate.

@validbeck
Collaborator Author

This contrasts sharply with other content sets I’ve managed, where multiple related products had their own release notes, each with breaking changes and deprecations.

Yes, I also think this is the better solution, BUT we don't currently do that today. Right now we sort of jumble everything together, hence our huge releases that need so much TLC. Part of the issue, though, is that our two "halves" are integrated. Sometimes we release a new feature in the library that has a visual component that shows up in the platform, for example.

Editing a generated piece of text or a row in a CSV is no different from editing the release notes we already generate.

I agree on a technical level, but again, since I'm the one building this currently and I have a vague idea of where my skill limit is (low), I'm going to start with what's doable and improve from there, instead of aiming for the sky and getting frustrated at not being able to get there.

nrichers added a commit that referenced this pull request Feb 14, 2025
* Adding latest `RawData` notebook to docs (#647)

* Pulling in latest RawData notebook

* Added notebook to testing overview

* Added notebook to FAQ - testing

* feat: make lighter version of docs-site target

* Fix notebook link (#651)

* Part 1 — +"Breaking changes and deprecation" page under releases to support the new processes (#650)

* Adding a breaking changes page

* Details

* Editing headers

* Adding product area

* Folder

* Testing R table

* It woooorks

* Setup for multiple entires & adding search back

* Adjusting test data for clarity

* Single-sourcing the functions

* Moving functions to scri[pt for cleaner exp

* Column widths back in

* Cleanup & templates

* Templating README

* Tweaking

* Adding version and type columns

* Filters for version & type

* Examples and more testing

* Intro to README

* Moved history into its own folder

* Instructions for +year

* Add an entry draft

* Entry example table

* Testing R setup

* Switching placement

* Forgot a package

* Cleanup & caching

* Forgot a package again

* Adding R env to staging & prod flows

* Naming the code cells for output cleanliness

* Removed fake 2025 example

* Wow, typo

* Wording adjustment

* Removed blog posts

* Fix notebook link (#651) (#654)

Co-authored-by: Nik Richers <nik@validmind.ai>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

* Define next iteration of training content (#649)

* Rough in some topics

* WIP content noodling

* Fix merge conflict

* Content rework

* Content edits

* Add mapping from questions to courses

* Content edits

* Updated content, move videos

* Update learning paths

* Update learning paths

* Expand learning paths

* Update learning paths, FAQ

* Edits

* Update course content

* Update FAQ

* Fix margin callout, shorten sample training plan

* Minor edits

* Add docs links to FAQ

* Update FAQ and learning paths

* Training FAQ updates

* Clean up temporary CSS, add small button CSS

* Hide .attn sections to be able to publish the page

* Update FAQ page formatting, add Pandas link

* Add some FAQ-elements

* Best offer for formatting an shenanigans

* Hide more working notes stuff, add .attn to learning path course cards

* More learning paths formatting improvements

* Simplify language and remove old file

* Add callouts to learning paths

* Modifying some style stuff

* Redoing buttons for readability

* Quick tweak to sidebar

* Overview rename & cleanup

* Reverting sidebar & toc for rest of site

* Update site/training/program/sample-training-plan.qmd

Co-authored-by: Beck <164545837+validbeck@users.noreply.github.com>

* Training sidebar tweaks

* Remove extra newline and commented out line

* Improve 'coming soon' text

---------

Co-authored-by: Beck <164545837+validbeck@users.noreply.github.com>

---------

Co-authored-by: Beck <164545837+validbeck@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Spencer Krum <nibz@validmind.ai>
Co-authored-by: Spencer Krum <nibz@spencerkrum.com>
@validbeck validbeck added documentation Improvements or additions to documentation and removed internal Not to be externalized in the release notes labels Mar 4, 2025