Standardize KV-Store Docs #24888
Labels
- 🤖:docs: changes to documentation and examples (.md, .rst, .ipynb files) and to the docs/ folder
- help wanted: good issue for contributors
To make our KV-store integrations as easy to use as possible, we need to make sure their docs are thorough and standardized. There are two parts to this: updating the KV-store docstrings and updating the actual integration docs.
This needs to be done for each KV-store integration, ideally with one PR per KV-store.
Related to broader issues #21983 and #22005.
Docstrings
Each KV-store class docstring should have the sections shown in the Appendix below. The sections should have input and output code blocks when relevant.
To build a preview of the API docs for the package you're working on, run (from the root of the repo):

```bash
make api_docs_clean; make api_docs_quick_preview API_PKG=openai
```
where `API_PKG=` should be the parent directory that houses the edited package (e.g. community, openai, anthropic, huggingface, together, mistralai, groq, fireworks, etc.). This should be quite fast for all the partner packages.

Doc pages
Each KV-store docs page should follow this template.
Here is an example: https://python.langchain.com/v0.2/docs/integrations/stores/in_memory/
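As a rough sketch of the core usage cells such a page walks through, here is the in-memory store as a stand-in (the `mset`/`mget`/`mdelete`/`yield_keys` methods come from the shared `BaseStore` interface, so substitute the integration's own `ByteStore` class and setup):

```python
from langchain_core.stores import InMemoryByteStore

# Instantiate the store (substitute the integration's ByteStore class here).
store = InMemoryByteStore()

# Set values for one or more keys.
store.mset([("key1", b"value1"), ("key2", b"value2")])

# Get values for the given keys; missing keys come back as None.
print(store.mget(["key1", "key3"]))  # [b'value1', None]

# Iterate over stored keys, optionally filtered by prefix.
print(list(store.yield_keys(prefix="key")))  # ['key1', 'key2']

# Delete the given keys.
store.mdelete(["key2"])
```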
You can use the `langchain-cli` to quickly get started with a new KV-store integration docs page (run from the root of the repo):

```bash
poetry run pip install -e libs/cli
poetry run langchain-cli integration create-doc --name "foo-bar" --name-class FooBar --component-type kv_store --destination-dir ./docs/docs/integrations/stores/
```
where `--name` is the integration package name without the "langchain-" prefix and `--name-class` is the class name without the "ByteStore" suffix. This will create a template doc with some autopopulated fields at docs/docs/integrations/stores/foo_bar.ipynb.

To build a preview of the docs you can run (from root):
```bash
make docs_clean
make docs_build
cd docs/build/output-new
yarn
yarn start
```
Appendix
Expected sections for the KV-store class docstring.
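As a minimal sketch of what a conforming docstring might look like, assuming the expected sections mirror the standardization used for other integration types (Setup, Key init args, Instantiate, Set, Get, Delete), with input and output code blocks where relevant, and using hypothetical `langchain-foo-bar` / `FooBarByteStore` names:

```python
from langchain_core.stores import BaseStore


class FooBarByteStore(BaseStore[str, bytes]):
    """FooBar key-value store integration.

    (Hypothetical integration; the section list below is an assumption
    based on docstring standards for other integration types.)

    Setup:
        Install ``langchain-foo-bar``.

        .. code-block:: bash

            pip install -U langchain-foo-bar

    Key init args:
        host: str
            Hostname of the FooBar server.
        port: int
            Port of the FooBar server.

    Instantiate:
        .. code-block:: python

            from langchain_foo_bar import FooBarByteStore

            store = FooBarByteStore(host="localhost", port=1234)

    Set:
        .. code-block:: python

            store.mset([("key1", b"value1"), ("key2", b"value2")])

    Get:
        .. code-block:: python

            store.mget(["key1", "key2"])

        .. code-block:: none

            [b'value1', b'value2']

    Delete:
        .. code-block:: python

            store.mdelete(["key1"])
    """
```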