
Commit 7372156

Merge branch 'master' into google-stt

2 parents a3acf26 + ab3c124

91 files changed: +130, -124 lines


.github/CONTRIBUTING.md
Lines changed: 2 additions & 2 deletions

@@ -77,7 +77,7 @@ tell Poetry to use the virtualenv python environment (`poetry config virtualenvs

 There are two separate projects in this repository:
 - `langchain`: core langchain code, abstractions, and use cases
-- `langchain.experimental`: see the [Experimental README](../libs/experimental/README.md) for more information.
+- `langchain.experimental`: see the [Experimental README](https://github.com/langchain-ai/langchain/tree/master/libs/experimental/README.md) for more information.

 Each of these has its own development environment. Docs are run from the top-level makefile, but development
 is split across separate test & release flows.

@@ -129,7 +129,7 @@ To run unit tests in Docker:
 make docker_tests
 ```

-There are also [integration tests and code-coverage](../libs/langchain/tests/README.md) available.
+There are also [integration tests and code-coverage](https://github.com/langchain-ai/langchain/tree/master/libs/langchain/tests/README.md) available.

 ### Formatting and Linting
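Both hunks replace repo-relative links (`../libs/...`) with absolute `https://github.com/langchain-ai/langchain/tree/master/...` URLs, which still resolve when CONTRIBUTING.md is copied into the docs build (see the `docs/.local_build.sh` change below). A minimal sketch of that rewrite, using the `tree/master` prefix shown in the hunks; the `absolutize` helper is hypothetical, not part of the commit:

```python
import posixpath

# Prefix taken from the URLs in the hunks above.
REPO_TREE = "https://github.com/langchain-ai/langchain/tree/master/"

def absolutize(relative_link: str, containing_dir: str = ".github") -> str:
    """Resolve a repo-relative Markdown link against the directory that holds
    the file, then prepend the GitHub tree URL so it also works outside the repo."""
    resolved = posixpath.normpath(posixpath.join(containing_dir, relative_link))
    return REPO_TREE + resolved

print(absolutize("../libs/experimental/README.md"))
# https://github.com/langchain-ai/langchain/tree/master/libs/experimental/README.md
print(absolutize("../libs/langchain/tests/README.md"))
# https://github.com/langchain-ai/langchain/tree/master/libs/langchain/tests/README.md
```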

docs/.local_build.sh
Lines changed: 1 addition & 0 deletions

@@ -14,6 +14,7 @@ cd ../_dist
 poetry run python scripts/model_feat_table.py
 poetry run nbdoc_build --srcdir docs
 cp ../cookbook/README.md src/pages/cookbook.mdx
+cp ../.github/CONTRIBUTING.md docs/contributing.md
 poetry run python scripts/generate_api_reference_links.py
 yarn install
 yarn start

docs/docs/get_started/quickstart.mdx
Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ import CodeBlock from "@theme/CodeBlock";
 </Tabs>


-For more details, see our [Installation guide](/docs/get_started/installation.html).
+For more details, see our [Installation guide](/docs/get_started/installation).

 ## Environment setup

docs/docs/guides/deployments/index.mdx
Lines changed: 4 additions & 4 deletions

@@ -20,11 +20,11 @@ This guide aims to provide a comprehensive overview of the requirements for depl

 Understanding these components is crucial when assessing serving systems. LangChain integrates with several open-source projects designed to tackle these issues, providing a robust framework for productionizing your LLM applications. Some notable frameworks include:

-- [Ray Serve](/docs/ecosystem/integrations/ray_serve.html)
+- [Ray Serve](/docs/ecosystem/integrations/ray_serve)
 - [BentoML](https://github.com/bentoml/BentoML)
-- [OpenLLM](/docs/ecosystem/integrations/openllm.html)
-- [Modal](/docs/ecosystem/integrations/modal.html)
-- [Jina](/docs/ecosystem/integrations/jina.html#deployment)
+- [OpenLLM](/docs/ecosystem/integrations/openllm)
+- [Modal](/docs/ecosystem/integrations/modal)
+- [Jina](/docs/ecosystem/integrations/jina#deployment)

 These links will provide further information on each ecosystem, assisting you in finding the best fit for your LLM deployment needs.
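The change repeated here and in the notebook diffs below is dropping the `.html` suffix from internal `/docs/...` links so they point at extensionless doc paths. As an illustration only, not the tooling behind this commit, a sweep over the `.mdx` sources could look like the following sketch, assuming the `docs/` directory layout used above:

```python
import re
from pathlib import Path

# Internal doc links ending in ".html", optionally with a "#fragment",
# e.g. (/docs/ecosystem/integrations/jina.html#deployment).
HTML_LINK = re.compile(r"\((/docs/[^)#\s]+)\.html(#[^)\s]*)?\)")

def strip_html_suffix(text: str) -> str:
    # Rewrite "(/docs/foo/bar.html#frag)" to "(/docs/foo/bar#frag)".
    return HTML_LINK.sub(lambda m: f"({m.group(1)}{m.group(2) or ''})", text)

def rewrite_docs(root: str = "docs") -> None:
    for path in Path(root).rglob("*.mdx"):
        original = path.read_text(encoding="utf-8")
        updated = strip_html_suffix(original)
        if updated != original:
            path.write_text(updated, encoding="utf-8")
            print(f"rewrote {path}")

if __name__ == "__main__":
    rewrite_docs()
```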

docs/docs/integrations/callbacks/argilla.ipynb
Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@
 "> using both human and machine feedback. We provide support for each step in the MLOps cycle, \n",
 "> from data labeling to model monitoring.\n",
 "\n",
-"<a target=\"_blank\" href=\"https://colab.research.google.com/github/hwchase17/langchain/blob/master/docs/integrations/callbacks/argilla.html\">\n",
+"<a target=\"_blank\" href=\"https://colab.research.google.com/github/hwchase17/langchain/blob/master/docs/integrations/callbacks/argilla\">\n",
 " <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
 "</a>"
 ]

docs/docs/integrations/chat_loaders/langsmith_dataset.ipynb
Lines changed: 2 additions & 2 deletions

@@ -5,7 +5,7 @@
 "id": "a9ab2a39-7c2d-4119-9dc7-8035fdfba3cb",
 "metadata": {},
 "source": [
-"# Fine-Tuning on LangSmith Chat Datasets\n",
+"# LangSmith Chat Datasets\n",
 "\n",
 "This notebook demonstrates an easy way to load a LangSmith chat dataset fine-tune a model on that data.\n",
 "The process is simple and comprises 3 steps.\n",

@@ -271,7 +271,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.11.2"
+"version": "3.9.1"
 }
 },
 "nbformat": 4,

docs/docs/integrations/chat_loaders/langsmith_llm_runs.ipynb
Lines changed: 2 additions & 2 deletions

@@ -5,7 +5,7 @@
 "id": "a9ab2a39-7c2d-4119-9dc7-8035fdfba3cb",
 "metadata": {},
 "source": [
-"# Fine-Tuning on LangSmith LLM Runs\n",
+"# LangSmith LLM Runs\n",
 "\n",
 "This notebook demonstrates how to directly load data from LangSmith's LLM runs and fine-tune a model on that data.\n",
 "The process is simple and comprises 3 steps.\n",

@@ -421,7 +421,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.11.2"
+"version": "3.9.1"
 }
 },
 "nbformat": 4,

docs/docs/integrations/document_loaders/apify_dataset.ipynb
Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@
 "\n",
 "## Prerequisites\n",
 "\n",
-"You need to have an existing dataset on the Apify platform. If you don't have one, please first check out [this notebook](/docs/integrations/tools/apify.html) on how to use Apify to extract content from documentation, knowledge bases, help centers, or blogs."
+"You need to have an existing dataset on the Apify platform. If you don't have one, please first check out [this notebook](/docs/integrations/tools/apify) on how to use Apify to extract content from documentation, knowledge bases, help centers, or blogs."
 ]
 },
 {

docs/docs/integrations/document_loaders/pandas_dataframe.ipynb
Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@
 "source": [
 "# Pandas DataFrame\n",
 "\n",
-"This notebook goes over how to load data from a [pandas](https://pandas.pydata.org/pandas-docs/stable/user_guide/index.html) DataFrame."
+"This notebook goes over how to load data from a [pandas](https://pandas.pydata.org/pandas-docs/stable/user_guide/index) DataFrame."
 ]
 },
 {

docs/docs/integrations/document_loaders/psychic.ipynb
Lines changed: 2 additions & 2 deletions

@@ -5,10 +5,10 @@
 "metadata": {},
 "source": [
 "# Psychic\n",
-"This notebook covers how to load documents from `Psychic`. See [here](/docs/ecosystem/integrations/psychic.html) for more details.\n",
+"This notebook covers how to load documents from `Psychic`. See [here](/docs/ecosystem/integrations/psychic) for more details.\n",
 "\n",
 "## Prerequisites\n",
-"1. Follow the Quick Start section in [this document](/docs/ecosystem/integrations/psychic.html)\n",
+"1. Follow the Quick Start section in [this document](/docs/ecosystem/integrations/psychic)\n",
 "2. Log into the [Psychic dashboard](https://dashboard.psychic.dev/) and get your secret key\n",
 "3. Install the frontend react library into your web app and have a user authenticate a connection. The connection will be created using the connection id that you specify."
 ]
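The notebook diffs above edit Markdown stored inside the JSON `source` fields of `.ipynb` files, which is why each change shows up as an escaped string. A small audit sketch for finding any remaining `.html`-suffixed internal links in notebook Markdown cells, assuming `nbformat` is installed; the starting directory is taken from the file paths in this commit:

```python
import re
from pathlib import Path

import nbformat

# Internal links that still carry a ".html" suffix, as in the hunks above.
HTML_LINK = re.compile(r"\(/docs/[^)#\s]+\.html(?:#[^)\s]*)?\)")

def find_stale_links(root: str = "docs/docs/integrations") -> None:
    for path in Path(root).rglob("*.ipynb"):
        nb = nbformat.read(str(path), as_version=4)
        for index, cell in enumerate(nb.cells):
            if cell.cell_type != "markdown":
                continue
            for match in HTML_LINK.finditer(cell.source):
                print(f"{path} cell {index}: {match.group(0)}")

if __name__ == "__main__":
    find_stale_links()
```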
