Commit 30139c4: update

leo-gan committed Aug 13, 2024
1 parent 089f5e6 commit 30139c4
Showing 11 changed files with 250 additions and 76 deletions.
72 changes: 72 additions & 0 deletions docs/docs/integrations/llms/rwkv.mdx
@@ -0,0 +1,72 @@
# RWKV-4

>[RWKV](https://www.rwkv.com/) (pronounced RwaKuv) language model is an RNN
> with GPT-level LLM performance,
> and it can also be directly trained like a GPT transformer (parallelizable).
>
>It combines the best of RNNs and transformers: great performance, fast inference,
> fast training, VRAM savings, "infinite" context length, and free text embedding.
> Moreover, it's 100% attention-free and an LFAI project.

## Installation and Setup

- Install the Python `rwkv` and `tokenizer` packages

```bash
pip install rwkv tokenizer
```

- Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
- Download a [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)

### Recommended VRAM for RWKV-4 models

| Model | 8bit | bf16/fp16 | fp32 |
|-------|------|-----------|------|
| 14B | 16GB | 28GB | >50GB |
| 7B | 8GB | 14GB | 28GB |
| 3B | 2.8GB| 6GB | 12GB |
| 1b5 | 1.3GB| 3GB | 6GB |

See the [rwkv pip](https://pypi.org/project/rwkv/) page for more information about strategies,
including streaming and CUDA support.
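The VRAM table above can be encoded programmatically to pick a model size for a given memory budget. A minimal sketch — the helper and dictionary below are illustrative and not part of the `rwkv` package:

```python
from typing import Optional

# Approximate VRAM requirements in GB, taken from the table above.
# fp32 for 14B is listed as ">50GB"; 50.0 is used as a lower bound here.
VRAM_GB = {
    "14B": {"8bit": 16.0, "bf16/fp16": 28.0, "fp32": 50.0},
    "7B":  {"8bit": 8.0,  "bf16/fp16": 14.0, "fp32": 28.0},
    "3B":  {"8bit": 2.8,  "bf16/fp16": 6.0,  "fp32": 12.0},
    "1b5": {"8bit": 1.3,  "bf16/fp16": 3.0,  "fp32": 6.0},
}

def largest_model_that_fits(available_gb: float,
                            precision: str = "bf16/fp16") -> Optional[str]:
    """Return the largest model whose estimated VRAM need fits the budget."""
    # The dict above is ordered largest to smallest, so the first hit wins.
    for model, needs in VRAM_GB.items():
        if needs[precision] <= available_gb:
            return model
    return None

print(largest_model_that_fits(8.0))           # -> "3B" (fits 8 GB at fp16)
print(largest_model_that_fits(24.0, "8bit"))  # -> "14B" (fits 24 GB at 8-bit)
```

The actual footprint also depends on the `strategy` string you pass to the model (see the [rwkv pip](https://pypi.org/project/rwkv/) page), so treat these numbers as rough guidance.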

## Usage

### RWKV

To use the RWKV wrapper, you need to provide the path to the pre-trained model file and the tokenizer's configuration.
```python
from langchain_community.llms import RWKV

# Test the model

def generate_prompt(instruction, input=None):
    if input:
        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Input:
{input}

# Response:
"""
    else:
        return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Response:
"""


model = RWKV(model="./models/RWKV-4-Raven-3B-v7-Eng-20230404-ctx4096.pth", strategy="cpu fp32", tokens_path="./rwkv/20B_tokenizer.json")
response = model.invoke(generate_prompt("Once upon a time, "))
```
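The prompt builder can be exercised on its own to see exactly what the wrapper receives — a standalone sketch that needs no model weights:

```python
# Standalone copy of the Alpaca-style prompt builder shown above,
# runnable without loading an RWKV model.
def generate_prompt(instruction, input=None):
    if input:
        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Input:
{input}

# Response:
"""
    return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Response:
"""

# With an input, the prompt gains an "# Input:" section; without one it doesn't.
print(generate_prompt("Summarize the plot", input="Once upon a time..."))
print(generate_prompt("Write a haiku about rivers"))
```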

41 changes: 18 additions & 23 deletions docs/docs/integrations/platforms/microsoft.mdx
@@ -237,6 +237,8 @@ See a [usage example](/docs/integrations/document_loaders/microsoft_onenote).
from langchain_community.document_loaders.onenote import OneNoteLoader
```

## Vectorstores

### Playwright URL Loader

>[Playwright](https://github.com/microsoft/playwright) is an open-source automation tool
@@ -271,8 +273,6 @@ Below are two available Azure Cosmos DB APIs that can provide vector store functionality.
> You can apply your MongoDB experience and continue to use your favorite MongoDB drivers, SDKs, and tools by pointing your application to the API for MongoDB vCore account's connection string.
> Use vector search in Azure Cosmos DB for MongoDB vCore to seamlessly integrate your AI-based applications with your data that's stored in Azure Cosmos DB.
#### Installation and Setup

See [detail configuration instructions](/docs/integrations/vectorstores/azure_cosmos_db).

We need to install the `pymongo` python package.
@@ -281,14 +281,6 @@ We need to install `pymongo` python package.
pip install pymongo
```

#### Deploy Azure Cosmos DB on Microsoft Azure

Azure Cosmos DB for MongoDB vCore provides developers with a fully managed MongoDB-compatible database service for building modern applications with a familiar architecture.

With Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership (TCO), and the familiar vCore architecture when migrating existing applications or building new ones.

[Sign Up](https://azure.microsoft.com/en-us/free/) for free to get started today.

See a [usage example](/docs/integrations/vectorstores/azure_cosmos_db).

```python
@@ -299,12 +291,7 @@ from langchain_community.vectorstores import AzureCosmosDBVectorSearch

>[Azure Cosmos DB for NoSQL](https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/vector-search) now offers vector indexing and search in preview.
This feature is designed to handle high-dimensional vectors, enabling efficient and accurate vector search at any scale. You can now store vectors
directly in the documents alongside your data. This means that each document in your database can contain not only traditional schema-free data,
but also high-dimensional vectors as other properties of the documents. This colocation of data and vectors allows for efficient indexing and searching,
as the vectors are stored in the same logical unit as the data they represent. This simplifies data management, AI application architectures, and the
efficiency of vector-based operations.

#### Installation and Setup

See [detail configuration instructions](/docs/integrations/vectorstores/azure_cosmos_db_no_sql).

@@ -314,20 +301,14 @@ We need to install `azure-cosmos` python package.
pip install azure-cosmos
```

#### Deploy Azure Cosmos DB on Microsoft Azure

Azure Cosmos DB offers a solution for modern apps and intelligent workloads by being very responsive with dynamic and elastic autoscale. It is available
in every Azure region and can automatically replicate data closer to users. It has SLA guaranteed low-latency and high availability.

[Sign Up](https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/quickstart-python?pivots=devcontainer-codespace) for free to get started today.

See a [usage example](/docs/integrations/vectorstores/azure_cosmos_db_no_sql).

```python
from langchain_community.vectorstores import AzureCosmosDBNoSQLVectorSearch
```

## Retrievers

### Azure AI Search

>[Azure AI Search](https://learn.microsoft.com/en-us/azure/search/search-what-is-azure-search) (formerly known as `Azure Search` or `Azure Cognitive Search` ) is a cloud search service that gives developers infrastructure, APIs, and tools for building a rich search experience over private, heterogeneous content in web, mobile, and enterprise applications.
@@ -445,6 +426,20 @@ See a [usage example](/docs/integrations/tools/playwright).
from langchain_community.agent_toolkits import PlayWrightBrowserToolkit
```

## Memory

### Azure CosmosDB Chat Message History

We need to install the `azure-cosmos` python package.

```bash
pip install azure-cosmos
```

```python
from langchain_community.chat_message_histories import CosmosDBChatMessageHistory
```

## Graphs

### Azure Cosmos DB for Apache Gremlin
8 changes: 8 additions & 0 deletions docs/docs/integrations/platforms/openai.mdx
Expand Up @@ -92,6 +92,14 @@ See a [usage example](/docs/integrations/tools/dalle_image_generator).
from langchain_community.utilities.dalle_image_generator import DallEAPIWrapper
```

### ChatGPT Plugins

See a [usage example](/docs/integrations/tools/chatgpt_plugins).

```python
from langchain_community.tools import AIPluginTool
```

## Adapter

See a [usage example](/docs/integrations/adapters/openai).
24 changes: 24 additions & 0 deletions docs/docs/integrations/providers/aerospike.mdx
@@ -0,0 +1,24 @@
# Aerospike

>[Aerospike Vector Search](https://aerospike.com/docs/vector) (AVS) is an extension to
> the `Aerospike Database` that enables searches across very large datasets stored in `Aerospike`.
> This new service lives outside of `Aerospike` and builds an index to perform those searches.

## Installation and Setup

You need to have a running `AVS` instance. Use one of the [installation methods](https://aerospike.com/docs/vector/install).

You need to install the `aerospike-vector-search` python package.

```bash
pip install aerospike-vector-search
```

## Vectorstore

See a [usage example](/docs/integrations/vectorstores/aerospike).

```python
from langchain_community.vectorstores import Aerospike
```
7 changes: 7 additions & 0 deletions docs/docs/integrations/providers/ai21.mdx
@@ -34,6 +34,13 @@ serving as a context, and a question and return an answer based entirely on this
from langchain_ai21 import AI21ContextualAnswers
```

### AI21 Community

```python
from langchain_community.llms import AI21
```



## Chat models

16 changes: 16 additions & 0 deletions docs/docs/integrations/providers/ainetwork.mdx
@@ -13,6 +13,22 @@ You need to install `ain-py` python package.
pip install ain-py
```
You need to set the `AIN_BLOCKCHAIN_ACCOUNT_PRIVATE_KEY` environment variable to your AIN Blockchain Account Private Key.

## Tools

Tools that help you interact with the `AINetwork` blockchain. They are all included
in the `AINetworkToolkit` toolkit.

See a [usage example](/docs/integrations/toolkits/ainetwork).

```python
from langchain_community.tools import AINAppOps
from langchain_community.tools import AINOwnerOps
from langchain_community.tools import AINRuleOps
from langchain_community.tools import AINTransfer
from langchain_community.tools import AINValueOps
```

## Toolkit

See a [usage example](/docs/integrations/tools/ainetwork).
34 changes: 34 additions & 0 deletions docs/docs/integrations/providers/amadeus.mdx
@@ -0,0 +1,34 @@
# Amadeus

>[Amadeus Travel APIs](https://developers.amadeus.com/) provide instant access to over 400 airlines, 150,000 hotels, and 300,000 tours & activities.

## Installation and Setup

To use the `Amadeus` integration, you need to have an `API key` from `Amadeus`.
See [instructions here](https://developers.amadeus.com/get-started/get-started-with-self-service-apis-335).

We have to install the `amadeus` python package:

```bash
pip install amadeus
```

## Tools

Tools that help you interact with the `Amadeus travel APIs`. They are all included
in the `Amadeus` toolkit.

See a [usage example](/docs/integrations/toolkits/amadeus).

```python
from langchain_community.tools.amadeus import AmadeusClosestAirport
from langchain_community.tools.amadeus import AmadeusFlightSearch
```

## Toolkit

See a [usage example](/docs/integrations/toolkits/amadeus).

```python
from langchain_community.agent_toolkits.amadeus.toolkit import AmadeusToolkit
```
27 changes: 27 additions & 0 deletions docs/docs/integrations/providers/mongodb_motor.mdx
@@ -0,0 +1,27 @@
# MongoDB Motor

>[MongoDB](https://www.mongodb.com/) is a source-available, cross-platform, document-oriented
> database program. Classified as a `NoSQL` database product, `MongoDB` utilizes JSON-like
> documents with optional schemas.
>
> [Motor](https://pypi.org/project/motor/) is a full-featured, non-blocking `MongoDB` driver
> for Python `asyncio` and `Tornado` applications. `Motor` presents a coroutine-based
> API for non-blocking access to MongoDB.

## Installation and Setup

We need to set up the configuration parameters for the MongoDB database. See instructions [here](/docs/integrations/document_loaders/mongodb/).

We also need to install the `motor` python package.

```bash
pip install motor
```

## Document Loader

See a [usage example](/docs/integrations/document_loaders/mongodb/).

```python
from langchain_community.document_loaders.mongodb import MongodbLoader
```
66 changes: 13 additions & 53 deletions docs/docs/integrations/providers/rwkv.mdx
```diff
@@ -1,65 +1,25 @@
 # RWKV-4
 
-This page covers how to use the `RWKV-4` wrapper within LangChain.
-It is broken into two parts: installation and setup, and then usage with an example.
+>[RWKV](https://www.rwkv.com/) (pronounced RwaKuv) language model is an RNN
+> with GPT-level LLM performance,
+> and it can also be directly trained like a GPT transformer (parallelizable).
 
 ## Installation and Setup
-- Install the Python package with `pip install rwkv`
-- Install the tokenizer Python package with `pip install tokenizer`
-- Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
-- Download the [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)
 
-## Usage
+- Install the Python `rwkv` and `tokenizer` packages
 
-### RWKV
+```bash
+pip install rwkv tokenizer
+```
 
-To use the RWKV wrapper, you need to provide the path to the pre-trained model file and the tokenizer's configuration.
-
-```python
-from langchain_community.llms import RWKV
-
-# Test the model
-
-def generate_prompt(instruction, input=None):
-    if input:
-        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
-
-# Instruction:
-{instruction}
-
-# Input:
-{input}
-
-# Response:
-"""
-    else:
-        return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.
-
-# Instruction:
-{instruction}
-
-# Response:
-"""
-
-model = RWKV(model="./models/RWKV-4-Raven-3B-v7-Eng-20230404-ctx4096.pth", strategy="cpu fp32", tokens_path="./rwkv/20B_tokenizer.json")
-response = model.invoke(generate_prompt("Once upon a time, "))
-```
-
-## Model File
+- Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
+- Download a [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)
 
-You can find links to model file downloads at the [RWKV-4-Raven](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) repository.
+## LLMs
 
-### Rwkv-4 models -> recommended VRAM
+### RWKV
 
-```
-RWKV VRAM
-Model | 8bit | bf16/fp16 | fp32
-14B   | 16GB | 28GB      | >50GB
-7B    | 8GB  | 14GB      | 28GB
-3B    | 2.8GB| 6GB       | 12GB
-1b5   | 1.3GB| 3GB       | 6GB
-```
+See a [usage example](/docs/integrations/llms/rwkv).
 
-See the [rwkv pip](https://pypi.org/project/rwkv/) page for more information about strategies, including streaming and cuda support.
+```python
+from langchain_community.llms import RWKV
+```
```