
Commit: Rename
jaredpalmer committed Jun 1, 2023
1 parent ae1d117 commit 161587e
Showing 21 changed files with 5,052 additions and 1,768 deletions.
10 changes: 5 additions & 5 deletions README.md
@@ -1,6 +1,6 @@
-# AI Utils
+# AI Connector

-AI Utils is **a compact library for building edge-rendered AI-powered streaming text and chat UIs**.
+AI Connector is **a compact library for building edge-rendered AI-powered streaming text and chat UIs**.

## Features

@@ -12,15 +12,15 @@ AI Utils is **a compact library for building edge-rendered AI-powered streaming
## Quick Start

```sh
-pnpm install @vercel/ai-utils
+pnpm install ai-connector
```

## Usage

```tsx
// ./app/api/chat/route.ts
import { Configuration, OpenAIApi } from 'openai-edge'
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

const config = new Configuration({
apiKey: process.env.OPENAI_API_KEY
@@ -44,7 +44,7 @@ export async function POST() {
// ./app/page.tsx
'use client'

-import { useChat } from '@vercel/ai-utils'
+import { useChat } from 'ai-connector'

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat()
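The README's route handler above ends by wrapping the stream in `StreamingTextResponse`. As orientation only, a helper of that shape can be sketched as a `Response` subclass with streaming-friendly defaults; the name `StreamingTextResponseSketch` and all details here are assumptions, not the package's actual source:

```typescript
// Sketch only: a Response subclass that pre-sets the plain-text streaming
// header the docs mention ('text/plain; charset=utf-8'). Works in any
// runtime with WHATWG fetch globals (Node 18+, Edge runtimes).
class StreamingTextResponseSketch extends Response {
  constructor(stream: ReadableStream, init?: ResponseInit) {
    super(stream, {
      ...init,
      status: init?.status ?? 200,
      headers: {
        'Content-Type': 'text/plain; charset=utf-8',
        ...(init?.headers as Record<string, string> | undefined)
      }
    });
  }
}
```

A route handler would then `return new StreamingTextResponseSketch(stream)` rather than assembling a `Response` and its headers by hand.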
2 changes: 1 addition & 1 deletion apps/docs/package.json
@@ -10,7 +10,7 @@
"start": "next start "
},
"dependencies": {
-"@vercel/ai-utils": "workspace:*",
+"ai-connector": "workspace:*",
"@vercel/analytics": "^1.0.1",
"cobe": "^0.6.3",
"next": "13.4.4-canary.9",
14 changes: 7 additions & 7 deletions apps/docs/pages/api-reference.mdx
@@ -11,7 +11,7 @@ An SWR-powered React hook for streaming chat messages and handling chat and prom
```tsx filename="app/chat.tsx"
'use client'

-import { useChat } from '@vercel/ai-utils'
+import { useChat } from 'ai-connector'

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat()
@@ -58,7 +58,7 @@ An SWR-powered React hook for streaming text completion and handling prompt inpu
```tsx filename="app/completion.tsx"
'use client'

-import { useCompletion } from '@vercel/ai-utils'
+import { useCompletion } from 'ai-connector'

export default function Completion() {
const {
@@ -111,7 +111,7 @@ A transform that will extract the text from all chat and completion OpenAI model

```tsx filename="app/api/chat/route.ts"
import { Configuration, OpenAIApi } from 'openai-edge'
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

const config = new Configuration({
apiKey: process.env.OPENAI_API_KEY
@@ -149,7 +149,7 @@ It expects the iterable `AsyncGenerator` from HuggingFace Inference SDK's `hf.te

```tsx filename="app/api/chat/route.ts"
import { HfInference } from '@huggingface/inference'
-import { HuggingFaceStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { HuggingFaceStream, StreamingTextResponse } from 'ai-connector'

export const runtime = 'edge'

@@ -179,10 +179,10 @@ Returns a `stream` and bag of [LangChain](js.langchain.com/docs) `BaseCallbackHa

#### Example

-Here is a reference implementation of a chat endpoint that uses both AI Utils and LangChain together with Next.js App Router
+Here is a reference implementation of a chat endpoint that uses both AI Connector and LangChain together with Next.js App Router

```tsx filename="app/api/chat/route.ts"
-import { StreamingTextResponse, LangChainStream } from '@vercel/ai-utils'
+import { StreamingTextResponse, LangChainStream } from 'ai-connector'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'
import { CallbackManager } from 'langchain/callbacks'
@@ -218,7 +218,7 @@ This is a tiny wrapper around `Response` class that makes returning `ReadableStr

```tsx
// app/api/generate/route.ts
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

export const runtime = 'edge'

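The `HuggingFaceStream` hunks above note that it consumes the iterable `AsyncGenerator` from the HuggingFace Inference SDK. Conceptually that means adapting an async iterable of text into a `ReadableStream` of bytes. A minimal sketch of such an adapter (`iterableToStream` is a hypothetical name, not a library export):

```typescript
// Sketch: turn any async iterable of text chunks into the byte
// ReadableStream shape that a streaming Response body expects.
function iterableToStream(iter: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      // Drain the iterable, encoding each text chunk as UTF-8 bytes.
      for await (const chunk of iter) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    }
  });
}
```

The real adapter presumably also strips provider-specific token framing; this sketch only shows the iterable-to-stream bridge.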
4 changes: 2 additions & 2 deletions apps/docs/pages/api/og.jsx
@@ -300,7 +300,7 @@ export default async function handler(request) {
letterSpacing: '-0.03em'
}}
>
-<b>{title || 'Vercel AI Utils'}</b>
+<b>{title || 'Vercel AI Connector'}</b>
</div>
{title ? (
<div
@@ -310,7 +310,7 @@
marginTop: 20
}}
>
-<b>Vercel AI Utils</b>
+<b>Vercel AI Connector</b>
</div>
) : null}
</div>
16 changes: 8 additions & 8 deletions apps/docs/pages/getting-started.mdx
@@ -11,21 +11,21 @@ Inside your Next.js project directory, run the following:
<Tab>

```bash
-pnpm add @vercel/ai-utils
+pnpm add ai-connector
```

</Tab>
<Tab>

```bash
-npm i @vercel/ai-utils
+npm i ai-connector
```

</Tab>
<Tab>

```bash
-yarn add @vercel/ai-utils
+yarn add ai-connector
```

</Tab>
@@ -39,12 +39,12 @@ For this tutorial, we'll build a streaming AI chatbot app with OpenAI's `gpt-3.5

### Create a Next.js app

-Create a Next.js application and install `@vercel/ai-utils` and `openai-edge`. We currently prefer the latter `openai-edge` library over the official OpenAI SDK because the official SDK uses `axios` which is not compatible with Vercel Edge Functions.
+Create a Next.js application and install `ai-connector` and `openai-edge`. We currently prefer the latter `openai-edge` library over the official OpenAI SDK because the official SDK uses `axios` which is not compatible with Vercel Edge Functions.

```sh
pnpx create-next-app my-ai-app
cd my-ai-app
-pnpm install @vercel/ai-utils openai-edge
+pnpm install ai-connector openai-edge
```

### Add your OpenAI API Key to `.env`
@@ -65,7 +65,7 @@ Create a Next.js Route Handler that uses the Edge Runtime that we'll use to gene

```tsx filename="app/api/chat/route.ts" showLineNumbers
import { Configuration, OpenAIApi } from 'openai-edge'
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

// Create an OpenAI API client (that's edge friendly!)
const config = new Configuration({
@@ -93,7 +93,7 @@ export async function POST(req: Request) {
}
```

-Vercel AI Utils provides 2 utility helpers to make the above seamless: First, we pass the streaming `response` we receive from OpenAI to `OpenAIStream`. This method decodes/extracts the text tokens in the response and then re-encodes them properly for simple consumption. We can then pass that new stream directly to `StreamingTextResponse`. This is another utility class that extends the normal Node/Edge Runtime `Response` class with the default headers you probably want (hint: `'Content-Type': 'text/plain; charset=utf-8'` is already set for you).
+Vercel AI Connector provides 2 utility helpers to make the above seamless: First, we pass the streaming `response` we receive from OpenAI to `OpenAIStream`. This method decodes/extracts the text tokens in the response and then re-encodes them properly for simple consumption. We can then pass that new stream directly to `StreamingTextResponse`. This is another utility class that extends the normal Node/Edge Runtime `Response` class with the default headers you probably want (hint: `'Content-Type': 'text/plain; charset=utf-8'` is already set for you).

### Wire up the UI

@@ -103,7 +103,7 @@ By default, the `useChat` hook will use the `POST` Route Handler we created abov
```tsx filename="app/page.tsx" showLineNumbers
'use client'

-import { useChat } from '@vercel/ai-utils'
+import { useChat } from 'ai-connector'

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat()
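For intuition about the `useChat` hook wired up in the getting-started changes above: on the client it boils down to POSTing the messages and incrementally decoding the streamed response body into UI state. A hedged sketch of that core loop, leaving out the hook's actual SWR-based state management (`consumeTextStream` is an illustrative name, not an export):

```typescript
// Sketch: read a streaming text Response chunk by chunk, invoking a
// callback with the accumulated text after each chunk, which is roughly
// the signal a chat UI needs to re-render the growing assistant message.
async function consumeTextStream(
  res: Response,
  onUpdate: (textSoFar: string) => void
): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
    onUpdate(text);
  }
  return text;
}
```

In a React component, `onUpdate` would feed a `setState` call; the hook additionally tracks input state and message history.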
2 changes: 1 addition & 1 deletion apps/docs/pages/guides.mdx
@@ -1,6 +1,6 @@
# Guides

-Vercel AI Utils is compatible many popular AI/model providers. This section contains guides for using Vercel AI Utils with these providers inside of Next.js.
+Vercel AI Connector is compatible many popular AI/model providers. This section contains guides for using Vercel AI Connector with these providers inside of Next.js.

- [OpenAI](./guides/openai)
- [Anthropic](./guides/anthropic)
6 changes: 3 additions & 3 deletions apps/docs/pages/guides/langchain.mdx
@@ -9,10 +9,10 @@ However, LangChain does not provide a way to easily build UIs or a standard way

## Example

-Here is an example implementation of a chat application that uses both AI Utils and LangChain's [OpenAIChat](https://js.langchain.com/docs/api/llms_openai/classes/OpenAIChat) together with [Next.js](https://nextjs.org/docs) App Router. It uses AI Utils' [`LangChainStream`](../api-reference#langchainstream) to stream text to the client (from the edge) and then AI Utils' `useChat` to handle the chat UI.
+Here is an example implementation of a chat application that uses both AI Connector and LangChain's [OpenAIChat](https://js.langchain.com/docs/api/llms_openai/classes/OpenAIChat) together with [Next.js](https://nextjs.org/docs) App Router. It uses AI Connector's [`LangChainStream`](../api-reference#langchainstream) to stream text to the client (from the edge) and then AI Connector's `useChat` to handle the chat UI.

```tsx filename="app/api/chat/route.ts" {1,10,27}
-import { StreamingTextResponse, LangChainStream } from '@vercel/ai-utils'
+import { StreamingTextResponse, LangChainStream } from 'ai-connector'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'
import { CallbackManager } from 'langchain/callbacks'
@@ -52,7 +52,7 @@ The result is that AI Util's [`useChat`](../api-reference#usechat) and [`useComp
```tsx filename="app/page.tsx"
'use client'

-import { useChat } from '@vercel/ai-utils'
+import { useChat } from 'ai-connector'

export default function Chat() {
const { messages, input, isLoading, handleInputChange, handleSubmit } =
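`LangChainStream`, as used in the LangChain guide above, bridges LangChain's callback handlers into a `ReadableStream` that the response can carry. A simplified, hypothetical sketch of that bridge follows; the factory name and return shape are assumptions, though `handleLLMNewToken`, `handleLLMEnd`, and `handleLLMError` mirror LangChain's documented callback names:

```typescript
// Sketch: expose a stream plus callback handlers. Each new token from the
// model run is enqueued into the stream; the end-of-run callback closes it.
function makeCallbackStream() {
  const encoder = new TextEncoder();
  let controller!: ReadableStreamDefaultController<Uint8Array>;
  const stream = new ReadableStream<Uint8Array>({
    // start() runs synchronously, so `controller` is set before we return.
    start(c) {
      controller = c;
    }
  });
  return {
    stream,
    handlers: {
      handleLLMNewToken: async (token: string) => {
        controller.enqueue(encoder.encode(token));
      },
      handleLLMEnd: async () => {
        controller.close();
      },
      handleLLMError: async (e: Error) => {
        controller.error(e);
      }
    }
  };
}
```

The route handler hands `handlers` to LangChain's callback manager and returns `stream` in the response, which is why the model call does not need to be awaited before responding.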
37 changes: 21 additions & 16 deletions apps/docs/pages/guides/openai.mdx
@@ -2,20 +2,20 @@ import { Steps } from 'nextra-theme-docs'

# OpenAI

-Vercel AI Utils provides a set of utilities to make it easy to use OpenAI's API. In this guide, we'll walk through how to use the utilities to create a chat bot and a text completion app.
+Vercel AI Connector provides a set of utilities to make it easy to use OpenAI's API. In this guide, we'll walk through how to use the utilities to create a chat bot and a text completion app.

## Guide: Chat Bot

<Steps>

### Create a Next.js app

-Create a Next.js application and install `@vercel/ai-utils` and `openai-edge`. We currently prefer the latter `openai-edge` library over the official OpenAI SDK because the official SDK uses `axios` which is not compatible with Vercel Edge Functions.
+Create a Next.js application and install `ai-connector` and `openai-edge`. We currently prefer the latter `openai-edge` library over the official OpenAI SDK because the official SDK uses `axios` which is not compatible with Vercel Edge Functions.

```sh
pnpx create-next-app my-ai-app
cd my-ai-app
-pnpm install @vercel/ai-utils openai-edge
+pnpm install ai-connector openai-edge
```

### Add your OpenAI API Key to `.env`
@@ -34,7 +34,7 @@ For this example, we'll create a route handler at `app/api/chat/route.ts` that a

```tsx filename="app/api/chat/route.ts" showLineNumbers
import { Configuration, OpenAIApi } from 'openai-edge'
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

// Create an OpenAI API client (that's edge friendly!)
const config = new Configuration({
@@ -62,7 +62,7 @@ export async function POST(req: Request) {
}
```

-Vercel AI Utils provides 2 utility helpers to make the above seamless: First, we pass the streaming `response` we receive from OpenAI to [`OpenAIStream`](/api-reference#openaistream). This method decodes/extracts the text tokens in the response and then re-encodes them properly for simple consumption. We can then pass that new stream directly to [`StreamingTextResponse`](/api-reference#streamingtextresponse). This is another utility class that extends the normal Node/Edge Runtime `Response` class with the default headers you probably want (hint: `'Content-Type': 'text/plain; charset=utf-8'` is already set for you).
+Vercel AI Connector provides 2 utility helpers to make the above seamless: First, we pass the streaming `response` we receive from OpenAI to [`OpenAIStream`](/api-reference#openaistream). This method decodes/extracts the text tokens in the response and then re-encodes them properly for simple consumption. We can then pass that new stream directly to [`StreamingTextResponse`](/api-reference#streamingtextresponse). This is another utility class that extends the normal Node/Edge Runtime `Response` class with the default headers you probably want (hint: `'Content-Type': 'text/plain; charset=utf-8'` is already set for you).

### Wire up the UI

@@ -72,7 +72,7 @@ By default, the [`useChat`](/api-reference#usechat) hook will use the `POST` Rou
```tsx filename="app/page.tsx" showLineNumbers
'use client'

-import { useChat } from '@vercel/ai-utils'
+import { useChat } from 'ai-connector'

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat()
@@ -113,7 +113,7 @@ Similar to the Chat Bot example above, we'll create a Next.js Route Handler that

```tsx filename="app/api/completion/route.ts" showLineNumbers
import { Configuration, OpenAIApi } from 'openai-edge'
-import { OpenAIStream, StreamingTextResponse } from '@vercel/ai-utils'
+import { OpenAIStream, StreamingTextResponse } from 'ai-connector'

// Create an OpenAI API client (that's edge friendly!)
const config = new Configuration({
@@ -149,15 +149,21 @@ We can use the [`useCompletion`](/api-reference#usecompletion) hook to make it e

```tsx filename="app/page.tsx" showLineNumbers
'use client'
-import { useCompletion } from '@vercel/ai-utils'
+
+import { useCompletion } from 'ai-connector'

export default function Completion() {
-const { completion, input, stop, isLoading, handleInputChange, handleSubmit } =
-  useCompletion({
-    api: '/api/completion'
-  })
+const {
+  completion,
+  input,
+  stop,
+  isLoading,
+  handleInputChange,
+  handleSubmit
+} = useCompletion({
+  api: '/api/completion'
+})

return (
<div className="mx-auto w-full max-w-md py-24 flex flex-col stretch">
<form onSubmit={handleSubmit}>
@@ -187,7 +193,6 @@ export default function Completion() {
It’s common to want to save the result of a completion to a database after streaming it back to the user. The `OpenAIStream` adapter accepts a couple of optional callbacks that can be used to do this.

```tsx filename="app/api/completion/route.ts" showLineNumbers

export async function POST(req: Request) {
// ...

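The optional callbacks mentioned above, for saving a completion to a database after streaming it back, can be pictured as a pass-through `TransformStream` that accumulates the streamed text and fires a callback once the stream finishes. A sketch under that assumption (`withCompletionCallback` is a hypothetical helper, not the adapter's real API):

```typescript
// Sketch: a pass-through transform that forwards every chunk unchanged
// while accumulating the full completion text, then invokes onCompletion
// (e.g. a database write) at flush time, after the stream ends.
function withCompletionCallback(
  onCompletion: (full: string) => void | Promise<void>
): TransformStream<Uint8Array, Uint8Array> {
  const decoder = new TextDecoder();
  let full = '';
  return new TransformStream({
    transform(chunk, controller) {
      full += decoder.decode(chunk, { stream: true });
      controller.enqueue(chunk);
    },
    async flush() {
      await onCompletion(full);
    }
  });
}
```

Because `flush` runs only after the upstream closes, the user still receives tokens immediately while the persistence step waits for the complete text.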
