📝 docs: improve ollama docs
arvinxx committed Feb 22, 2024
1 parent 6827d20 · commit 6485db7
Showing 4 changed files with 12 additions and 4 deletions.
2 changes: 1 addition & 1 deletion docs/self-hosting/examples/ollama.mdx
@@ -4,7 +4,7 @@ Ollama is a powerful framework for running large language models (LLMs) locally,

This document will guide you on how to configure and deploy LobeChat to use Ollama:

-## Locally Launching Ollama Service
+## Running Ollama Locally

First, you need to install Ollama. For detailed steps on installing and configuring Ollama, please refer to the [Ollama Website](https://ollama.com).

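For readers following the renamed "Running Ollama Locally" section, a quick way to confirm the local service is up is to query Ollama's HTTP API. The snippet below is only a sketch, not part of this commit; it assumes Ollama's default listen address `http://127.0.0.1:11434` and uses the public `/api/tags` route for listing locally pulled models.

```ts
// Sanity check that a locally running Ollama service is reachable.
// Assumes Ollama's default listen address; adjust if you changed OLLAMA_HOST.
const OLLAMA_HOST = 'http://127.0.0.1:11434';

async function listLocalModels(): Promise<void> {
  // GET /api/tags returns the models that have been pulled locally.
  const res = await fetch(`${OLLAMA_HOST}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama is not reachable at ${OLLAMA_HOST} (HTTP ${res.status})`);
  }
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log('Locally available models:', models.map((m) => m.name));
}

listLocalModels().catch(console.error);
```
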
2 changes: 1 addition & 1 deletion docs/self-hosting/examples/ollama.zh-CN.mdx
@@ -4,7 +4,7 @@ Ollama 是一款强大的本地运行大型语言模型(LLM)的框架,支

本文档将指导你如何配置与部署 LobeChat 来使用 Ollama:

-## 本地启动 Ollama 服务
+## 本地启动 Ollama

首先,你需要安装 Ollama,安装与配置 Ollama 的详细步骤可以参考 [Ollama 官方站点](https://ollama.com)

6 changes: 5 additions & 1 deletion docs/usage/providers/ollama.mdx
@@ -57,7 +57,11 @@ Ollama supports various models, and you can view the available model list in the

Next, you can start conversing with the local LLM using LobeChat.

-<Video src="https://github.com/lobehub/lobe-chat/assets/28616219/063788c8-9fef-4c6b-b837-96668ad6bc41" />
+<Video
+  width={832}
+  height={468}
+  src="https://github.com/lobehub/lobe-chat/assets/28616219/063788c8-9fef-4c6b-b837-96668ad6bc41"
+/>

<Callout type={'info'}>
  You can visit [Integrating with Ollama](/en/self-hosting/examples/ollama) to learn how to deploy
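
The substantive change in this hunk gives the docs' `<Video>` MDX component explicit `width` and `height` props, so the player reserves its box before the asset loads and the page layout does not shift. The component's actual implementation is not part of this diff; purely as a hypothetical TypeScript sketch (prop names other than `src`, `width`, and `height` are assumed), it might wrap a plain `<video>` element like this:

```tsx
// Hypothetical sketch only — the real <Video> component in the docs toolchain
// is not shown in this diff; defaults mirror the values used in this commit.
import type { FC } from 'react';

interface VideoProps {
  src: string;      // URL of the video asset (e.g. a GitHub user-attachment)
  width?: number;   // intrinsic width in px; declaring it avoids layout shift
  height?: number;  // intrinsic height in px
}

const Video: FC<VideoProps> = ({ src, width = 832, height = 468 }) => (
  <video controls width={width} height={height} style={{ maxWidth: '100%' }}>
    <source src={src} />
  </video>
);

export default Video;
```
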
6 changes: 5 additions & 1 deletion docs/usage/providers/ollama.zh-CN.mdx
@@ -57,7 +57,11 @@ Ollama 支持多种模型,你可以在 [Ollama Library](https://ollama.com/lib

接下来,你就可以使用 LobeChat 与本地 LLM 对话了。

-<Video src="https://github.com/lobehub/lobe-chat/assets/28616219/95828c11-0ae5-4dfa-84ed-854124e927a6" />
+<Video
+  width={832}
+  height={468}
+  src="https://github.com/lobehub/lobe-chat/assets/28616219/95828c11-0ae5-4dfa-84ed-854124e927a6"
+/>

<Callout type={'info'}>
  你可以前往 [与 Ollama 集成](/zh/self-hosting/examples/ollama) 了解如何部署 LobeChat ,以满足与
