
Commit e16377c — Create LLM_related.md (1 parent fc79257)

LLM_related.md

## Ollama with qwen3-coder

- Download install.sh
- Edit OLLAMA_INSTALL_DIR in the script to point to a local SW folder
- Run the script to install
- export OLLAMA_MODELS=/..../ollama/models
- If OLLAMA_MODELS is not defined, ~/.ollama/models will be used
- terminal 1: ollama serve
- terminal 2: ollama run qwen3-coder
- This downloads the model on first use and starts an interactive session (see the sketch below)

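A minimal shell sketch of the steps above. The paths under $HOME/sw are placeholder assumptions, not values from the original note; adjust them to your own local software folder.

```
# 1. Edit OLLAMA_INSTALL_DIR inside the downloaded install.sh, e.g.:
#      OLLAMA_INSTALL_DIR=$HOME/sw/ollama          # assumed path
sh install.sh

# 2. Optionally point Ollama at a non-default model directory
#    (if unset, ~/.ollama/models is used).
export OLLAMA_MODELS="$HOME/sw/ollama/models"      # assumed path

# 3. Terminal 1: start the server.
ollama serve

# 4. Terminal 2: pull the model on first use and open an interactive session.
ollama run qwen3-coder
```
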
## VS Code coupling with Ollama + qwen2.5-coder

- qwen3-coder is not supported yet (Aug 2025)
- Install the Continue extension (continue.vsix) in VS Code
- Make sure OLLAMA_MODELS is defined if the models are located in a non-default location
- terminal 1: ollama serve
- terminal 2: ollama list  # confirm the models are detected
- Edit ~/.continue/config.yaml
```
models:
  - name: Qwen2.5-Coder
    provider: ollama
    model: qwen2.5-coder:latest
```
- Restart VS Code so the Continue extension picks up the config (a quick sanity check follows below)
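As an optional sanity check (not part of the original note), you can confirm that the Ollama server Continue will talk to is reachable and that the configured model is installed. Ollama serves its HTTP API on port 11434 by default:

```
# List installed models via the local Ollama API and look for qwen2.5-coder.
curl -s http://localhost:11434/api/tags | grep qwen2.5-coder
```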
