## Ollama with qwen3-coder

- Download install.sh
- Edit OLLAMA_INSTALL_DIR in the script to point at a local SW folder
- Run the script to install
- export OLLAMA_MODELS=/..../ollama/models
- If OLLAMA_MODELS is not defined, ~/.ollama/models will be used
- terminal 1: ollama serve
- terminal 2: ollama run qwen3-coder
- This will download the model and run it

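Once `ollama serve` is running, the model can also be exercised programmatically instead of through `ollama run`. A minimal sketch using Ollama's HTTP API, assuming the default address `localhost:11434` (the endpoint name and payload fields are from Ollama's REST API; verify against your installed version):

```python
import json
from urllib import request

# Default address used by `ollama serve`
OLLAMA_URL = "http://localhost:11434"

def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial chunks.
    """
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

def generate(model: str, prompt: str) -> str:
    """Send one prompt to a locally running Ollama server."""
    req = request.Request(
        OLLAMA_URL + "/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
#   print(generate("qwen3-coder", "Write a haiku about code."))
```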
11
## VS Code coupling with ollama + qwen2.5-coder

- Qwen3-coder is not supported yet (Aug 2025)
- Install continue.vsix (the Continue extension) in VS Code
- Make sure OLLAMA_MODELS is defined if the models are in a non-default location
- terminal 1: ollama serve
- terminal 2: ollama list  # check that the models are detected
- Edit ~/.continue/config.yaml
```yaml
models:
  - name: Qwen2.5-Coder
    provider: ollama
    model: qwen2.5-coder:latest
```
- Restart VS Code to load the Continue extension
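The entry above registers one model with default behavior. Continue's YAML config can also restrict a model to specific roles; a hedged sketch (the `roles` values below are assumptions based on Continue's config format, not taken from this document — check them against your installed Continue version):

```yaml
# Hypothetical extended ~/.continue/config.yaml
models:
  - name: Qwen2.5-Coder
    provider: ollama
    model: qwen2.5-coder:latest
    roles:
      - chat
      - edit
      - autocomplete
```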