Labels
area:configuration, ide:vscode, kind:bug, os:linux, priority:medium
Before submitting your bug report
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Linux
- Continue version: 0.8.68
- IDE version: 1.97.0
- Model: deepseek-coder:1.3b
- config:
{
  "models": [
    {
      "apiBase": "https://41245a9c.r8.cpolar.top",
      "model": "deepseek-coder:1.3b",
      "title": "deepseek_yuyx",
      "provider": "ollama"
    }
  ],
  "tabAutocompleteModel": {
    "apiBase": "https://41245a9c.r8.cpolar.top",
    "model": "deepseek-coder:1.3b",
    "title": "deepseek_yuyx",
    "provider": "ollama"
  },
  "contextProviders": [
    { "name": "code", "params": {} },
    { "name": "docs", "params": {} },
    { "name": "diff", "params": {} },
    { "name": "terminal", "params": {} },
    { "name": "problems", "params": {} },
    { "name": "folder", "params": {} },
    { "name": "codebase", "params": {} }
  ],
  "slashCommands": [
    { "name": "share", "description": "Export the current chat session to markdown" },
    { "name": "cmd", "description": "Generate a shell command" },
    { "name": "commit", "description": "Generate a git commit message" }
  ]
}
Description
My colleague ran the DeepSeek model on his computer with Ollama and exposed it to the public internet through an intranet penetration (tunneling) tool. After setting apiBase in the Continue plugin's config.json in VS Code, I was unable to use it and got the error "Unable to connect to local Ollama instance. Ollama may not be running." However, the same config.json works normally on Windows, so it looks like a bug in the Linux version.
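The error suggests the extension is trying to reach a local Ollama instance even though apiBase points at the remote tunnel. One way to narrow this down is to confirm that the tunnel URL is reachable from the affected Linux machine at all; the sketch below is only a diagnostic aid and assumes the cpolar URL from the config above together with Ollama's standard /api/tags endpoint.

```python
# Diagnostic sketch: check whether the remote Ollama endpoint (the cpolar
# tunnel URL from the config above) is reachable from this machine.
import json
import urllib.request

API_BASE = "https://41245a9c.r8.cpolar.top"  # apiBase value from the report

try:
    with urllib.request.urlopen(f"{API_BASE}/api/tags", timeout=10) as resp:
        models = json.load(resp).get("models", [])
        print("Reachable; models:", [m.get("name") for m in models])
except Exception as exc:
    print("Endpoint not reachable from this machine:", exc)
```

If this prints the model list, the endpoint itself is fine and the problem is more likely in how the extension applies apiBase on Linux; if it fails, the tunnel is not reachable from that machine in the first place.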
To reproduce
- Install the Continue plugin in VS Code
- Edit config.json as shown above
- Press Enter to send a prompt
- See the error "Unable to connect to local Ollama instance. Ollama may not be running"
Log output