
Commit 987d668

Merge commit '0692cba60baccf6b90a377ec7814173b33d35abf'
ex3ndr committed Jan 16, 2024
2 parents 38a77bb + 0692cba commit 987d668
Showing 2 changed files with 4 additions and 5 deletions.
README.md: 6 changes (3 additions, 3 deletions)
@@ -14,7 +14,7 @@ Llama Coder is a better and self-hosted Github Copilot replacement for VS Studio

Minimum required RAM: 16GB is a minimum, more is better since even smallest model takes 5GB of RAM.
The best way: dedicated machine with RTX 4090. Install [Ollama](https://ollama.ai) on this machine and configure endpoint in extension settings to offload to this machine.
-Second best way: run on MacBooc M1/M2/M3 with enougth RAM (more == better, but 10gb extra would be enougth).
+Second best way: run on MacBook M1/M2/M3 with enougth RAM (more == better, but 10gb extra would be enougth).
For windows notebooks: it runs good with decent GPU, but dedicated machine with a good GPU is recommended. Perfect if you have a dedicated gaming PC.

## Local Installation
@@ -50,7 +50,7 @@ Most of the problems could be seen in output of a plugin in VS Code extension ou

## [0.0.10]
- Adding ability to pick a custom model
-- Asking user if he wants to download model if it is not available
+- Asking user if they want to download model if it is not available

## [0.0.9]
- Adding deepseek 1b model and making it default
@@ -69,4 +69,4 @@ Most of the problems could be seen in output of a plugin in VS Code extension ou

## [0.0.4]

-- Initial release of Llama Coder
+- Initial release of Llama Coder
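The README hunk above tells users to point the extension at a remote Ollama endpoint through its settings. As a rough illustration only, here is a minimal TypeScript sketch of how an extension can read such a setting and fall back to a local Ollama instance; the configuration section and key names ('llama.inference' / 'endpoint') are assumptions for illustration, not confirmed Llama Coder setting names.

```ts
import * as vscode from 'vscode';

// Resolve the Ollama endpoint from the extension's settings, falling back to a
// local instance. The 'llama.inference' section and 'endpoint' key are
// illustrative assumptions; 11434 is Ollama's default port.
export function resolveOllamaEndpoint(): string {
    const config = vscode.workspace.getConfiguration('llama.inference');
    return config.get<string>('endpoint') || 'http://127.0.0.1:11434';
}
```

With a setup like this, a dedicated GPU box only needs Ollama listening on that endpoint; the extension itself stays local.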
src/extension.ts: 3 changes (1 addition, 2 deletions)
@@ -11,8 +11,7 @@ export function activate(context: vscode.ExtensionContext) {
    // Create status bar
    const openSettings = 'llama.openSettings';
    context.subscriptions.push(vscode.commands.registerCommand(openSettings, () => {
-       // const n = getNumberOfSelectedLines(vscode.window.activeTextEditor);
-       // vscode.window.showInformationMessage(`Yeah, ${n} line(s) selected... Keep going!`);
        vscode.commands.executeCommand('workbench.action.openSettings', '@ext:ex3ndr.llama-coder');
    }));
    let statusBarItem = vscode.window.createStatusBarItem(vscode.StatusBarAlignment.Right, 100);
    statusBarItem.command = openSettings;
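For readers unfamiliar with the surrounding file, here is a self-contained sketch of the pattern this hunk trims: an activate() that registers a command to open the extension's settings page and binds it to a status bar item. It mirrors the calls visible in the diff (registerCommand, executeCommand('workbench.action.openSettings', ...), createStatusBarItem); the status bar label is an illustrative guess, not taken from the repository.

```ts
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
    // Command that jumps straight to this extension's settings page.
    const openSettings = 'llama.openSettings';
    context.subscriptions.push(vscode.commands.registerCommand(openSettings, () => {
        vscode.commands.executeCommand('workbench.action.openSettings', '@ext:ex3ndr.llama-coder');
    }));

    // Status bar entry that triggers the command when clicked.
    const statusBarItem = vscode.window.createStatusBarItem(vscode.StatusBarAlignment.Right, 100);
    statusBarItem.command = openSettings;
    statusBarItem.text = 'Llama Coder'; // illustrative label; the real extension may differ
    statusBarItem.show();
    context.subscriptions.push(statusBarItem);
}
```

Pushing both the command registration and the status bar item onto context.subscriptions lets VS Code dispose of them when the extension deactivates.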
