Add disabling GPU altogether
Setting the layers to 0 seemed to do most of the work, but it still used
some of the available GPU, which was weird. Let's see if this messes up
the CI test times.
jehna committed Aug 24, 2024
1 parent cfbf0d4 commit e161521
Showing 1 changed file with 1 addition and 1 deletion.
src/plugins/local-llm-rename/llama.ts (1 addition, 1 deletion):

@@ -21,7 +21,7 @@ export async function llama(opts: {
   model: string;
   disableGpu?: boolean;
 }): Promise<Prompt> {
-  const llama = await getLlama();
+  const llama = await getLlama({ gpu: opts.disableGpu ? false : "auto" });
   const modelOpts: LlamaModelOptions = {
     modelPath: getModelPath(opts?.model),
     gpuLayers: (opts?.disableGpu ?? IS_CI) ? 0 : undefined
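The combined effect of the two options can be sketched as a small standalone helper. This mirrors the logic in the diff above but is not part of the repository; `resolveGpuOptions` is a hypothetical name, and the `false | "auto"` values assume node-llama-cpp's `gpu` setting behaves as the commit implies.

```typescript
// Hypothetical helper mirroring the commit's logic: derive both the
// getLlama() `gpu` setting (new in this commit) and the model's
// `gpuLayers` (pre-existing) from a single disableGpu flag.
type GpuSetting = false | "auto";

function resolveGpuOptions(
  disableGpu: boolean | undefined,
  isCi: boolean
): { gpu: GpuSetting; gpuLayers: number | undefined } {
  return {
    // New: tell getLlama() not to initialize the GPU at all.
    gpu: disableGpu ? false : "auto",
    // Pre-existing: load zero layers onto the GPU when disabled;
    // CI defaults to disabled when the flag is left unset.
    gpuLayers: (disableGpu ?? isCi) ? 0 : undefined,
  };
}

console.log(resolveGpuOptions(true, false)); // { gpu: false, gpuLayers: 0 }
```

Note the asymmetry the commit message hints at: before this change, `gpuLayers: 0` was the only knob, and the library apparently still touched the GPU during initialization; passing `gpu: false` to `getLlama()` disables it earlier, at backend setup.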
