Add disabling GPU altogether
Setting the layers to 0 seemed to do most of the work, but it still used
some of the available GPU, which was weird. Let's see if this messes up
the CI test times.
jehna committed Aug 24, 2024
1 parent cfbf0d4 commit 7fb58d4
5 changes: 3 additions & 2 deletions src/plugins/local-llm-rename/llama.ts
@@ -21,10 +21,11 @@ export async function llama(opts: {
   model: string;
   disableGpu?: boolean;
 }): Promise<Prompt> {
-  const llama = await getLlama();
+  const disableGpu = opts.disableGpu ?? IS_CI;
+  const llama = await getLlama({ gpu: disableGpu ? false : "auto" });
   const modelOpts: LlamaModelOptions = {
     modelPath: getModelPath(opts?.model),
-    gpuLayers: (opts?.disableGpu ?? IS_CI) ? 0 : undefined
+    gpuLayers: disableGpu ? 0 : undefined
   };
   verbose.log("Loading model with options", modelOpts);
   const model = await llama.loadModel(modelOpts);
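The change boils down to deriving a single `disableGpu` flag (explicit option first, falling back to `IS_CI`) and threading it into both the `getLlama` call and the model options. A minimal sketch of that resolution logic, assuming `IS_CI` is read from the `CI` environment variable; `resolveGpuOptions` is an illustrative helper, not part of the actual file:

```typescript
// Illustrative helper, not from the patched llama.ts.
const IS_CI = process.env.CI === "true";

type GpuOptions = { gpu: false | "auto"; gpuLayers: number | undefined };

// An explicit disableGpu wins; otherwise CI runs default to no GPU.
function resolveGpuOptions(disableGpu: boolean = IS_CI): GpuOptions {
  return {
    gpu: disableGpu ? false : "auto", // goes to getLlama({ gpu: ... })
    gpuLayers: disableGpu ? 0 : undefined, // goes into LlamaModelOptions
  };
}
```

The point of the commit is that zeroing `gpuLayers` alone only limits model offloading; passing `gpu: false` to `getLlama` keeps the runtime from touching the GPU at all.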
