
feat(llama.cpp/clip): inject gpu options if we detect GPUs #5243


Merged
merged 2 commits into from
Apr 25, 2025

Conversation

mudler
Copy link
Owner

@mudler mudler commented Apr 25, 2025

Description

This PR fixes #4815

We try to detect whether GPUs are available and, if so, set the GPU options unless they are already configured. Guessing of defaults can always be disabled with LOCALAI_DISABLE_GUESSING=true.

Notes for Reviewers

Signed commits

  • Yes, I signed my commits.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Copy link

netlify bot commented Apr 25, 2025

Deploy Preview for localai ready!

🔨 Latest commit: c05a08b
🔍 Latest deploy log: https://app.netlify.com/sites/localai/deploys/680be3664758670008e38821
😎 Deploy Preview: https://deploy-preview-5243--localai.netlify.app

@mudler mudler merged commit 9628860 into master Apr 25, 2025
25 checks passed
@mudler mudler deleted the feat/detect_gpu_options branch April 25, 2025 22:04
@mudler mudler added the enhancement New feature or request label May 12, 2025

Linked issue closed by this pull request: GPT-4o/Vision models cannot use GPU due to CLIP changes (#4815)