
Ollama integration has incorrect args on podman #983

@MatsM16

Description


Describe the bug

When running Aspire with Podman as the container runtime, calling `.WithGPUSupport()` on an Ollama resource does not give the container access to the GPU.

According to podman-desktop.io, the required parameter is `--device nvidia.com/gpu=all`.
According to a discussion on access.redhat.com, the arguments for AMD GPUs work as-is.
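To make the mismatch concrete, here is a sketch of the two invocations. The Docker-style flag shown for `.WithGPUSupport()` is an assumption based on the reported behavior; the Podman form follows the podman-desktop.io guidance cited above (CDI device syntax, requires the NVIDIA Container Toolkit to be configured on the host):

```shell
# What WithGPUSupport() appears to emit today (Docker-style, assumption):
#   docker run --gpus=all ...
# Podman equivalent using CDI device syntax (per podman-desktop.io):
podman run --rm --device nvidia.com/gpu=all docker.io/ollama/ollama nvidia-smi
```

If the container prints the `nvidia-smi` device table, GPU passthrough is working; with the Docker-style flag, Podman does not grant GPU access.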

Regression

No response

Steps to reproduce

1. Obtain an Nvidia GPU.
2. Set up Aspire with Podman as the container runtime ([Aspire docs](https://aspire.dev/get-started/prerequisites/)).
3. Set `examples/ollama/CommunityToolkit.Aspire.Hosting.Ollama.AppHost` as the startup project.
4. In the AppHost, add `.WithGPUSupport()` to one of the Ollama resources.
5. Run the project.
6. Inspect the running container and observe that it cannot use the GPU.
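As a possible interim workaround, the Podman-specific device flag could be passed directly on the resource instead of calling `.WithGPUSupport()`. This is a minimal sketch, assuming `WithContainerRuntimeArgs` (from Aspire.Hosting) is available on the Ollama resource builder and that the host's NVIDIA Container Toolkit CDI setup is in place:

```csharp
var builder = DistributedApplication.CreateBuilder(args);

var ollama = builder.AddOllama("ollama")
    // Workaround sketch: pass Podman's CDI device flag directly,
    // instead of .WithGPUSupport(), which appears to emit Docker-style
    // arguments that Podman does not honor.
    .WithContainerRuntimeArgs("--device", "nvidia.com/gpu=all");

builder.Build().Run();
```

A runtime-aware fix inside `.WithGPUSupport()` itself (detecting Podman and switching to the `--device` form) would of course be the preferable long-term solution.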

Expected behavior

Using `.WithGPUSupport()` is a smooth way to enable GPU support on local Ollama containers.
I would expect the extension method to work regardless of the container runtime.

Screenshots

No response

IDE and version

Other

IDE version

Visual Studio 2026

Nuget packages

CommunityToolkit.Aspire.Hosting.Ollama

Additional context

No response

Help us help you

Yes, I'd like to be assigned to work on this item
