llama.cpp: server-cuda-b4158 (Public, Latest)
Install from the command line:

$ docker pull ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158

Or pull by digest for a specific manifest:

linux/amd64
$ docker pull ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158@sha256:6d60583d8bd91b4dc62362c53dcfd11bd7a9ecd61da96fb8d35191d864e2f58b

unknown/unknown (typically a build attestation manifest rather than a runnable image)
$ docker pull ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158@sha256:c82b62226e7a3d518bb8262e798ababfc820b8c27d8a48db2bf0e4b762ca6818
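To run the pulled image with GPU acceleration, a command along the following lines should work. This is a minimal sketch, assuming the NVIDIA Container Toolkit is installed on the host and that the image uses the standard llama.cpp llama-server entrypoint; /path/to/models and model.gguf are placeholders for your own model directory and GGUF file:

$ docker run --gpus all -p 8080:8080 \
    -v /path/to/models:/models \
    ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 --n-gpu-layers 99

Once the container is up, llama-server listens on the published port (e.g. http://localhost:8080) and exposes its usual HTTP API, including the OpenAI-compatible endpoints.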
Recent tagged image versions
Each listed version shows 2 downloads.
Last published: 1 year ago