llama.cpp: server-cuda-b4158 (public container image, latest)

Install from the command line
$ docker pull ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158
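Once pulled, the image can be started like any llama.cpp server container. A minimal sketch follows; the model path, port mapping, and GPU layer count are illustrative assumptions, not details taken from this page (a `server-cuda` image additionally requires the NVIDIA Container Toolkit on the host for `--gpus all` to work).

```shell
# Sketch only: model filename, mount path, and port are assumptions.
docker run --gpus all -p 8080:8080 \
  -v /path/to/models:/models \
  ghcr.io/firsttimeez/llama.cpp:server-cuda-b4158 \
  -m /models/model.gguf \
  --host 0.0.0.0 --port 8080 \
  -ngl 99   # offload layers to the GPU
```

The arguments after the image name are passed to the bundled `llama-server` binary; once it is up, an OpenAI-compatible API is served on the mapped port.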

Recent tagged image versions

  • Published about 1 year ago · Digest sha256:8d0661732e4a9c2a535362c16c8951998ad627a20f7e3a98563396448c000515 · 2 downloads
  • Published about 1 year ago · Digest sha256:9073d7c6e7deac541291e730b6a40d045e340cd20e4d141a07c14644aeeaa81e · 2 downloads
  • Published about 1 year ago · Digest sha256:455be1a2e2fe9a2e7e71da9cd8230e3fcc8351c33b41367defb667b4d05af460 · 2 downloads
  • Published about 1 year ago · Digest sha256:33a8b1a44cf37e49a9479d94936d658038e1fe55ecfdcaa738a7b3187c6f1ae8 · 2 downloads
  • Published about 1 year ago · Digest sha256:998ca323928260f896465d90965eb5a0a7855c762dc98154afcc35c1d0483afe · 2 downloads

Last published: 1 year ago

Total downloads: 38