ChatLLM

Chat test scripts for LLMs

Setup

Git clone

$ cd && git clone https://github.com/karaage0703/ChatLLM

Devcontainer (VS Code)

Setup devcontainer

Reference (Japanese)

Docker compose

Build and start a container

CPU

$ docker compose up base

GPU

$ docker compose up gpu

GPU (NVIDIA container)

$ docker compose up nvidia
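
To confirm that a GPU-enabled container actually sees the GPU, a quick check from inside the container can help. This is a minimal sketch assuming PyTorch is installed in the image (the chat scripts rely on it); the file name is hypothetical, and you can also paste the lines into a python3 session inside the container.

# gpu_check.py -- hypothetical helper, not part of this repository
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))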

Docker run

Check the name of the running container and open a shell inside it

$ docker ps
$ docker exec -it <container name> /bin/bash

# example
# $ docker exec -it chatllm-nvidia-1 /bin/bash

Run app

root@hostname:/# cd /root
root@hostname:~# python3 chat_calm.py
root@hostname:~# python3 chat_rinna.py
root@hostname:~# python3 chat_rwkv.py
root@hostname:~# python3 chat_llama2.py
root@hostname:~# python3 chat_weblab.py
root@hostname:~# python3 chat_elyza.py
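
Each script wraps a single model in a simple console chat loop. The sketch below illustrates the general pattern with Hugging Face transformers; the model name, prompt handling, and generation settings are illustrative assumptions, not taken from the repository, and the actual scripts may differ (chat_rwkv.py, for example, may use a different backend).

# chat_sketch.py -- illustrative pattern only; the repository's scripts may differ
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "rinna/japanese-gpt-neox-3.6b-instruction-sft"  # example model ID (assumption)

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

while True:
    user_input = input("You: ")
    if user_input.strip().lower() in ("exit", "quit"):
        break
    inputs = tokenizer(user_input, return_tensors="pt").to(device)
    with torch.no_grad():
        output_ids = model.generate(
            **inputs,
            max_new_tokens=256,
            do_sample=True,
            temperature=0.7,
            pad_token_id=tokenizer.eos_token_id,
        )
    # Decode only the newly generated tokens, not the prompt
    reply = tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    print("Bot:", reply)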

Docker (without Compose)

Docker build

$ cd ~/ChatLLM
$ docker build -t ubuntu:ChatLLM .

Run docker

Use GPU

$ cd ~/ChatLLM
$ docker run -it --rm -v $(pwd):/root --gpus all ubuntu:ChatLLM

Use CPU

$ cd ~/ChatLLM
$ docker run -it --rm -v $(pwd):/root ubuntu:ChatLLM

Run app

root@hostname:~# python3 chat_calm.py
root@hostname:~# python3 chat_rinna.py
root@hostname:~# python3 chat_rwkv.py
root@hostname:~# python3 chat_llama2.py

For StableLM (requires a Hugging Face login)

root@hostname:/# huggingface-cli login
root@hostname:~# python3 chat_stablelm.py
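
huggingface-cli login stores an access token locally so that gated model weights can be downloaded. As an alternative, the same can be done from Python with the huggingface_hub library; the token below is a placeholder for your own token from https://huggingface.co/settings/tokens.

# Alternative to `huggingface-cli login` (same effect)
from huggingface_hub import login

login(token="hf_xxxxxxxxxxxxxxxxxxxx")  # placeholder -- use your own access token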
