llm-sampling

Now updated (by claude :3) to work with the modern version of llama-server.

A very simple interactive demo to understand the common LLM samplers. Released under the Apache License, version 2.0.

See an online demo here, with new probabilities from PocketDoc/Dans-PersonalityEngine-V1.2.0-24b and more options.

Go check out the original online demo here, or the original repo here.

Quickstart

How I use it

git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release -t llama-server -j 8
cd ..

# json array of strings (prompts)
$EDITOR in.json

./llama.cpp/build/bin/llama-server --host 127.0.0.1 --port 24498 -c 8192 -np 16 -cb -m foo.gguf
make out.json
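
For reference, in.json is just a flat JSON array of prompt strings. A minimal sketch, with placeholder prompts (use whatever prompts you actually want probabilities for):

# placeholder prompts only; replace with your own
cat > in.json <<'EOF'
[
  "Once upon a time",
  "The quick brown fox"
]
EOF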

Old instructions

git clone https://github.com/ggerganov/llama.cpp
make -C llama.cpp server

# json array of strings (prompts)
$EDITOR in.json

systemd-run --user ./llama.cpp/server --host 127.0.0.1 --port 24498 -c 8192 -np 16 -cb -m foo.gguf
make out.json
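
Either way, once llama-server is listening on port 24498 you can sanity-check it by hand before running make out.json. This is only a sketch of the kind of request the data collection relies on, using llama-server's /completion endpoint; the prompt and parameter values here are placeholders, and the Makefile may issue its requests differently:

# ask for one generated token plus the top-20 next-token probabilities
curl -s http://127.0.0.1:24498/completion \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Once upon a time", "n_predict": 1, "n_probs": 20}'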
