Building the `chat` target with `gmake` succeeds on FreeBSD 13.1, but running `./chat` crashes with SIGILL while loading the model:
❯ gmake chat
I llama.cpp build info:
I UNAME_S: FreeBSD
I UNAME_P: amd64
I UNAME_M: amd64
I CFLAGS: -I. -O3 -DNDEBUG -std=c11 -fPIC -pthread -mavx -mavx2 -mfma -mf16c
I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread
I LDFLAGS:
I CC: FreeBSD clang version 13.0.0 (git@github.com:llvm/llvm-project.git llvmorg-13.0.0-0-gd7b669b3a303)
I CXX: FreeBSD clang version 13.0.0 (git@github.com:llvm/llvm-project.git llvmorg-13.0.0-0-gd7b669b3a303)
cc -I. -O3 -DNDEBUG -std=c11 -fPIC -pthread -mavx -mavx2 -mfma -mf16c -c ggml.c -o ggml.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread -c utils.cpp -o utils.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread chat.cpp ggml.o utils.o -o chat
❯ ./chat
main: seed = 1679015118
llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
fish: Job 1, './chat' terminated by signal SIGILL (Illegal instruction)
FreeBSD 13.1-RELEASE