langchain4j/langchain4j-cdi

Langchain4j integration with MicroProfile™ and Jakarta™ specifications
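
As a quick illustration of what the CDI side of the integration enables, here is a minimal sketch of injecting a langchain4j chat model into a Jakarta CDI bean. The bean class, the way the model is produced and configured, and the model interface itself (ChatLanguageModel with generate(...) below, renamed ChatModel with chat(...) in langchain4j 1.x) are assumptions, not this project's documented API.

import dev.langchain4j.model.chat.ChatLanguageModel;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class GreetingService {

    // Assumption: the langchain4j-cdi integration makes the configured chat model
    // available as a CDI bean that can be injected here.
    @Inject
    ChatLanguageModel model;

    public String greet(String name) {
        return model.generate("Write a one-sentence greeting for " + name);
    }
}

The idea is that the CDI container, rather than application code, builds and wires the model.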

How to run the examples

Use LM Studio

1. Install LM Studio.
2. Download a model, for example Mistral 7B Instruct v0.2.
3. In the left sidebar, open "Local Server", select the model in the dropdown at the top, then start the server (see the client sketch below).
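
LM Studio's local server exposes an OpenAI-compatible endpoint, typically at http://localhost:1234/v1. The following is a minimal sketch of pointing langchain4j's OpenAI client at it, assuming a pre-1.0 langchain4j API; the base URL, placeholder API key, and model identifier are assumptions for a default local setup.

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class LmStudioSmokeTest {

    public static void main(String[] args) {
        // LM Studio's local server speaks the OpenAI API; the key is not checked but must be non-empty.
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("http://localhost:1234/v1")      // default LM Studio server address (assumption)
                .apiKey("not-needed")                     // placeholder value
                .modelName("mistral-7b-instruct-v0.2")    // hypothetical identifier, use the one LM Studio shows
                .build();

        System.out.println(model.generate("Say hello in one sentence."));
    }
}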

Use Ollama

Running Ollama with the llama3.1 model:

# Pick whichever container engine is installed (podman preferred, docker as fallback).
CONTAINER_ENGINE=$(command -v podman || command -v docker)
# Start the Ollama server on port 11434 (--replace is podman-specific; drop it when using docker).
$CONTAINER_ENGINE run -d --rm --name ollama --replace --pull=always -p 11434:11434 -v ollama:/root/.ollama --stop-signal=SIGKILL docker.io/ollama/ollama
# Pull and run the llama3.1 model inside the running container.
$CONTAINER_ENGINE exec -it ollama ollama run llama3.1
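
With the container running, langchain4j's Ollama module can reach it on the port mapped above. A minimal sketch, again assuming a pre-1.0 langchain4j API; the class name is only for illustration.

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class OllamaSmokeTest {

    public static void main(String[] args) {
        // Talks to the Ollama container started above on localhost:11434.
        ChatLanguageModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")
                .build();

        System.out.println(model.generate("Say hello in one sentence."));
    }
}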

Run the examples

See each example's README.md for instructions on how to run it.

Contributing

If you want to contribute, please have a look at CONTRIBUTING.md.

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
