Welcome to the LearnDockerLLM repository. This collection features a series of progressively complex examples designed to demonstrate how to integrate Panel, Large Language Models (LLMs) from OpenAI, and Docker. The examples range from foundational setups using Docker Compose to advanced applications where users interact with LLMs via a web interface.
The repository is organized into several directories, each housing a unique example:
- Simple: Demonstrates using Docker Compose to orchestrate a Panel application backed by a PostgreSQL database, showcasing the integration of multiple services.
- SimpleChat: Builds on Simple to add a basic chat interface backed by an OpenAI LLM (the first sketch after this list shows the core pattern).
- LlmConversation: Puts two LLMs into a looping conversation with each other, demonstrating more complex interaction patterns (the second sketch after this list outlines the turn-taking loop).
- LlmConsistency: Examines how consistently an LLM answers math problems generated by another LLM.
- PanelCliExample: Illustrates the foundational structure for connecting a Panel web interface to a secondary target container, which it controls over SSH (the third sketch after this list shows the SSH piece).
- LlmCliTool: Exposes an SSH terminal in the browser that the user shares with an LLM, so the model can offer insights and assist with command execution in a collaborative session.
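
For orientation, here is a minimal sketch of the SimpleChat pattern: a Panel `ChatInterface` whose callback forwards each user message to OpenAI's chat completions API. The model name and the assumption that `OPENAI_API_KEY` is set in the container's environment are illustrative, not taken from the example's actual code.

```python
# Minimal sketch of the SimpleChat pattern (not the repository's exact code):
# a Panel ChatInterface whose callback forwards each message to OpenAI.
import panel as pn
from openai import OpenAI

pn.extension()
client = OpenAI()  # assumes OPENAI_API_KEY is available in the environment

def respond(contents: str, user: str, instance: pn.chat.ChatInterface) -> str:
    # Send the user's message to the model and return the reply to the chat widget.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; the example may use another
        messages=[{"role": "user", "content": contents}],
    )
    return completion.choices[0].message.content

chat = pn.chat.ChatInterface(callback=respond)
chat.servable()  # serve with: panel serve app.py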
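The LlmConversation idea can be sketched as a turn-taking loop in which each model keeps its own message history, seeing its counterpart's replies as user turns and its own as assistant turns. The system prompts, model name, seed message, and turn count below are placeholders rather than the repository's actual configuration.

```python
# Rough sketch of a two-LLM conversation loop using the OpenAI chat completions API.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name

def reply(history: list[dict]) -> str:
    completion = client.chat.completions.create(model=MODEL, messages=history)
    return completion.choices[0].message.content

# Each agent keeps its own view of the transcript with mirrored roles.
agent_a = [{"role": "system", "content": "You are agent A. Keep answers short."}]
agent_b = [{"role": "system", "content": "You are agent B. Keep answers short."}]

message = "Hello, let's discuss Docker."  # seed message, spoken by A
agent_a.append({"role": "assistant", "content": message})
agent_b.append({"role": "user", "content": message})

for _ in range(4):  # a fixed number of turns keeps the loop (and API cost) bounded
    message = reply(agent_b)  # B answers A
    agent_b.append({"role": "assistant", "content": message})
    agent_a.append({"role": "user", "content": message})
    print("B:", message)

    message = reply(agent_a)  # A answers B
    agent_a.append({"role": "assistant", "content": message})
    agent_b.append({"role": "user", "content": message})
    print("A:", message)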
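PanelCliExample and LlmCliTool rest on being able to run commands in a secondary container over SSH. A rough illustration of that piece, using paramiko with a made-up hostname and credentials (the repository's actual service names and authentication may differ), might look like this:

```python
# Illustrative sketch: execute a command on a target container over SSH.
import paramiko

def run_remote(command: str, host: str = "target",
               user: str = "root", password: str = "example") -> str:
    # Open an SSH session to the target container and return the command's output.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host, username=user, password=password)
    try:
        _stdin, stdout, stderr = client.exec_command(command)
        return stdout.read().decode() + stderr.read().decode()
    finally:
        client.close()

if __name__ == "__main__":
    print(run_remote("uname -a"))

In the web examples, output like this can be shown in the Panel interface and shared with the LLM so it can suggest follow-up commands.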
The projects progress in complexity, from basic Docker Compose setups to a cooperative bash shell that combines user input with OpenAI's LLM capabilities directly in the web browser.
To explore an example, navigate into the respective folder and follow the provided setup instructions. Generally, you'll need to have Docker installed and may need other prerequisites depending on the complexity of the example.
Contributions to this repository are welcome. To contribute, please fork this repository, make your changes, and submit a pull request.
This project is licensed under the MIT License. You are free to use, modify, and distribute this software in accordance with the terms of the license.