Overview

Welcome to the LearnDockerLLM repository. This collection features a series of progressively more complex examples that demonstrate how to integrate Panel, Large Language Models (LLMs) from OpenAI, and Docker. The examples range from foundational setups using Docker Compose to advanced applications in which users interact with LLMs through a web interface, LLMs converse with each other, and LLMs work with sandboxed containers.

Examples

The repository is organized into several directories, each housing a unique example:

  1. Simple: Demonstrates using Docker Compose to orchestrate a Panel application backed by a PostgreSQL database, showcasing the integration of multiple services.
  2. SimpleChat: Builds on Simple to create a basic chat interface with an LLM powered by OpenAI (a minimal sketch of this wiring appears after this list).
  3. LlmConversation: Puts two LLMs into a looped conversation with each other, demonstrating more complex interaction patterns (see the second sketch below).
  4. LlmConsistency: Examines how consistent the responses from an LLM are on math problems generated by another LLM.
  5. PanelCliExample: Illustrates the foundational structure for connecting a Panel web interface to a secondary target container, allowing the web interface to control that container over SSH.
  6. LlmCliTool: Lets a user and an LLM cooperatively drive an SSH terminal from the web browser, enabling automated insights and command-execution assistance in a collaborative environment (see the SSH sketch below).

The projects increase in complexity, progressing from basic multi-service setups to a sophisticated cooperative bash shell that integrates user input with OpenAI's LLM capabilities directly from the web browser.
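
At its core, the SimpleChat example reduces to wiring a Panel chat widget to an OpenAI completion call. A minimal sketch of that wiring (illustrative only, not the repository's code; the model name and the use of the OPENAI_API_KEY environment variable are assumptions) could look like this:

```python
# chat_app.py -- minimal Panel + OpenAI chat sketch (illustrative, not the repo's code).
# Assumes panel >= 1.3 and openai >= 1.0, with OPENAI_API_KEY set in the environment.
import panel as pn
from openai import OpenAI

pn.extension()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


def respond(contents: str, user: str, instance: pn.chat.ChatInterface) -> str:
    """Send the user's message to the LLM and return the reply text."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": contents}],
    )
    return completion.choices[0].message.content


# Run with:  panel serve chat_app.py
pn.chat.ChatInterface(callback=respond).servable()
```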
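
The LlmConversation idea of looping two models into a dialogue can similarly be reduced to alternating turns between two personas, feeding each reply back in as the next prompt. A hedged sketch (the system prompts, model name, and turn count are invented for illustration):

```python
# Sketch of two LLMs conversing by alternating turns (illustrative only).
from openai import OpenAI

client = OpenAI()
PERSONAS = [
    "You are a curious student who asks short questions.",   # assumed prompt
    "You are a patient teacher who gives concise answers.",  # assumed prompt
]


def reply(system_prompt: str, message: str) -> str:
    """One turn: answer the incoming message from this persona's point of view."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": message},
        ],
    )
    return completion.choices[0].message.content


message = "Let's discuss Docker networking."
for turn in range(4):  # arbitrary number of exchanges
    message = reply(PERSONAS[turn % 2], message)
    print(f"Speaker {turn % 2}: {message}\n")
```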
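
The last two examples hinge on driving a secondary container over SSH from the Panel process. One common way to do that from Python is paramiko; the snippet below is a generic sketch in which the host name, credentials, and command are placeholders rather than values taken from the repository:

```python
# Generic SSH command execution from Python via paramiko (illustrative only).
import paramiko


def run_remote(command: str) -> str:
    """Run a command on the target container over SSH and return its combined output."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    # "target" stands in for the Compose service name of the sandboxed container;
    # the credentials are placeholders.
    ssh.connect("target", username="app", password="app")
    try:
        _stdin, stdout, stderr = ssh.exec_command(command)
        return stdout.read().decode() + stderr.read().decode()
    finally:
        ssh.close()


if __name__ == "__main__":
    print(run_remote("uname -a"))
```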

Getting Started

To explore an example, navigate into the respective folder and follow the provided setup instructions. Generally, you'll need to have Docker installed and may need other prerequisites depending on the complexity of the example.

Contributing

Contributions to this repository are welcome. To contribute, please fork this repository, make your changes, and submit a pull request.

License

This project is licensed under the MIT License. You are free to use, modify, and distribute this software in accordance with the terms of the license.
