- Overview
- Features
- Project Structure
- Getting Started
- Contributing
- License
- Acknowledgments
LLM-Compare-FastAPI is an innovative open-source project designed to streamline the comparison of various AI language models. It leverages FastAPI and Streamlit to create a user-friendly interface where users can input prompts and view responses from different models, including DeepSeek, OpenAI GPT, Google Gemini, Anthropic Claude, and Cohere Command. The project offers a unique solution for AI enthusiasts, researchers, and developers seeking to evaluate and understand the nuances of different language models in a simple, efficient manner.
| Feature | Summary |
|---|---|
| Architecture | |
| Code Quality | |
| Documentation | |
| Integrations | |
| Modularity | |
| Testing | |
| Performance | |
| Security | |
└── LLM-Compare-FastAPI/
    ├── LICENSE
    ├── README.md
    ├── backend
    │   ├── Dockerfile
    │   ├── app
    │   └── requirements.txt
    ├── docker-compose.yml
    └── frontend
        ├── Dockerfile
        ├── app.py
        └── requirements.txt
LLM-COMPARE-FASTAPI/
__root__
docker-compose.yml - The docker-compose.yml orchestrates the deployment of a two-tier application architecture, comprising a backend service built with FastAPI and a frontend service using Streamlit
- It ensures both services run on a shared network, with the frontend service dependent on the backend, and both services restarting automatically if they fail.
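A compose file with that shape might look roughly like the sketch below. The service names, network name, and ports are assumptions rather than a copy of the actual file; port 8000 comes from the backend description, and 8501 is Streamlit's default.

```yaml
# Hypothetical sketch of docker-compose.yml; actual names and ports may differ.
services:
  backend:
    build: ./backend
    ports:
      - "8000:8000"    # FastAPI server (port 8000 per backend/app/main.py)
    networks:
      - app-network
    restart: always    # restart automatically on failure

  frontend:
    build: ./frontend
    ports:
      - "8501:8501"    # Streamlit default port (assumed)
    depends_on:
      - backend        # frontend starts after the backend service
    networks:
      - app-network
    restart: always

networks:
  app-network:         # shared network for both services
```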
backend
requirements.txt - Backend/requirements.txt lists the dependencies the backend needs, including language-processing libraries and server frameworks
- It ensures the correct packages are installed for the backend to function properly, supporting language-processing tasks and server operations.
Dockerfile - The Dockerfile in the backend directory sets up a Python-based environment, installs the dependencies listed in requirements.txt, and prepares the application for running on a server
- It allows the application to be containerized and run consistently across platforms, improving the project's portability and scalability.
app
main.py - Main.py, located in the backend/app directory, serves as the entry point for the FastAPI LangChain API
- It wires together the application's endpoints and starts the FastAPI server
- Executing it starts the server locally on port 8000, exposing the API.
core
config.py - Config.py serves as the central hub for managing API keys in the backend
- It uses the dotenv module to securely load and store keys for external AI services such as OpenAI, Google AI, Anthropic, and Cohere
- This configuration enables seamless, secure communication with those services.
models.py - The core/models.py module acts as an interface to the various AI chat models
- It provides functions to interact with the GPT-3.5 Turbo, Gemini, Claude-2.1, and ChatCohere models, sending prompts and receiving AI-generated responses
- This module is central to the project's AI-based conversations.
api
endpoints.py - endpoints.py serves as the API gateway, providing routes that generate text with the different AI models: OpenAI GPT, Google Gemini, Anthropic Claude, and Cohere Command
- It also includes a route to check the API's health status.
frontend
app.py - The frontend/app.py serves as the user interface for the LangChain project, facilitating user interaction with multiple language models
- It allows users to input a prompt, adjust model settings, and view responses from the different models: OpenAI GPT, Google Gemini, Anthropic Claude, and Cohere Command
- It also handles API calls to the backend and displays response times.
requirements.txt - Frontend/requirements.txt lists the Python packages the frontend needs
- It specifies streamlit and requests as dependencies, so the user interface runs and can make HTTP requests to the backend
- This file helps keep the project's environment consistent.
Dockerfile - The Dockerfile in the frontend directory sets up a Python environment, installs the dependencies from requirements.txt, and prepares the application for execution
- It configures Streamlit to run app.py, making the application accessible via a specified server port and address.
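The response-time display mentioned above boils down to timing each backend call. A minimal sketch of that pattern follows; the helper name and the generic callable are illustrative assumptions (the real app.py times `requests` calls to the backend before rendering the elapsed time in Streamlit).

```python
import time
from typing import Any, Callable, Tuple


def timed_call(fn: Callable[..., Any], *args: Any, **kwargs: Any) -> Tuple[Any, float]:
    """Run fn and return (result, elapsed_seconds).

    The frontend does the equivalent around each backend API call
    before displaying the response alongside its response time.
    """
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start


# Example: in the real frontend, fn would be a requests.post(...) to a model endpoint.
response, elapsed = timed_call(lambda prompt: f"Echo: {prompt}", "Hello")
```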
Before getting started with LLM-Compare-FastAPI, ensure your runtime environment meets the following requirements:
- Programming Language: Python
- Package Manager: Pip
- Container Runtime: Docker
Install LLM-Compare-FastAPI using one of the following methods:
Build from source:
- Clone the LLM-Compare-FastAPI repository:
❯ git clone https://github.com/serkanyasr/LLM-Compare-FastAPI
- Navigate to the project directory:
❯ cd LLM-Compare-FastAPI
- Install the project dependencies:
❯ pip install -r backend/requirements.txt -r frontend/requirements.txt
Using docker (the repository ships a docker-compose.yml that builds both the backend and frontend images):
❯ docker compose build
Run LLM-Compare-FastAPI using the following command:
Using pip
❯ python backend/app/main.py      # starts the FastAPI backend on port 8000
❯ streamlit run frontend/app.py   # starts the Streamlit frontend
Using docker
❯ docker compose up
Run the test suite using the following command:
Using pip
❯ pytest
- Join the Discussions: Share your insights, provide feedback, or ask questions.
- Report Issues: Submit bugs found or log feature requests for the LLM-Compare-FastAPI project.
- Submit Pull Requests: Review open PRs, and submit your own PRs.
Contributing Guidelines
- Fork the Repository: Start by forking the project repository to your GitHub account.
- Clone Locally: Clone the forked repository to your local machine using a Git client.
git clone https://github.com/serkanyasr/LLM-Compare-FastAPI
- Create a New Branch: Always work on a new branch, giving it a descriptive name.
git checkout -b new-feature-x
- Make Your Changes: Develop and test your changes locally.
- Commit Your Changes: Commit with a clear message describing your updates.
git commit -m 'Implemented new feature x.'
- Push to GitHub: Push the changes to your forked repository.
git push origin new-feature-x
- Submit a Pull Request: Create a PR against the original project repository. Clearly describe the changes and their motivations.
- Review: Once your PR is reviewed and approved, it will be merged into the main branch. Congratulations on your contribution!
This project is licensed under the MIT License. For more details, refer to the LICENSE file.