AI Powered Request Handler System

A Python backend API for streamlined OpenAI interaction

Developed with the software and tools below: FastAPI (backend framework), Python (language), PostgreSQL (storage), Redis (caching), and the OpenAI API (AI models).

πŸ“‘ Table of Contents

  • πŸ“ Overview
  • πŸ“¦ Features
  • πŸ“‚ Structure
  • πŸ’» Installation
  • πŸ—οΈ Usage
  • 🌐 Hosting
  • πŸ“„ License
  • πŸ‘ Authors

πŸ“ Overview

This repository contains the code for the AI Powered Request Handler System, a Python backend API designed to simplify user interactions with OpenAI's powerful language models. This MVP provides a user-friendly interface for accessing OpenAI's capabilities without needing extensive technical knowledge.

πŸ“¦ Features

  • βš™οΈ Architecture: The codebase follows a modular pattern, with separate directories for different functionalities, which keeps maintenance and scaling manageable.
  • πŸ“„ Documentation: This README gives a detailed overview of the MVP, its dependencies, and usage instructions.
  • πŸ”— Dependencies: The codebase relies on external packages such as FastAPI, SQLAlchemy, PyJWT, OpenAI, and Redis for building the API, handling database interactions, and managing authentication and caching.
  • 🧩 Modularity: Separate directories and files for controllers, services, and models make the code easier to maintain and reuse.
  • πŸ§ͺ Testing: Unit tests written with pytest (see the tests directory) help ensure the reliability and robustness of the codebase.
  • ⚑️ Performance: Response caching with Redis and efficient database queries keep the service fast and responsive (see the caching sketch after this list).
  • πŸ” Security: Input validation, data encryption, and secure communication protocols protect the service and its data.
  • πŸ”€ Version Control: Git is used for version control, with GitLab CI workflow files for automated build and release processes.
  • πŸ”Œ Integrations: The service integrates with the OpenAI API and a PostgreSQL database, and uses Redis for caching.
  • πŸ“Ά Scalability: The architecture supports horizontal scaling through containerization (Docker) and database sharding.
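The caching approach mentioned under Performance can be sketched as follows. This is an illustrative example only, assuming the synchronous redis client; the helper name cached_completion and the key scheme are hypothetical and not taken from request_service.py.

    import hashlib

    import redis

    # Illustrative caching helper; the real request_service.py may differ.
    cache = redis.Redis.from_url("redis://localhost:6379/0")
    CACHE_TTL_SECONDS = 3600  # keep completions for one hour

    def cached_completion(prompt: str, generate) -> str:
        """Return a cached completion if present, otherwise call generate(prompt)."""
        key = "completion:" + hashlib.sha256(prompt.encode()).hexdigest()
        hit = cache.get(key)
        if hit is not None:
            return hit.decode()
        result = generate(prompt)  # e.g. a call into the OpenAI client
        cache.setex(key, CACHE_TTL_SECONDS, result)
        return result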

πŸ“‚ Structure

β”œβ”€β”€ api
β”‚   β”œβ”€β”€ src
β”‚   β”‚   β”œβ”€β”€ controllers
β”‚   β”‚   β”‚   └── request_controller.py
β”‚   β”‚   β”œβ”€β”€ services
β”‚   β”‚   β”‚   └── request_service.py
β”‚   β”‚   β”œβ”€β”€ models
β”‚   β”‚   β”‚   └── request_model.py
β”‚   β”‚   β”œβ”€β”€ main.py
β”‚   β”‚   └── config
β”‚   β”‚       └── settings.py
β”‚   β”œβ”€β”€ requirements.txt
β”‚   └── startup.sh
β”œβ”€β”€ migrations
β”‚   β”œβ”€β”€ __init__.py
β”‚   β”œβ”€β”€ env.py
β”‚   β”œβ”€β”€ versions.py
β”‚   └── 0001_initial.py
β”œβ”€β”€ tests
β”‚   └── test_request_controller.py
β”œβ”€β”€ celery
β”‚   β”œβ”€β”€ __init__.py
β”‚   └── tasks.py
β”œβ”€β”€ .env
β”œβ”€β”€ Dockerfile
β”œβ”€β”€ docker-compose.yml
└── .gitlab-ci.yml
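
To show how these pieces might fit together, here is a minimal, assumed sketch of api/src/main.py; the import path and router name are assumptions based on the layout above, not the verbatim file.

    # api/src/main.py (illustrative sketch)
    from fastapi import FastAPI

    # Assumed import path; the controller is expected to expose an APIRouter.
    from src.controllers.request_controller import router as request_router

    app = FastAPI(title="AI Powered Request Handler System")
    app.include_router(request_router)

    @app.get("/health")
    def health_check() -> dict:
        # Simple liveness probe, convenient behind Docker Compose.
        return {"status": "ok"}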

πŸ’» Installation

πŸ”§ Prerequisites

  • Python 3.9+
  • Docker
  • PostgreSQL
  • Redis

πŸš€ Setup Instructions

  1. Clone the repository:
    git clone https://github.com/coslynx/AI-Powered-Request-Handler.git
    cd AI-Powered-Request-Handler
  2. Install dependencies:
    pip install -r api/requirements.txt
  3. Configure environment variables:
    cp .env.example .env
    # Fill in the required values, such as OPENAI_API_KEY and DATABASE_URL, plus JWT_SECRET_KEY if needed (optional)
  4. Set up the database:
    # Create a PostgreSQL database and user, then point DATABASE_URL in your .env file at it
    # Run database migrations:
    alembic upgrade head
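
For reference, a filled-in .env might look like the snippet below; all values are placeholders (see the Environment Variables list under Hosting for what each one means).

    # Placeholder values; replace with your own
    OPENAI_API_KEY=sk-your-openai-key
    DATABASE_URL=postgresql://user:password@localhost:5432/request_handler
    # Optional
    JWT_SECRET_KEY=change-me
    REDIS_URL=redis://localhost:6379/0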

πŸ—οΈ Usage

πŸƒβ€β™‚οΈ Running the MVP

  1. Start the development server using Docker Compose:

    docker-compose up -d
  2. Access the application:

    • Interactive API docs (Swagger UI): http://localhost:8000/docs

βš™οΈ Configuration

  • Environment variables are loaded from the .env file.
  • Configuration settings are defined in api/src/config/settings.py.
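
As an illustration, api/src/config/settings.py could look roughly like the sketch below, assuming the pydantic-settings package is used; the field names mirror the environment variables listed under Hosting, but the actual file may differ.

    # api/src/config/settings.py (illustrative sketch)
    from pydantic_settings import BaseSettings, SettingsConfigDict

    class Settings(BaseSettings):
        # Values are read from the environment and the .env file.
        model_config = SettingsConfigDict(env_file=".env")

        openai_api_key: str
        database_url: str
        jwt_secret_key: str | None = None
        redis_url: str | None = None

    settings = Settings()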

πŸ“š Examples

  • Send a request to GPT-3:
    curl -X POST http://localhost:8000/requests \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer YOUR_JWT_TOKEN" \
    -d '{"text": "Write a short story about a cat who travels to space"}'

🌐 Hosting

πŸš€ Deployment Instructions

  1. Build the Docker image:
    docker build -t ai-request-handler:latest .
  2. Tag and push the image to a container registry (e.g., Docker Hub):
    docker tag ai-request-handler:latest your-dockerhub-username/ai-request-handler:latest
    docker push your-dockerhub-username/ai-request-handler:latest
  3. Deploy using Docker Compose (update the image name in the docker-compose.yml file):
    docker-compose up -d

πŸ”‘ Environment Variables

  • OPENAI_API_KEY: Your OpenAI API key.
  • DATABASE_URL: Connection string for your PostgreSQL database.
  • JWT_SECRET_KEY: Secret key for JWT token generation (optional).
  • REDIS_URL: URL for your Redis instance (optional).

πŸ“œ API Documentation

πŸ” Endpoints

  • POST /requests:
    • Description: Send a request to an OpenAI model.
    • Body: { "text": string }
    • Response: { "response": string }
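
A hedged sketch of how this endpoint could be declared in api/src/controllers/request_controller.py follows; the model and function names below are assumptions, not the verbatim source.

    # api/src/controllers/request_controller.py (illustrative sketch)
    from fastapi import APIRouter
    from pydantic import BaseModel

    # Assumed service function that forwards the text to OpenAI.
    from src.services.request_service import process_request

    router = APIRouter()

    class RequestBody(BaseModel):
        text: str

    class RequestResponse(BaseModel):
        response: str

    @router.post("/requests", response_model=RequestResponse)
    async def create_request(body: RequestBody) -> RequestResponse:
        result = await process_request(body.text)
        return RequestResponse(response=result)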

πŸ”’ Authentication

  • For the MVP, authentication is optional. You can implement JWT authentication by following the steps in the auth.py file.
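
If you enable it, a FastAPI dependency along these lines could verify tokens with PyJWT (already listed among the dependencies); this is a sketch, not the contents of auth.py, and the secret should come from your settings rather than a literal.

    # Illustrative JWT verification dependency (sketch; not the real auth.py)
    import jwt  # PyJWT
    from fastapi import Depends, HTTPException
    from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

    security = HTTPBearer()
    JWT_SECRET_KEY = "change-me"  # in practice, load from settings / .env

    def get_current_user(
        credentials: HTTPAuthorizationCredentials = Depends(security),
    ) -> dict:
        try:
            # Decode and validate the bearer token sent by the client.
            return jwt.decode(credentials.credentials, JWT_SECRET_KEY, algorithms=["HS256"])
        except jwt.PyJWTError:
            raise HTTPException(status_code=401, detail="Invalid or expired token")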

πŸ“œ License & Attribution

πŸ“„ License

This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3 license.

πŸ€– AI-Generated MVP

This MVP was entirely generated using artificial intelligence through CosLynx.com.

No human was directly involved in the coding process of the AI-Powered-Request-Handler repository.

πŸ“ž Contact

For any questions or concerns regarding this AI-generated MVP, please contact CosLynx at:

🌐 CosLynx.com

Create Your Custom MVP in Minutes With CosLynxAI!