ikajdan/llem


LLEM

LLEM is a simple, self-contained web interface for interacting with a Large Language Model (LLM). The project is a Flask-based web application that lets users chat with a model, which answers questions or completes prompts based on their input. It works with any model supported by the Hugging Face Transformers library.




Demo of the interface.

Features

  • A web interface to interact with a pre-trained Large Language Model.
  • Simple prompt-based interactions where users can chat with the AI.
  • Powered by the Hugging Face transformers library for model handling.

Setup

You can run the application using Docker or manually. The manual setup has only been tested with Python 3.12.

Download the Model

The download_model.py script downloads the pre-trained model from Hugging Face and saves it locally in the model directory. By default, it fetches the SmolLM2-135M-Instruct model.

Make sure the required dependencies are installed and then run the script:

python download_model.py

Docker Setup

The easiest way to run the application is using Docker. This will handle all dependencies and environment setup.

  1. Download the model as described above.
  2. Build the image:
docker compose build
  3. Run the container:
docker compose up

The app will be accessible at http://localhost:5000.
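For reference, the Compose file driving these commands likely has roughly the following shape (a hypothetical sketch; the service name, port, and volume mount are assumptions, and the repository's own compose file is authoritative):

```yaml
services:
  llem:
    build: .
    ports:
      - "5000:5000"        # Flask's default port, per the README
    volumes:
      - ./model:/app/model # make the locally downloaded model available
```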

Manual Setup

If you prefer to run the application manually, follow the steps below.

  1. Set up a Python virtual environment:
python3 -m venv .venv
source .venv/bin/activate
  2. Install the required dependencies:
pip install -r app/requirements.txt
  3. Start the Flask web server:
python app/app.py

The app will be accessible at http://localhost:5000.

License

This project is licensed under the MIT License. See the LICENSE.md file for more information.

Background image: Inspiration Geometry.