Kolosal AI is an open-source desktop application designed to simplify the training and inference of large language models on your own device. It supports any CPU with AVX2 instructions and also works with AMD and NVIDIA GPUs. Built to be lightweight (only ~20 MB compiled), Kolosal AI runs smoothly on most edge devices, enabling on-premise or on-edge AI solutions without heavy cloud dependencies.
- License: Apache 2.0
- Developer: Genta Technology
- Community: Join our Discord
**Universal Hardware Support**
- AVX2-enabled CPUs
- AMD and NVIDIA GPUs
**Lightweight & Portable**
- Compiled size ~20 MB
- Ideal for edge devices like Raspberry Pi or low-power machines
**Wide Model Compatibility**
- Supports popular models like Mistral, LLaMA, Qwen, and many more
- Powered by the Genta Personal Engine built on top of Llama.cpp
**Easy Dataset Generation & Training**
- Build custom datasets with minimal overhead
- Train models using UnsLOTH or other frameworks
- Deploy locally or as a server in just a few steps
**On-Premise & On-Edge Focus**
- Keeps data private on your own infrastructure
- Lowers costs by avoiding expensive cloud-based solutions
- Local AI Inference: Quickly run LLMs on your personal laptop or desktop for offline or on-premise scenarios.
- Edge Deployment: Bring large language models to devices with limited resources, ensuring minimal latency and improved privacy.
- Custom Model Training: Simplify the process of data preparation and model training without relying on cloud hardware.
We are a small team of students passionate about addressing key concerns in AI, such as energy efficiency, privacy, and on-premise and on-edge computing. Our flagship product is the Genta Inference Engine, which allows enterprises to deploy open-source models on their own servers with 3-4x higher throughput. This can reduce operational costs by up to 80%, since a single server optimized by our engine can handle the workload of four standard servers.
- Clone the Repository: https://github.com/Genta-Technology/Kolosal
- Join the Community: Ask questions, propose features, and discuss development on our Discord.
- Contribute: We welcome pull requests, bug reports, feature requests, and any kind of feedback to improve Kolosal AI.
- Project Overview
- Directory Structure
- Prerequisites
- Cloning and Preparing the Repository
- Configuring the Project with CMake
- Building the Application
- Running the Application
- Troubleshooting
## Project Overview

- Name: Kolosal AI (desktop application target is `KolosalDesktop`)
- Language Standard: C++17
- Build System: CMake (version 3.14 or higher)
- Dependencies (automatically handled by the provided `CMakeLists.txt`, if placed in the correct directories):
  - OpenGL
  - OpenSSL
  - CURL
  - GLAD
  - Native File Dialog Extended
  - genta-personal engine libraries (`InferenceEngineLib`, `InferenceEngineLibVulkan`)
  - ImGui (provided in `external/imgui`)
  - Other external libraries: `stb`, `nlohmann/json`, `icons`, etc.
## Directory Structure

A simplified look at the important folders/files:

```
KolosalAI/
├─ cmake/
│  └─ ucm.cmake                  # Utility script for static runtime linking
├─ external/
│  ├─ curl/                      # Pre-built or source for cURL
│  ├─ glad/                      # GLAD loader
│  ├─ genta-personal/            # genta-personal engine includes/libs
│  ├─ imgui/                     # ImGui source
│  ├─ nativefiledialog-extended/ # Native File Dialog Extended
│  ├─ nlohmann/                  # JSON library
│  ├─ stb/                       # stb (single-file) headers
│  └─ fonts/                     # TrueType fonts
├─ assets/
│  ├─ logo.png
│  └─ resource.rc                # Windows resource file
├─ source/
│  └─ main.cpp                   # Entry point for KolosalDesktop
├─ include/
│  └─ ...                        # Additional headers
├─ models/
│  └─ ...                        # model.json configuration files used by the
│                                # inference engine to download, save, and load models
├─ CMakeLists.txt
├─ README.md                     # You are here!
└─ ...
```
## Prerequisites

- CMake 3.14 or above
  - Download from https://cmake.org/download/
- A C++17-compatible compiler
  - On Windows: Visual Studio 2019/2022 (MSVC) or MinGW-w64 with GCC 7+
  - On other platforms: an equivalent compiler supporting C++17
- Git (optional, but recommended for cloning and submodule management)
- OpenSSL and CURL
  - On Windows, you can place the pre-built binaries/headers inside `external/openssl` and `external/curl` (or anywhere you prefer; just ensure `CMakeLists.txt` sees them)
- (Optional) Vulkan SDK if you plan to use the Vulkan-based inference engine.
## Cloning and Preparing the Repository

- **Clone the repository:**

  ```shell
  git clone https://github.com/<your_username_or_org>/KolosalAI.git
  cd KolosalAI
  ```

- **(Optional) Update submodules:** If any external libraries are handled as Git submodules, initialize them:

  ```shell
  git submodule update --init --recursive
  ```

- **Check external dependencies:** Ensure the `external` folder contains:
  - `curl` with `include/`, `lib/`, and `bin/` (Windows)
  - `openssl`, or OpenSSL installed system-wide
  - the `genta-personal` engine, if not fetched from elsewhere

- **Verify the folder structure:** Check that folders like `nativefiledialog-extended`, `imgui`, etc., are present inside `external/`.
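The dependency checks in the last two steps can be scripted. Below is a minimal sketch (a convenience helper, not part of the repository) that reports which of the expected folders are present; the folder names come from this README's directory layout, so adjust the list if your checkout differs:

```shell
#!/bin/sh
# Report which expected dependency folders are present under external/.
# Folder names are taken from this README's directory structure; this helper
# is illustrative, not part of the Kolosal AI repository.
check_external_deps() {
  for dep in curl glad genta-personal imgui nativefiledialog-extended nlohmann stb fonts; do
    if [ -d "external/$dep" ]; then
      echo "found:   external/$dep"
    else
      echo "missing: external/$dep"
    fi
  done
}

check_external_deps
```

Run it from the repository root; any `missing:` line points at a dependency you still need to fetch or copy into `external/`.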
## Configuring the Project with CMake

You can perform either an in-source or out-of-source build, but out-of-source is recommended. Below is an example of an out-of-source build:
- **Create a build folder:**

  ```shell
  mkdir build
  cd build
  ```

- **Run CMake:** By default, this will generate build files for your platform's default generator (e.g., Visual Studio solution files on Windows, Makefiles on Linux):

  ```shell
  cmake -S .. -B . -DCMAKE_BUILD_TYPE=Release
  ```

  or explicitly (for Visual Studio multi-config):

  ```shell
  cmake -S .. -B . -G "Visual Studio 17 2022" -A x64
  ```

  `-DDEBUG=ON` can be added if you want to build a debug version:

  ```shell
  cmake -S .. -B . -DCMAKE_BUILD_TYPE=Debug -DDEBUG=ON
  ```

- **Check for errors during configuration**, such as missing libraries or headers. Resolve them by installing or copying the required dependencies into the correct locations.
## Building the Application

After successful configuration:

- **On Windows with Visual Studio:** Open the generated `.sln` file inside `build/` and build the solution, or build from the command line:

  ```shell
  cmake --build . --config Release
  ```

- **On other platforms (e.g., using Make or Ninja):**

  ```shell
  cmake --build . --config Release
  ```

Note: the `POST_BUILD` commands in `CMakeLists.txt` will copy the necessary DLLs, fonts, assets, and models into the final output folder (e.g., `build/Release/` or `build/Debug/`, depending on your generator).
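For readers unfamiliar with how such copying is wired up, here is a hedged CMake sketch of a `POST_BUILD` step; the target name matches this README, but the exact commands and paths are assumptions, not the project's actual `CMakeLists.txt`:

```cmake
# Illustrative only: copy runtime folders next to the built executable.
# The repository's own CMakeLists.txt defines the real POST_BUILD steps;
# directory names below are assumptions based on this README's layout.
add_custom_command(TARGET KolosalDesktop POST_BUILD
  COMMAND ${CMAKE_COMMAND} -E copy_directory
          "${CMAKE_SOURCE_DIR}/assets"
          "$<TARGET_FILE_DIR:KolosalDesktop>/assets"
  COMMAND ${CMAKE_COMMAND} -E copy_directory
          "${CMAKE_SOURCE_DIR}/external/fonts"
          "$<TARGET_FILE_DIR:KolosalDesktop>/fonts"
  COMMENT "Copying runtime assets next to the executable")
```

The `$<TARGET_FILE_DIR:...>` generator expression resolves to the correct per-config output directory (`Release/`, `Debug/`, etc.) on multi-config generators such as Visual Studio.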
## Running the Application

- **Locate the output:** Once the build completes, you should find the executable (e.g., `KolosalDesktop.exe` on Windows) in a directory such as:
  - `build/Release/` (Visual Studio)
  - `build/` (single-config generators like Make)

- **Check for required files:** The post-build commands should have copied:
  - fonts (`fonts/` folder next to the exe)
  - assets (`assets/` folder next to the exe)
  - models (`models/` folder next to the exe)
  - OpenSSL and InferenceEngine DLLs (Windows)
  - cURL DLL(s) (Windows)

  Make sure these folders and files are present in the same directory as `KolosalDesktop.exe`.

- **Double-click the executable, or run it from a terminal:**

  ```shell
  cd build/Release
  ./KolosalDesktop.exe
  ```

- Enjoy Kolosal AI!
## Troubleshooting

- **OpenSSL or CURL not found**
  - Make sure you have them installed, or placed in `external/openssl` and `external/curl` respectively.
  - Check environment variables like `OPENSSL_ROOT_DIR` or `CURL_ROOT_DIR` if needed.
  - Update `CMAKE_PREFIX_PATH` if you're placing these libraries somewhere non-standard.
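As an illustration, non-standard install locations can be surfaced to CMake either on the configure command line (e.g. `cmake -S .. -B . -DCMAKE_PREFIX_PATH="C:/deps/openssl;C:/deps/curl"`) or early in `CMakeLists.txt`; the paths below are placeholders, not the project's defaults:

```cmake
# Placeholder paths: tell CMake where to search for locally unpacked dependencies.
list(APPEND CMAKE_PREFIX_PATH
  "${CMAKE_SOURCE_DIR}/external/openssl"
  "${CMAKE_SOURCE_DIR}/external/curl")
```

Entries in `CMAKE_PREFIX_PATH` are consulted by `find_package`, `find_library`, and `find_path`, so this one variable usually covers both OpenSSL and CURL lookups.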
- **InferenceEngine libraries not found**
  - Verify that the path `external/genta-personal/lib` actually contains `InferenceEngineLib.lib` or `InferenceEngineLibVulkan.lib` (on Windows).
  - Adjust the `find_library` paths in `CMakeLists.txt` if your structure differs.
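For reference, such a lookup might look like the following sketch; the variable names are assumptions for illustration, and the repository's actual `CMakeLists.txt` may differ:

```cmake
# Illustrative find_library lookup for the genta-personal engine libraries.
# Adjust PATHS if your checkout keeps the libraries elsewhere.
find_library(INFERENCE_ENGINE_LIB
  NAMES InferenceEngineLib
  PATHS "${CMAKE_SOURCE_DIR}/external/genta-personal/lib")
find_library(INFERENCE_ENGINE_LIB_VULKAN
  NAMES InferenceEngineLibVulkan
  PATHS "${CMAKE_SOURCE_DIR}/external/genta-personal/lib")

if(NOT INFERENCE_ENGINE_LIB)
  message(FATAL_ERROR
    "InferenceEngineLib not found under external/genta-personal/lib")
endif()
```

If your directory structure differs, changing the `PATHS` argument is usually all that is needed.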
- **Missing Vulkan SDK**
  - If you plan to use the Vulkan-based inference engine, ensure the Vulkan SDK is installed and available in your PATH, or that CMake can find it.
- **ImGui not found**
  - Ensure the `external/imgui` folder is not empty.
  - If you see compilation errors referencing ImGui headers, check that `target_include_directories` in `CMakeLists.txt` still points to the correct path.
- **Resource or icon issues on non-Windows**
  - The `assets/resource.rc` file is Windows-specific. For Linux/macOS builds, you can comment out or remove references to the `.rc` file if they cause issues.
- **Runtime errors due to missing DLLs or dynamic libraries**
  - Confirm that the post-build step copies all required `.dll` files next to the executable.
  - On Linux/macOS, ensure `.so`/`.dylib` files are in the library search path or in the same folder as the executable.