LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
📖 A blazing-fast tool for book translation, powered by a local LLM. Translates your books and documents with impressive quality using a unique two-stage approach.
A Dart client for the Ollama API, providing methods for chat, model management, embeddings, and more.
Ollama Local Docker - A simple Docker-based setup for running Ollama's API locally with a web-based UI. Easily deploy and interact with Llama models such as llama3.2 and llama3.2:1b on your local machine, in an efficient, containerized environment for testing and developing AI models with Ollama.
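A setup like the one described above can be sketched with a minimal Compose file. This is an illustrative fragment, not the repository's actual configuration: it assumes the official `ollama/ollama` image, the default API port 11434, and a named volume for model storage.

```yaml
services:
  ollama:
    image: ollama/ollama           # official Ollama image
    ports:
      - "11434:11434"              # default Ollama API port
    volumes:
      - ollama_data:/root/.ollama  # persist downloaded models across restarts

volumes:
  ollama_data:
```

With this running, `docker exec -it <container> ollama pull llama3.2` would fetch a model, and the API becomes reachable at `http://127.0.0.1:11434`.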
Ollama Chatbot - GitHub Pages Frontend. This project provides a static frontend for an Ollama-based chatbot, hosted on GitHub Pages. All AI processing happens locally on the user's machine via Ollama (http://127.0.0.1:11434). ⚠️ No AI runs on GitHub Pages; the owner is not responsible for usage, and no legal action can be taken.
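A frontend like this talks to the local Ollama server over its HTTP API. As a minimal sketch of that interaction (the endpoint and payload shape follow Ollama's `/api/generate` API; the function name and model are illustrative), the request a chatbot would send looks like:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # default local Ollama endpoint

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,     # e.g. "llama3.2", pulled locally beforehand
        "prompt": prompt,
        "stream": False,    # ask for a single JSON response, not a stream
    }
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request only works with a local Ollama instance running:
# with urllib.request.urlopen(build_generate_request("llama3.2", "Hi!")) as r:
#     print(json.loads(r.read())["response"])
```

Because the browser (or script) calls `127.0.0.1` directly, the hosted page itself never performs any inference.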
An intelligent AI assistant that learns from both your documents and past conversations to provide context-aware, accurate, and personalized responses.
PlanIt is your intelligent travel planning companion! Powered by state-of-the-art AI and web technologies, it provides a seamless chat-based interface to help users plan their trips, simplifying the entire process from selecting a destination to finding the best hotels and managing your budget.