
Askimo

Askimo is a provider-agnostic AI toolkit that brings powerful AI capabilities to your command line and automation workflows.
From chatting with LLMs to automating DevOps pipelines, Askimo connects to the provider of your choice: OpenAI, X AI, Gemini, Anthropic, or local models via Ollama.

AI for your workflows, with the freedom to choose any provider. Learn more at askimo.chat.


✨ Why Askimo

  • Provider Freedom
    Switch between OpenAI, Gemini, X AI, Anthropic, or Ollama with the same commands. No vendor lock-in as the AI landscape evolves.

  • Automation-First Design
    Built for DevOps and automation workflows. Pipe files, logs, or command output into Askimo and let AI handle analysis, transformation, and decision-making.

  • RAG-Enabled Projects
    Create intelligent project workspaces with built-in vector search (pgvector). Your AI assistant knows your codebase, documentation, and project context.

  • Reusable Recipes
    Build and share parameterized AI workflows. Create templates for code reviews, log analysis, documentation generation, and more.

  • Dual Interface
    Choose your workflow: interactive chat for exploration, or non-interactive mode perfect for scripts, CI/CD pipelines, and automation.

  • Extensible Platform
    Add custom providers, commands, and integrations. Askimo grows with your team's needs.


🎬 Demo

Summarizing files, generating commit messages, and integrating with Git

Askimo Demo 1

Chatting with multiple AI providers

Askimo Demo 2


🧠 Core Capabilities

AI Chat

  • Interactive conversations with multiple AI providers

Knowledge Management

  • RAG-enabled projects with automatic document indexing
  • Vector search powered by PostgreSQL + pgvector
  • Project workspaces that give AI context about your codebase
  • Contextual responses based on your project's files and documentation
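
If you want to try the pgvector-backed search locally, one way to get a suitable database is to run PostgreSQL with the pgvector extension in Docker. This is only a sketch: the container name, password, and port below are placeholders, and how Askimo is pointed at the database is covered in the project documentation rather than assumed here.

# Run PostgreSQL with the pgvector extension locally (example values only)
docker run -d --name askimo-pg \
  -e POSTGRES_PASSWORD=change-me \
  -p 5432:5432 \
  pgvector/pgvector:pg16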

Automation & DevOps

  • Pipeline-friendly non-interactive mode for CI/CD integration
  • Recipe system for reusable, parameterized AI workflows
  • Log analysis and system monitoring with AI insights
  • Stdin/stdout support for seamless integration with existing tools
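
As a rough illustration of the pipeline-friendly mode, the snippet below feeds a build log into Askimo from a shell script and keeps the analysis as a report. The log and output file names are placeholders; only the documented -p flag is used.

# Hypothetical CI step: summarize a failing build log and keep the result as an artifact
cat build.log | askimo -p "Identify the likely root cause of this build failure and suggest a fix" > build-analysis.txt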

Platform Features

  • Provider-agnostic architecture (OpenAI, Gemini, X AI, Anthropic, Ollama)
  • Extensible plugin system for custom providers and commands
  • Configuration management with per-provider parameter tuning

⚙️ Quickstart

macOS / Linux (Homebrew)

brew tap haiphucnguyen/askimo
brew install askimo
askimo

Windows (Scoop)

scoop bucket add askimo https://github.com/haiphucnguyen/scoop-askimo
scoop install askimo
askimo

Other ways to install → Installation Guide

👉 Once installed, you can connect Askimo to providers like Ollama, OpenAI, Gemini, or X AI and start chatting.

📖 See Getting Started for tutorials on setting up Ollama, adding API keys (OpenAI, Gemini, X AI), switching providers, and running real workflow examples.
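
As a quick illustration, connecting to a hosted provider such as OpenAI could look like the following. The --set-provider and -p flags are shown in the CLI usage below; reading the key from an OPENAI_API_KEY environment variable is an assumption here, so check the Getting Started guide for the exact way Askimo expects API keys.

# Sketch only: the environment variable is an assumption, see the Getting Started guide
export OPENAI_API_KEY=your-key-here
askimo --set-provider openai
askimo -p "Say hello"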


💬 CLI Usage

Askimo supports two modes for running commands:

Interactive Mode

Start Askimo without arguments to enter interactive mode:

askimo
askimo> :help

Non-Interactive Mode

Run commands directly from the command line:

askimo --help
askimo --list-providers
askimo --set-provider openai
echo "function add(a, b) { return a + b; }" | askimo -p "Convert this to TypeScript"

Direct Chat (Non-Interactive)

Send a single message to AI without entering interactive mode:

askimo -p "Your prompt here"
askimo --prompt "Your prompt here"

With piped input:

# Analyze code from stdin
echo "function add(a, b) { return a + b; }" | askimo -p "Convert this to TypeScript"

# Process file contents
cat myfile.js | askimo --prompt "Explain this code"

# Analyze git changes
git diff | askimo -p "Summarize these changes"

# Process command output
ls -la | askimo -p "Explain these file permissions"
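
Building on the git example above, here is a sketch of the commit-message workflow shown in the demo: generate a message from the staged changes and pass it straight to git commit. This assumes the staged diff is small enough to send in one prompt.

# Sketch: draft a commit message from staged changes
git commit -m "$(git diff --cached | askimo -p 'Write a concise commit message for these changes')"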

🧩 Extending Askimo

Askimo is designed to be pluggable, so you can tailor it to your needs: add custom providers, commands, and integrations to fit your team's workflows.


🤝 Contributing

  • Fork & clone the repo
  • Create a feature branch
  • Open a PR