Smart AI Assistant

This is a unified task-routing AI assistant built using Streamlit and powered by the phi model running locally with Ollama. It supports three intelligent task modes:

  • Q&A
  • Summarization
  • Roleplay (Doctor, Lawyer, Customer Support Agent)

Each task is routed dynamically to an appropriate prompt under the hood. The app features a clean UI and clears the input box automatically after each message.
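
A minimal sketch of what this routing might look like under the hood; the function and template names are illustrative assumptions rather than the app's actual code, and the request goes to Ollama's standard /api/generate endpoint:

    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

    # Illustrative prompt templates, one per task mode (names are assumptions)
    PROMPT_TEMPLATES = {
        "Q&A": "Answer the following question concisely.\nQuestion: {user_input}\nAnswer:",
        "Summarization": "Summarize the following article in bullet points.\nArticle: {user_input}\nSummary:",
    }

    def route_and_generate(task: str, user_input: str) -> str:
        """Pick the prompt template for the chosen task and query the local phi model."""
        prompt = PROMPT_TEMPLATES[task].format(user_input=user_input)
        response = requests.post(
            OLLAMA_URL,
            json={"model": "phi", "prompt": prompt, "stream": False},
            timeout=120,
        )
        response.raise_for_status()
        return response.json()["response"]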


Features

  • Clean, minimal interface
  • Local LLM integration using Ollama
  • Task-aware prompt selection (Q&A, Summarization, Roleplay)
  • Role-based dynamic prompts for Roleplay mode
  • Chat history display with "You" and "AI Assistant" names
  • Input box clears automatically after sending
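
The last two features could be implemented roughly as in the sketch below, which assumes a Streamlit form with clear_on_submit and a session-state message list; it is not necessarily the app's exact code:

    import streamlit as st

    # Keep the conversation in session state so it survives Streamlit reruns
    if "history" not in st.session_state:
        st.session_state.history = []  # list of (speaker, text) tuples

    # A form with clear_on_submit=True empties the text box after sending
    with st.form("chat_form", clear_on_submit=True):
        user_input = st.text_input("Your message")
        sent = st.form_submit_button("Send")

    if sent and user_input:
        st.session_state.history.append(("You", user_input))
        reply = "..."  # call the local phi model here (see the routing sketch above)
        st.session_state.history.append(("AI Assistant", reply))

    # Render the chat history with "You" / "AI Assistant" labels
    for speaker, text in st.session_state.history:
        st.markdown(f"**{speaker}:** {text}")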

Requirements

  • Python 3.8+
  • Ollama (running the phi model locally)
  • Required Python libraries:
    pip install streamlit requests

Supported Tasks

1. Q&A

Ask concise, factual questions. The model will generate brief answers.

Prompt Structure:

Answer the following question concisely.
Question: {user_input}
Answer:

2. Summarization

Paste an article, paragraph, or notes. The assistant will summarize it in bullet points.

Prompt Structure:

Summarize the following article in bullet points.
Article: {user_input}
Summary:

3. Roleplay

Choose a role and talk to the assistant as if you were speaking to one of the following:

  • Doctor
  • Lawyer
  • Customer Support Agent

Prompt Structure Example (Doctor):

You are an experienced medical doctor specialized in general medicine.
The user asks: '{user_input}'
Doctor:
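
The Lawyer and Customer Support Agent prompts follow the same pattern. Below is a hedged sketch of how the role-based prompts might be selected; only the Doctor wording comes from this README, the other personas are assumptions:

    # Illustrative role personas; only the Doctor wording is taken from the README
    ROLE_PROMPTS = {
        "Doctor": "You are an experienced medical doctor specialized in general medicine.",
        "Lawyer": "You are an experienced lawyer giving general legal information.",
        "Customer Support Agent": "You are a friendly customer support agent.",
    }

    def build_roleplay_prompt(role: str, user_input: str) -> str:
        """Combine the selected role persona with the user's message."""
        return f"{ROLE_PROMPTS[role]}\nThe user asks: '{user_input}'\n{role}:"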

How to Run

  1. Ensure Ollama is installed and running.

  2. Pull the phi model:

     ollama pull phi

  3. Start the app:

     streamlit run app.py

  4. Open in your browser: http://localhost:8501/


Preview

  • Screenshot: UI with the Q&A option selected
  • Screenshot: role selector shown when Roleplay is chosen
  • Screenshot: demo chat with the AI model


Powered by

  • Streamlit
  • Ollama
  • phi Language Model

Project Structure

Task-Based-AI-Assistant/
├── task_based_ai_assistant.py      # Main Streamlit app
├── images/                         # Folder containing preview images
│   ├── img1.png
│   ├── img2.png
│   └── img3.png
├── README.md                       # README describing the app
└── image1.png                      # AI bot image used in the UI

📌 Notes

  • Make sure Ollama is running locally and the phi model is available via its API (http://localhost:11434).
  • Internet is not required once the model is downloaded.
  • Easily extendable to include more tasks or roles.
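
For instance, a new role or task could be added by registering one more template in the dictionaries from the sketches above; the entries below are hypothetical:

    # Hypothetical extension: add a "Teacher" role by registering one more persona
    ROLE_PROMPTS["Teacher"] = (
        "You are a patient teacher who explains concepts in simple terms."
    )

    # A new task mode would follow the same pattern with PROMPT_TEMPLATES
    PROMPT_TEMPLATES["Translation"] = (
        "Translate the following text into English.\nText: {user_input}\nTranslation:"
    )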
