DocQuery: AI QA Agent that Answers Your Uploaded PDFs and Documents with LLM + RAG

This project is an AI-powered agent that uses a Large Language Model (LLM) with Retrieval-Augmented Generation (RAG) to answer questions about documents you upload. It grounds its responses in the content of those documents to provide accurate, context-aware answers. The application is containerized with Docker for easy deployment.

πŸ”§ Tech Stack

  • Python, Streamlit
  • LangChain, HuggingFace Transformers
  • ChromaDB (stubbed)
  • Docker
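The core of the RAG approach is the retrieval step: the question is compared against indexed document chunks, and the best matches are passed to the LLM as context. The project uses LangChain with ChromaDB for this; the sketch below is only a toy illustration of the idea, using bag-of-words cosine similarity instead of real embeddings (all names and the sample chunks are hypothetical):

```python
# Toy sketch of RAG retrieval (illustrative only; the project itself
# uses LangChain + ChromaDB with learned embeddings).
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most similar to the question."""
    q = Counter(question.lower().split())
    ranked = sorted(chunks,
                    key=lambda c: cosine(q, Counter(c.lower().split())),
                    reverse=True)
    return ranked[:k]

chunks = [
    "Invoices are processed within 30 days of receipt.",
    "The warranty covers manufacturing defects for two years.",
]
print(retrieve("How long does the warranty last?", chunks))
```

In the real pipeline, the retrieved chunk would be inserted into the LLM prompt so the model answers from the document rather than from its training data alone.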

πŸš€ How to Run

```shell
git clone <this-repo>
cd llm-rag-faq-bot
pip install -r requirements.txt
uvicorn app.main:app --reload
# Then in a new terminal
streamlit run ui/streamlit_app.py
```

🐳 Docker

```shell
docker build -t llm-rag-faq .
docker run -p 8000:8000 llm-rag-faq
```
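For reference, an image like this is typically built from a Dockerfile along the following lines. This is a hypothetical sketch, not the repository's actual Dockerfile; the base image, paths, and port are assumptions inferred from the run commands above:

```dockerfile
# Hypothetical Dockerfile sketch; the repository ships its own.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```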

πŸ’¬ Ask Questions

Use the Streamlit UI to upload documents and ask questions. The bot will analyze the uploaded documents and provide answers based on their content.
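Before an uploaded document can be searched, it is typically split into overlapping chunks so that sentences crossing a boundary still appear intact in at least one chunk. A minimal sketch of that preprocessing step (the chunk size and overlap here are illustrative, not the project's actual settings):

```python
def chunk_text(text: str, size: int = 100, overlap: int = 20) -> list[str]:
    """Split text into fixed-size character chunks; consecutive chunks
    share `overlap` characters so no sentence is lost at a boundary."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
    return chunks

doc = "word " * 100  # 500 characters of dummy text
pieces = chunk_text(doc, size=100, overlap=20)
print(len(pieces), len(pieces[0]))
```

Each chunk is then embedded and stored in the vector database, where the retrieval step can match it against incoming questions.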
