InfiniQuery is an advanced AI contextual question-answering system that combines the power of Large Language Models (LLMs) with external data sources. Whether you're a researcher, student, or professional, InfiniQuery provides accurate and context-aware answers based on the latest research papers.
Features:
- Infinite Context: InfiniQuery can handle extensive context, making it ideal for complex questions that require deep understanding (see the chunking sketch after this list).
- Retrieval-Augmented Generation (RAG): Seamlessly integrates retrieval and generation models for precise responses.
- Stay Current: Leverages external data sources (PDFs, text files, etc.) to keep you informed with the latest research.
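Handling "infinite context" over long papers is commonly done by splitting each document into overlapping chunks before retrieval. The sketch below shows that general pattern only; it is an assumption about the approach rather than InfiniQuery's actual code, and `chunk_text` with its size/overlap parameters is purely illustrative.

```python
# General chunking pattern (illustrative; not necessarily InfiniQuery's implementation):
# split a long document into overlapping word windows so any passage can later
# be retrieved and fit within the LLM's context window.

def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split `text` into overlapping chunks of roughly `chunk_size` words."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
    return chunks

if __name__ == "__main__":
    sample = "Retrieval-augmented generation grounds LLM answers in external documents. " * 100
    chunks = chunk_text(sample)
    print(f"{len(chunks)} chunks; first chunk has {len(chunks[0].split())} words")
```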
Installation:
- Clone this repository to your local machine.
- Install the required dependencies using `pip install -r requirements.txt`.
Run the App:
- Execute `streamlit run app.py` in your terminal.
- Open your web browser and navigate to the provided URL (usually `http://localhost:8501`).
Usage:
- Select a research paper from the curated list or upload your own.
- Ask questions related to the paper's content.
- InfiniQuery will retrieve relevant information and generate context-aware answers (see the retrieval sketch after this list).
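To make the retrieve-then-generate step concrete, here is a minimal, dependency-free sketch. Plain keyword overlap stands in for whatever retrieval method the app actually uses (for example embeddings and a vector store), and `retrieve`/`build_prompt` are hypothetical helpers; the assembled prompt would then be sent to the configured LLM.

```python
# Minimal retrieve-then-generate sketch (assumed pattern, not InfiniQuery's actual code).

def retrieve(question: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Rank chunks by word overlap with the question and keep the top_k best."""
    q_words = set(question.lower().split())
    ranked = sorted(chunks, key=lambda c: len(q_words & set(c.lower().split())), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Combine retrieved passages and the user question into a single LLM prompt."""
    context = "\n\n".join(context_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

if __name__ == "__main__":
    chunks = [
        "Transformers process sequences with self-attention.",
        "Retrieval-augmented generation grounds answers in retrieved documents.",
        "Streamlit turns Python scripts into web apps.",
    ]
    question = "What grounds the answers in documents?"
    print(build_prompt(question, retrieve(question, chunks, top_k=1)))
```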
Project Structure:
- `app.py`: Main Streamlit application (a hypothetical skeleton is sketched below).
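For orientation, here is a hypothetical skeleton of how a Streamlit front end like `app.py` could be wired up. It is not the actual contents of `app.py`, and `answer_question` is a placeholder for the retrieval-augmented pipeline sketched above.

```python
# Hypothetical Streamlit skeleton (not the actual app.py).
import streamlit as st

def answer_question(document_text: str, question: str) -> str:
    # Placeholder: real code would chunk the document, retrieve relevant
    # passages, and call an LLM with the assembled prompt.
    return f"(demo) {len(document_text)} characters loaded; you asked: {question!r}"

st.title("InfiniQuery")

uploaded = st.file_uploader("Upload a research paper (PDF or text)", type=["pdf", "txt"])
question = st.text_input("Ask a question about the paper")

if uploaded is not None and question:
    # This sketch treats the upload as plain text; a PDF would first need
    # a text-extraction step.
    text = uploaded.read().decode("utf-8", errors="ignore")
    st.write(answer_question(text, question))
```

Launching a file like this with `streamlit run app.py` opens the upload-and-ask form described in the Run the App section.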
Contributing:
Contributions are welcome! If you have ideas for improvements or encounter any issues, feel free to submit a pull request or open an issue.
License:
This project is licensed under the GPL 3.0 License - see the LICENSE file for details.