# Enable QA on websites using LangChain
Python version: 3.10.9
- Update the `.env` file with the required assistant type. Currently, `hf` and `openai` are supported.
- Build the knowledgebase first by specifying the URLs to scrape in the `build_knowledgebase.py` script and setting the required environment variables in the `.env` file so that the OpenAI models can be used for embedding generation (refer to `.env.template`). A rough sketch of this step is shown after this list.
- Run `docker compose up -d` to spin up the container.
- Finally, open a browser and go to `http://localhost:8501` to access the Streamlit UI.
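As a rough illustration of what the knowledgebase build step does, the sketch below scrapes the configured URLs, splits the pages into chunks, embeds them with OpenAI embeddings, and saves a local FAISS index. The imports, the FAISS store, the chunking parameters, and the `faiss_index` path are assumptions for illustration; the actual `build_knowledgebase.py` may use different loaders, splitters, or a different vector store.

```python
# Hypothetical sketch of the knowledgebase build step; the real
# build_knowledgebase.py may differ in loaders, splitter settings,
# and the vector store it persists.
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# URLs to scrape are hard-coded in the script (example URLs only).
URLS = ["https://example.com/docs", "https://example.com/faq"]

def build_knowledgebase() -> None:
    # Download and parse the pages into LangChain documents.
    docs = WebBaseLoader(URLS).load()

    # Split long pages into overlapping chunks so they fit the model context.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
    chunks = splitter.split_documents(docs)

    # Embed the chunks (OPENAI_API_KEY must be set, e.g. via .env)
    # and persist a local index for the app to query later.
    store = FAISS.from_documents(chunks, OpenAIEmbeddings())
    store.save_local("faiss_index")

if __name__ == "__main__":
    build_knowledgebase()
```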
To run without containers, build the knowledgebase as explained above, run `streamlit run ./app.py`, and access the UI at `http://localhost:8501`.
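For reference, the core question-answering flow that the Streamlit app presumably wraps might look like the sketch below: load the persisted index, turn it into a retriever, and answer queries with a RetrievalQA chain. The index path, model name, `stuff` chain type, and retriever settings are assumptions, not taken from `app.py`.

```python
# Minimal sketch of the QA flow behind the Streamlit UI; names and
# parameters here are illustrative, not copied from app.py.
from langchain.chains import RetrievalQA
from langchain_community.vectorstores import FAISS
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Load the FAISS index produced by the build step (path is an assumption).
store = FAISS.load_local(
    "faiss_index",
    OpenAIEmbeddings(),
    allow_dangerous_deserialization=True,  # required by recent versions for pickle indexes
)

# Turn the store into a retriever and wire it to an LLM.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",  # see the note on map_reduce further below
    retriever=store.as_retriever(search_kwargs={"k": 4}),
)

print(qa.invoke({"query": "What does the site say about pricing?"})["result"])
```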
⚠️ Using an OpenAI API key will either incur charges or consume your API credits. Refer to the OpenAI website to inquire about the charges and rate limits associated with their API.

- OpenAI API keys sometimes do not get validated properly. The issue is being investigated.
- HuggingFace models often fail to handle the context length even when `map_reduce` document retrieval is used. The issue is being investigated. (A sketch of what `map_reduce` changes is shown below.)
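For context on the `map_reduce` mention above: that strategy queries the model once per retrieved chunk and then combines the partial answers, so no single prompt has to hold every chunk at once. The sketch below shows how such a chain might be configured with a HuggingFace model; the model name, index path, and parameters are illustrative assumptions, and whether this avoids context-length errors depends on the chunk sizes and the chosen model.

```python
# Illustrative only: configuring the map_reduce combine-documents strategy,
# which queries the LLM once per retrieved chunk and then reduces the
# partial answers, instead of stuffing every chunk into a single prompt.
from langchain.chains import RetrievalQA
from langchain_community.llms import HuggingFacePipeline
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

# Reload the index built earlier (path and embeddings are assumptions).
store = FAISS.load_local("faiss_index", OpenAIEmbeddings(),
                         allow_dangerous_deserialization=True)

# A small local HuggingFace model; the model name is an example, not the
# repository's actual choice.
llm = HuggingFacePipeline.from_model_id(
    model_id="google/flan-t5-base",
    task="text2text-generation",
    pipeline_kwargs={"max_new_tokens": 256},
)

qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="map_reduce",  # per-chunk "map" calls, then a final "reduce" call
    retriever=store.as_retriever(),
)
```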