
@joelpaulkoch (Member)

See A RAG Library for Elixir

We want to turn chatbot_ex into a RAG system that answers questions about Ecto. Out of the box, chatbot_ex knows nothing about Ecto.

In Step one, we ran the generator.
In Step two, we removed the LLM serving, as we will use Ollama to generate responses.
In Step three, we set everything up to ingest Ecto into our RAG system.
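To recall the idea behind step three, the ingestion can be sketched roughly like this: load Ecto's docs, split them into chunks, embed each chunk, and store chunk and embedding for later retrieval. This is an illustrative sketch, not the actual chatbot_ex code; `embed` and `store` are placeholders for the embedding serving and vector store configured in that step.

```elixir
# Illustrative ingestion sketch (not the actual chatbot_ex code):
# collect Ecto's markdown docs from the deps folder and split them
# into paragraph-sized chunks.
docs = Path.wildcard("deps/ecto/**/*.md")

chunks =
  docs
  |> Enum.flat_map(fn path ->
    path
    |> File.read!()
    |> String.split("\n\n", trim: true)
  end)

# Placeholder for the embedding serving configured in step three.
embed = fn _chunk -> List.duplicate(0.0, 384) end
# Placeholder for writing a chunk and its embedding to the vector store.
store = fn chunk, embedding -> {chunk, embedding} end

for chunk <- chunks do
  embedding = embed.(chunk)
  store.(chunk, embedding)
end
```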

LangChain Integration

Because we want an ongoing conversation about Ecto with the chatbot, we integrate our RAG system with LangChain.

To do so, we remove the last step of our generation pipeline, the generation of a response.
Instead, we feed the retrieved information into the next LangChain message using a small helper function.
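Such a helper might look like the following sketch. It is not the actual chatbot_ex implementation: the module name is made up, and we assume each retrieved result is a map with a `:document` key holding the chunk text, which gets prepended to the user's question before it enters the chat via `LangChain.Message.new_user!/1`.

```elixir
# Hypothetical helper: turn the retrieved chunks plus the user's
# question into a single LangChain user message.
defmodule Chatbot.RagHelper do
  alias LangChain.Message

  # `retrieved` is assumed to be a list of maps with a :document key,
  # as produced by the retrieval step of the RAG pipeline.
  def to_user_message(question, retrieved) do
    context =
      retrieved
      |> Enum.map(& &1.document)
      |> Enum.join("\n\n")

    Message.new_user!("""
    Context information:
    #{context}

    Question: #{question}
    """)
  end
end
```

The message can then be appended to the running `LLMChain`, so the LLM sees the retrieved context alongside the conversation history while the generation itself stays with Ollama.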

@joelpaulkoch added the labels "BLOG POST EXAMPLE see [A RAG Library for Elixir](https://bitcrowd.dev/a-rag-library-for-elixir#build-your-rag-system)" and "do not merge" on Mar 12, 2025.
