NotebookLlaMa🦙

A fluffy and open-source alternative to NotebookLM!

This project is aimed at producing a fully open-source, LlamaCloud-backed alternative to NotebookLM.

Get it up and running!

Get the GitHub repository:

git clone https://github.com/run-llama/notebooklm-clone

Install dependencies:

cd notebooklm-clone/
uv sync
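
If you do not have uv installed yet, you can get it first, for example via its official standalone installer (or with pip install uv):

curl -LsSf https://astral.sh/uv/install.sh | sh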

Add your API keys to the .env.example file.

Then rename the file to .env:

mv .env.example .env
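
The exact variable names are listed in .env.example; as a rough sketch, the entries look something like the following (the key names below are illustrative, use the ones from .env.example):

# Example only: check .env.example for the actual variable names
OPENAI_API_KEY="<your-openai-api-key>"
LLAMACLOUD_API_KEY="<your-llamacloud-api-key>"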

Next, run the following setup scripts:

uv run tools/create_llama_extract_agent.py
uv run tools/create_llama_cloud_index.py

And you're ready to start the app!

Run the MCP server:

uv run src/notebooklm_clone/server.py

Now, in a separate terminal, launch the Streamlit app:

streamlit run src/notebooklm_clone/Home.py

Important

You might need to install ffmpeg if you do not already have it.
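
ffmpeg is usually available from your system's package manager, for example:

brew install ffmpeg          # macOS (Homebrew)
sudo apt-get install ffmpeg  # Debian/Ubuntu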

And start exploring the app at http://localhost:8751/.

Contributing

Contribute to this project by following the contributing guidelines.

License

This project is provided under the MIT License.
