docs: add Ollama prerequisites to tutorials (#182)#213
kalisingh2277 wants to merge 2 commits into mesa:main
Conversation
| ## Prerequisites |
|  |
| Like the introductory tutorial, this guide uses **Ollama** as the default LLM provider. |

Suggested change:
- Like the introductory tutorial, this guide uses **Ollama** as the default LLM provider.
+ The example uses **Ollama** as the default LLM provider.

There will be new tutorials added later, maybe this makes sense?
IlamaranMagesh left a comment:

I haven't checked in depth whether the tutorials are in sync with the newer Mesa versions. If that has been verified and the necessary changes made, it makes sense to update everything in one go.
… Mesa 4.x model.time

- Refined Ollama prerequisite wording and cross-references.
- Fixed visual formatting for reasoning output in the negotiation tutorial.
- Migrated core components (LLMAgent, Memory, Recorder) from model.steps to int(model.time) for Mesa 4.x compatibility.
- Updated the test suite (conftest.py and specific tests) to support int(model.time).
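The `model.steps` to `int(model.time)` migration described in the commit message can be sketched as follows. Note this is an illustrative stand-in, not the project's actual `Recorder` class, and it mocks the model rather than importing Mesa:

```python
class Recorder:
    """Simplified stand-in for a component that logs events per step.

    Mesa 4.x exposes a float ``model.time`` clock rather than the integer
    ``model.steps`` counter, so the step index is recovered with int().
    """

    def __init__(self):
        self.events = {}

    def record(self, model, event):
        step = int(model.time)  # was: model.steps in earlier Mesa versions
        self.events.setdefault(step, []).append(event)


class FakeModel:
    """Minimal mock exposing a Mesa 4.x-style float clock."""
    time = 2.0


recorder = Recorder()
recorder.record(FakeModel(), "offer made")
print(recorder.events)  # {2: ['offer made']}
```

Truncating with `int()` keeps existing integer-keyed bookkeeping working even when the scheduler advances time in fractional increments.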
I have made the necessary changes. Kindly review them and let me know if there are any further enhancements required.
PR Description: Issue #182
Title
docs: add Ollama prerequisites to tutorials (#182)
Description
This PR addresses issue #182 where the introductory tutorials use Ollama as the default LLM provider but do not mention that it must be installed and running. This can lead to frustration for new users who encounter connection errors.
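Those connection errors can be surfaced early with a quick reachability check. This is a minimal sketch assuming Ollama's default local endpoint (`http://localhost:11434`), which answers a plain GET when the server is running; the function name is illustrative, not part of the tutorials:

```python
import urllib.request
import urllib.error


def ollama_running(url="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server answers at `url`, else False."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if not ollama_running():
    print("Ollama is not reachable; start it with `ollama serve` and retry.")
```

A check like this lets a tutorial fail with a clear message instead of a raw connection traceback.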
Changes
Verification
Ran `ruff check` and `ruff format` on documentation files.
Ran `codespell` to verify spelling in the new sections.

Related Issues
Closes #182