docs: add Ollama prerequisites to tutorials (#182)#213

Open
kalisingh2277 wants to merge 2 commits into mesa:main from kalisingh2277:docs/ollama-prerequisites-182

Conversation

@kalisingh2277

PR Description: Issue #182

Title

docs: add Ollama prerequisites to tutorials (#182)

Description

This PR addresses issue #182 where the introductory tutorials use Ollama as the default LLM provider but do not mention that it must be installed and running. This can lead to frustration for new users who encounter connection errors.

Changes

  • first_model.md: Added a "Prerequisites" section with installation and setup instructions for Ollama.
  • negotiation_model_tutorial.md: Added a "Prerequisites" section with a cross-reference to the main setup guide.
  • getting_started.md: Added an "LLM Backend Setup" section to distinguish between local (Ollama) and cloud (OpenAI/Gemini) providers.
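As a companion to the new Prerequisites sections, the connection errors the PR aims to prevent can be detected up front with a small probe of Ollama's default local endpoint. This is a hypothetical helper for illustration, not code from this PR; the endpoint `http://localhost:11434` is Ollama's documented default.

```python
import urllib.request
import urllib.error


def ollama_is_running(host: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at `host`.

    A running Ollama instance responds to GET / on its default port;
    a connection error means the server is not installed or not started.
    """
    try:
        with urllib.request.urlopen(host, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if not ollama_is_running():
    print("Ollama is not reachable. Install it and start the server "
          "(e.g. `ollama serve`) before running the tutorial.")
```

A check like this could fail fast with a clear message instead of the opaque connection errors described in issue #182.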

Verification

  • Ran ruff check and ruff format on documentation files.
  • Ran codespell to verify spelling in the new sections.

Related Issues

Closes #182

@coderabbitai
Contributor

coderabbitai bot commented Mar 14, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.


Weird formatting bug here:


## Prerequisites

Like the introductory tutorial, this guide uses **Ollama** as the default LLM provider.


Suggested change:
- Like the introductory tutorial, this guide uses **Ollama** as the default LLM provider.
+ The example uses **Ollama** as the default LLM provider.

There will be new tutorials added later, maybe this makes sense?


@IlamaranMagesh left a comment


I haven't looked in depth at whether the tutorials are in sync with the new Mesa versions. If that's been checked and the necessary changes made, it makes sense to update everything in one go.

… Mesa 4.x model.time

- Refined Ollama prerequisite wording and cross-references.
- Fixed visual formatting for reasoning output in the negotiation tutorial.
- Migrated core components (LLMAgent, Memory, Recorder) from model.steps to int(model.time) for Mesa 4.x compatibility.
- Updated the test suite (conftest.py and specific tests) to support int(model.time).
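The model.steps to int(model.time) migration mentioned above can be sketched as follows. This is an illustration under assumptions, not actual mesa-llm code: `FakeModel` is a minimal stand-in for a Mesa model whose `time` attribute advances as a float, and `current_tick` is a hypothetical helper name.

```python
class FakeModel:
    """Minimal stand-in for a Mesa model; only mimics the `time` attribute."""

    def __init__(self):
        self.time = 0.0  # Mesa 4.x exposes simulation time as a float

    def step(self):
        self.time += 1.0


def current_tick(model) -> int:
    # Before: components read an integer `model.steps` counter.
    # After:  derive the integer tick from `model.time` instead.
    return int(model.time)


m = FakeModel()
m.step()
m.step()
print(current_tick(m))  # 2
```

Truncating `model.time` to an int preserves the integer-tick behavior that components such as Memory and Recorder were built around.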
@kalisingh2277
Author

I haven't looked in depth at whether the tutorials are in sync with the new Mesa versions. If that's been checked and the necessary changes made, it makes sense to update everything in one go.

I have made the necessary changes. Kindly review them and let me know if there are any further enhancements required.


Development

Successfully merging this pull request may close these issues.

docs: tutorial should mention that Ollama must be running

2 participants