LCORE-634: bump up Llama Stack version to 0.2.20 #547
tisnik wants to merge 2 commits into lightspeed-core:main
Conversation
Walkthrough
Bumps llama-stack and llama-stack-client from 0.2.19 to 0.2.20 in pyproject.toml, relaxes the openai version to a minimum constraint, and updates MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION to "0.2.20" in src/constants.py.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
Pre-merge checks: ✅ Passed checks (3 passed)
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
pyproject.toml (2 hunks)
src/constants.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000 ms (1). You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000 ms).
- GitHub Check: e2e_tests
🔇 Additional comments (2)
src/constants.py (1)
5-5: Confirm the 0.2.20 client is backward-compatible with servers 0.2.17–0.2.19, or raise MINIMAL_SUPPORTED_LLAMA_STACK_VERSION to 0.2.20.
Definitions: src/constants.py (lines 4–5). Enforcement: src/utils/llama_stack_version.py (compare_versions). Tests referencing these: tests/unit/utils/test_llama_stack_version.py.
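The range check the review refers to can be approximated with a short sketch. This is not the actual compare_versions from src/utils/llama_stack_version.py; the minimum value "0.2.17" is taken from the review comment, and the packaging library is an assumption about how versions are compared:

```python
from packaging.version import Version

# Values from this PR and the review comment; the real constants live in src/constants.py.
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.20"

def is_supported(detected: str) -> bool:
    """Return True when the detected Llama Stack version is inside the supported range."""
    v = Version(detected)
    return (
        Version(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= v
        <= Version(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )
```

Under this reading, bumping only the maximum to 0.2.20 keeps servers 0.2.17–0.2.19 accepted, which is exactly the backward-compatibility question the review raises.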
pyproject.toml (1)
31-32: Pinned llama-stack 0.2.20 not found upstream; verify and adjust.
pyproject.toml (lines 31–32) pins "llama-stack==0.2.20" and "llama-stack-client==0.2.20". GitHub releases show no v0.2.20; the latest public tag is v0.2.18 (Aug 20, 2025). Confirm whether 0.2.20 is a private/pre-release build or a typo. If it is unpublished, either pin to 0.2.18 or document the private source, and add a follow-up to review the 0.2.20 release notes and run the tests once it is published.
  # OpenAPI exporter
  "email-validator>=2.2.0",
- "openai==1.99.9",
+ "openai>=1.99.9",
Unbounded OpenAI dep can break builds; add an upper bound.
Use a safe range to avoid accidental 2.x upgrades and ensure reproducibility.
Apply:
- "openai>=1.99.9",
+ "openai>=1.99.9,<2.0.0",

📝 Committable suggestion
"openai>=1.99.9,<2.0.0",
🤖 Prompt for AI Agents
In pyproject.toml around line 46, the OpenAI dependency is unbounded on the upper side, which can allow accidental 2.x upgrades; update the requirement to include an explicit upper bound (for example, constrain it to be <2.0.0) so the spec reads something like openai>=1.99.9,<2.0.0, then regenerate/update your lockfile (uv.lock or equivalent) to ensure reproducible installs.
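To see why the unbounded constraint matters, a small sketch with the packaging library (an assumption for illustration; the project itself resolves dependencies via uv) shows that ">=1.99.9" admits a future 2.x release while the bounded range rejects it:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

unbounded = SpecifierSet(">=1.99.9")        # constraint as merged in this PR
bounded = SpecifierSet(">=1.99.9,<2.0.0")   # constraint suggested by the review

# A hypothetical future 2.0.0 release satisfies the unbounded spec...
print(Version("2.0.0") in unbounded)
# ...but is rejected by the bounded one, while current 1.x versions still pass.
print(Version("2.0.0") in bounded)
print(Version("1.99.9") in bounded)
```

The trade-off is standard: the lockfile already pins an exact version for reproducible installs, so the upper bound mainly protects fresh resolutions (and downstream consumers of the package metadata) from a breaking major release.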
Description
LCORE-634: bump up Llama Stack version to 0.2.20
Type of change
Related Tickets & Documents
Summary by CodeRabbit