
Bugbot changelog reminder #1441

Merged
willccbb merged 1 commit into main from cursor/bugbot-changelog-reminder-90f2
Dec 17, 2025
Conversation


@willccbb willccbb commented Dec 17, 2025

Introduces BUGBOT.md to enforce CHANGELOG.md updates for any PR modifying configuration structures or usage patterns, ensuring better documentation and maintainability.
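The rule itself lives in a Markdown instructions file rather than code, but its logic can be sketched as a plain function. This is a hypothetical illustration, not part of the PR: only the path patterns come from the description, and the function name is invented.

```python
# Hypothetical sketch of the Bugbot rule as a function over a PR's changed files.
# Path patterns are taken from the PR description; everything else is invented.
from fnmatch import fnmatch

CONFIG_PATTERNS = [
    "src/prime_rl/*/config.py",
    "src/prime_rl/rl.py",
    "src/prime_rl/utils/config.py",
]


def needs_changelog_entry(changed_files):
    """Return True when config files changed but CHANGELOG.md was not updated."""
    touched_config = any(
        fnmatch(path, pattern)
        for path in changed_files
        for pattern in CONFIG_PATTERNS
    )
    return touched_config and "CHANGELOG.md" not in changed_files
```

A review bot applying this check would request a changelog entry whenever the function returns True for the PR's file list.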


GitHub Issue: [Issue ID]
Linear Issue: Resolves [Issue ID]




Note

Adds .cursor/BUGBOT.md with rules requiring CHANGELOG.md updates when config structures or usage patterns change.

  • Docs/Tooling:
    • Add /.cursor/BUGBOT.md detailing changelog enforcement for config changes.
      • Applies to src/prime_rl/*/config.py, src/prime_rl/rl.py, src/prime_rl/utils/config.py.
      • Instructs Bugbot to request a CHANGELOG.md entry when such changes are present without one.
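The thread does not show the contents of `.cursor/BUGBOT.md`; a hypothetical sketch consistent with the bullets above might look like:

```markdown
# Bugbot rules (hypothetical sketch; actual file contents are not shown in this thread)

## Changelog enforcement

When a PR modifies configuration structures or usage patterns in:

- `src/prime_rl/*/config.py`
- `src/prime_rl/rl.py`
- `src/prime_rl/utils/config.py`

and does not also update `CHANGELOG.md`, request a changelog entry in the review.
```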

Written by Cursor Bugbot for commit 35a2a4c. This will update automatically on new commits.


cursor bot commented Dec 17, 2025

Cursor Agent can help with this pull request. Just @cursor in comments and I'll start working on changes in this branch.
Learn more about Cursor Agents

@cursor cursor bot force-pushed the cursor/bugbot-changelog-reminder-90f2 branch 2 times, most recently from 3c62006 to c3e28c1 on December 17, 2025 at 07:36
Co-authored-by: williambrown97 <williambrown97@gmail.com>
@cursor cursor bot force-pushed the cursor/bugbot-changelog-reminder-90f2 branch from c3e28c1 to 35a2a4c on December 17, 2025 at 07:37
@willccbb willccbb marked this pull request as ready for review December 17, 2025 07:38
@willccbb willccbb assigned willccbb and unassigned willccbb Dec 17, 2025
@willccbb willccbb merged commit 05165f6 into main Dec 17, 2025
10 checks passed
@willccbb willccbb mentioned this pull request Dec 17, 2025
mikasenghaas added a commit that referenced this pull request Dec 17, 2025
* Move LoRA out of experimental section (#1440)

* Add Bug Bot instructions for changelog enforcement (#1441)

Co-authored-by: Cursor Agent <cursoragent@cursor.com>

* duplicate chat completions endpoint into /generate

* serve chat with token in functionality

* use field to avoid misleading warning

* nicer error msg

* lock feature branch

* make use tokens prompt configurable

* use setter and print info

* bump

* include inference

* do not print warning log (logs all the time)

* bump

* bump + bring back warning log

* bump vf

* bump vf

* use dp=6 in wordle example

* no deepcopy and no warning

* do not tokenize on the server

* add field names so that tokens is cached and no warning of unrecognized field is shown

* bump vf

* auto install

* bump vf

* bump vf + set vllm tokenize method

* skip applying chat template

* Revert "skip applying chat template"

This reverts commit 43c6a2b.

* Revert "do not tokenize on the server"

This reverts commit 9182191.

* bring back log

* use route /v1/chat/completions/tokens

* fix log

* bump vf and make everything configurable

* bump and more informative log

* bump and make non-exact tokenization default

* use token prompts by default

* remove retokenization issue from docs

* rename class

* bump vf

* fix auto asc setup for lora

* bump vf

* bump vf

* bump vf

* bring back setter

* bump vf

* bump vf to latest prime-rl

* make custom routes v0.12.0 compatible

* monkey patch api server worker proc again to enable multi api server mode

---------

Co-authored-by: will brown <williambrown97@gmail.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
samsja pushed a commit that referenced this pull request Dec 18, 2025
* dont use enum setter for logprobs mode

* fix: stale imports

* update to torch 2.9

* init_app_state doesnt take vllm config anymore somehow

* use runnable because of CUDAGraphWrapper

* vllm now uses default seed 0

* fix import

* moe venv

* use mjun flash attn for torch 2.9 and up vllm version

* Revert "moe venv"

This reverts commit 8934ceb.

* remove some todos

* remove unused import

* Apply suggestions from code review

Signed-off-by: Jackmin801 <56836461+Jackmin801@users.noreply.github.com>

* set very high max cpu loras to patch around areal lora hack

* make flash attn optional and put uv sync extras everywhere

* Integrate token-in route with vLLM v0.12.0 (#1444)


---------

Signed-off-by: Jackmin801 <56836461+Jackmin801@users.noreply.github.com>
Co-authored-by: Mika Senghaas <mail@mikasenghaas.de>
Co-authored-by: will brown <williambrown97@gmail.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>