Conversation

@mo374z mo374z commented Mar 4, 2025

No description provided.

mo374z commented Mar 4, 2025

I think the llm_test_run.py script should be moved to CAPO in the end to make it reproducible. I don't know if we also want to keep it here, since it provides a nice way of testing the LLMs used, which could also be helpful for other users of promptolution.

@finitearth (Owner)


I'd vote for moving it to CAPO, as this script is not applying any prompt optimization techniques

@mo374z mo374z merged commit b6440c7 into dev Mar 5, 2025
1 check passed
@mo374z mo374z deleted the feature/vllm branch March 5, 2025 22:30


4 participants