Labels: enhancement (New feature or request)
Description
🚀 Describe the new functionality needed
Add support for the presence_penalty parameter in the /responses API endpoint (both request and response).
This parameter penalizes new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
OpenAPI property paths:
- POST.requestBody.content.application/json.properties.presence_penalty
- POST.responses.200.content.application/json.properties.presence_penalty
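A minimal sketch of what a request body using the new parameter might look like once support lands. The endpoint shape, model name, and helper function here are illustrative assumptions, not the actual Llama Stack implementation; the [-2.0, 2.0] range follows OpenAI's documented bounds for `presence_penalty`.

```python
import json

def build_responses_payload(model, user_input, presence_penalty=0.0):
    """Build a hypothetical /responses request body with presence_penalty.

    OpenAI documents presence_penalty as a float in [-2.0, 2.0]; positive
    values penalize tokens already present in the text so far, nudging
    the model toward new topics.
    """
    if not -2.0 <= presence_penalty <= 2.0:
        raise ValueError("presence_penalty must be between -2.0 and 2.0")
    return {
        "model": model,          # placeholder model name
        "input": user_input,
        "presence_penalty": presence_penalty,
    }

# Example: encourage topic diversity in a brainstorming request
payload = build_responses_payload("llama-3.1-8b", "Brainstorm blog topics", 0.6)
print(json.dumps(payload, indent=2))
```

Per the property paths above, the same field would also be echoed back in the 200 response body.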
💡 Why is this needed? What if we don't build it?
OpenAI API conformance is crucial for Llama Stack adoption. The presence_penalty parameter is commonly used to encourage topic diversity.
Without this parameter:
- Users cannot control topic diversity when using the Responses API
- Applications migrating from OpenAI may require code changes to drop or work around the parameter
- The Responses API conformance score (currently 80.9%) will remain lower than it could be
Other thoughts
Reference: OpenResponses OpenAPI Spec