Insights: openai/openai-agents-python
Overview
1 Release published by 1 person
- v0.0.14, published Apr 30, 2025
2 Pull requests merged by 1 person
- 0.0.14 release (#635), merged Apr 30, 2025
- Update litellm version (#626), merged Apr 29, 2025
10 Pull requests opened by 9 people
- feat: use stream api only (#629), opened Apr 29, 2025
- [MCP][Utils] Add support for FastMCP processing (#631), opened Apr 30, 2025
- feat: patch prompt for models only support json-mode (#633), opened Apr 30, 2025
- feat: Implement get_tool_call_output method in RunResultBase and update doc (#637), opened May 1, 2025
- feat: pass extra_body through to LiteLLM acompletion (#638), opened May 1, 2025
- fix: add ensure_ascii=False to json.dumps for correct Unicode output (#639), opened May 2, 2025 (see the sketch after this list)
- feat: Streamable HTTP support (#643), opened May 2, 2025
- [fix] use openai model provider as default (#644), opened May 3, 2025
- Examples: Fixed agent_patterns/streaming guardrails (#648), opened May 5, 2025
- Fix typos in documentation and event naming across multiple files (#651), opened May 6, 2025
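The title of #639 refers to a standard-library detail worth spelling out: by default `json.dumps` escapes every non-ASCII character to `\uXXXX` sequences, so Japanese, Chinese, or emoji text becomes unreadable in the serialized output, while `ensure_ascii=False` keeps the original characters. A minimal sketch of that plain-Python behavior (not the SDK's internal call site, which this listing does not show):

```python
import json

payload = {"greeting": "こんにちは"}

# Default: non-ASCII characters are escaped to \uXXXX sequences.
print(json.dumps(payload))
# {"greeting": "\u3053\u3093\u306b\u3061\u306f"}

# ensure_ascii=False emits the Unicode characters as-is.
print(json.dumps(payload, ensure_ascii=False))
# {"greeting": "こんにちは"}
```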
16 Issues closed by 3 people
- Integration of deterministic conversations and other agents (#603), closed May 6, 2025
- Are MCPServer and MCPServerSse clients? (#640), closed May 5, 2025
- how to use Code Interpreter or Image Output in OpenAI Agents SDK (#360), closed May 5, 2025
- Custom Model Provider Not Working (#485), closed May 5, 2025
- function call can not get call_id (#559), closed May 4, 2025
- How to use llm outputs in the on_handoff function (#567), closed May 4, 2025
- Tools should not be executed until all input guardrails have completed (#624), closed May 2, 2025
- Files in the input user prompt (#557), closed May 2, 2025
- from agents.extensions.models.litellm_model import LitellmModel (#621), closed May 1, 2025
- Accessing reasoning tokens of another llm model in agents sdk (#462), closed May 1, 2025
- [Bug]: ModuleNotFoundError: No module named 'enterprise' When Using litellm==1.48.1 in Google Colab (#614), closed Apr 30, 2025
- ModuleNotFoundError: No module named 'enterprise' #10353 (#613), closed Apr 30, 2025
- Add HTTP (non-stdio) MCP server support to Agents SDK (#616), closed Apr 29, 2025
- OpenAI Agents SDK unable to contact local endpoint hosted by Ollama / LM Studio (#625), closed Apr 29, 2025
- https://static.hotmart.com/checkout/widget.min.js (#619), closed Apr 29, 2025
13 Issues opened by 12 people
- How to use custom LLM Gateway having JWT authentication (#652), opened May 6, 2025
- When using Japanese in AzureOpenAI, answers may not be displayed (#649), opened May 5, 2025
- Agent gets stuck 'in-progress' (#647), opened May 5, 2025
- Providing a pydantic model instead of docstring for tool parameters (#646), opened May 5, 2025
- Is there a way to access reasoning_content when calling Runner.run? (#645), opened May 4, 2025
- How to add messages to the conversation history (#642), opened May 2, 2025
- Creating Agents Dynamically (#641), opened May 2, 2025
- Human-In-The-Loop Architecture should be implemented on top priority! (#636), opened May 1, 2025
- Does StopAtTools return the tool result directly to the user instead of to the LLM? (#632), opened Apr 30, 2025
- no attribute error occurs while calling MCP (#630), opened Apr 30, 2025
- Intent Classifier Support (#628), opened Apr 29, 2025
- How to use on_handoff content in the agent (#627), opened Apr 29, 2025
24 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Support For CodeAct In The Future? (#383), commented on Apr 29, 2025 • 0 new comments
- Random transcript gets printed/generated when talking to the voice agent implemented using "VoicePipeline". Eg - "Transcription: Kurs." Mind you there is no background noise. (#368), commented on Apr 29, 2025 • 0 new comments
- additionalProperties should not be set for object types (#608), commented on Apr 29, 2025 • 0 new comments
- Timeout after 300 seconds with any error message. Could it be rate limiting? (#511), commented on Apr 30, 2025 • 0 new comments
- Support for OpenAI agents sdk with Javascript/Typescript (#240), commented on Apr 30, 2025 • 0 new comments
- invalid_request_error when using "chat_completions" with triage agent (gemini -> any other model) (#237), commented on May 1, 2025 • 0 new comments
- Add reasoning support for custom models. (#492), commented on May 1, 2025 • 0 new comments
- Can we use agent.run instead of Runner.run(starting_agent=agent) (#622), commented on May 1, 2025 • 0 new comments
- Retry mechanism for ModelBehaviorError (#325), commented on May 2, 2025 • 0 new comments
- Add HTTP Streamable support for MCP's (#600), commented on May 2, 2025 • 0 new comments
- What is the role of ReasoningItem (#480), commented on May 5, 2025 • 0 new comments
- human-in-the-loop (#378), commented on May 5, 2025 • 0 new comments
- Tool Calling Running in Loop Until Max-Turn (#191), commented on May 5, 2025 • 0 new comments
- Triage agent can not delegate task to handoff agent (#575), commented on May 6, 2025 • 0 new comments
- Websocket streaming audio in realtime from client (#536), commented on May 6, 2025 • 0 new comments
- Streamed Voice Agent Demo - Multiple Performance Issues (#301), commented on May 6, 2025 • 0 new comments
- Add Intro message function for VoicePipeline (#488), commented on May 6, 2025 • 0 new comments
- Reasoning model items provide to General model (#569), commented on May 6, 2025 • 0 new comments
- add reasoning content to ChatCompletions (#494), commented on May 5, 2025 • 0 new comments
- Added cached_tokens to the usage monitoring. (#555), commented on May 4, 2025 • 0 new comments
- Feature/unified agent endpoint (#558), commented on May 2, 2025 • 0 new comments
- Add File Loading Utilities for Agent Instructions (#565), commented on May 3, 2025 • 0 new comments
- Make input/new items available in the run context (#572), commented on May 3, 2025 • 0 new comments
- Add a new GH Actions job to automatically update translated document pages (#598), commented on May 1, 2025 • 0 new comments