fix: enhance AzureOpenAIResponsesAPIConfig to support different Azure auth methods like we have for the completions API. #10871


Conversation

hsuyuming

Enhance AzureOpenAIResponsesAPIConfig to support different Azure auth methods:

  • Entra ID for Azure auth
  • Azure username and password for Azure auth
  • Azure OIDC token
  • Azure AD token provider based on the Service Principal with Secret workflow
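For illustration, a proxy-config sketch of the Entra ID (Service Principal with Secret) flow. The parameter names mirror the existing Azure completions config; that the Responses API accepts them identically is an assumption of this sketch, and the deployment and endpoint names are placeholders:

```yaml
model_list:
  - model_name: azure-responses
    litellm_params:
      model: azure/my-responses-deployment        # hypothetical deployment name
      api_base: https://my-endpoint.openai.azure.com
      api_version: "2025-03-01-preview"
      # Entra ID (Service Principal with Secret): no static api_key needed;
      # a token provider is built from these credentials instead.
      tenant_id: os.environ/AZURE_TENANT_ID
      client_id: os.environ/AZURE_CLIENT_ID
      client_secret: os.environ/AZURE_CLIENT_SECRET
```

With a config like this, tokens are minted from the Service Principal at request time rather than read from a fixed `api_key`.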

Relevant issues

References issue #10868

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • [✓] I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement; see details
  • [✓] I have added a screenshot of my new test passing locally
  • [✓] My PR passes all unit tests with make test-unit
  • [✓] My PR's scope is as isolated as possible; it only solves 1 specific problem

Test result


## Result of make test-unit 
These 3 errors are not caused by my change:

```
(.venv) user@abehsu-us-vscode-med:~/abehsu/litellm_clone$ make test-unit
poetry run pytest tests/litellm/
============================================ test session starts ============================================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0
rootdir: /home/user/abehsu/litellm_clone
plugins: requests-mock-1.12.1, asyncio-0.21.2, anyio-4.5.2, respx-0.22.0, mock-3.14.0
asyncio: mode=strict
collected 800 items / 3 errors

================================================== ERRORS ===================================================
____________________ ERROR collecting tests/litellm/enterprise/test_enterprise_routes.py ____________________
ImportError while importing test module '/home/user/abehsu/litellm_clone/tests/litellm/enterprise/test_enterprise_routes.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/litellm/enterprise/test_enterprise_routes.py:13: in <module>
    from litellm_enterprise.proxy.enterprise_routes import router
E   ModuleNotFoundError: No module named 'litellm_enterprise.proxy'
__________________ ERROR collecting tests/litellm/litellm_core_utils/test_token_counter.py __________________
ImportError while importing test module '/home/user/abehsu/litellm_clone/tests/litellm/litellm_core_utils/test_token_counter.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/litellm/litellm_core_utils/test_token_counter.py:26: in <module>
    from tests.large_text import text
E   ModuleNotFoundError: No module named 'tests.large_text'
_______________ ERROR collecting tests/litellm/litellm_core_utils/test_token_counter_tool.py ________________
ImportError while importing test module '/home/user/abehsu/litellm_clone/tests/litellm/litellm_core_utils/test_token_counter_tool.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/litellm/litellm_core_utils/test_token_counter_tool.py:13: in <module>
    from test_token_counter import token_counter
tests/litellm/litellm_core_utils/test_token_counter.py:26: in <module>
    from tests.large_text import text
E   ModuleNotFoundError: No module named 'tests.large_text'
============================================= warnings summary ==============================================
tests/litellm/integrations/test_custom_prompt_management.py:27
  /home/user/abehsu/litellm_clone/tests/litellm/integrations/test_custom_prompt_management.py:27: PytestCollectionWarning: cannot collect test class 'TestCustomPromptManagement' because it has a __init__ constructor (from: tests/litellm/integrations/test_custom_prompt_management.py)
    class TestCustomPromptManagement(CustomPromptManagement):

tests/litellm/integrations/arize/test_arize_utils.py:181
  /home/user/abehsu/litellm_clone/tests/litellm/integrations/arize/test_arize_utils.py:181: PytestCollectionWarning: cannot collect test class 'TestArizeLogger' because it has a __init__ constructor (from: tests/litellm/integrations/arize/test_arize_utils.py)
    class TestArizeLogger(CustomLogger):

tests/litellm/litellm_core_utils/test_streaming_handler.py:505
  /home/user/abehsu/litellm_clone/tests/litellm/litellm_core_utils/test_streaming_handler.py:505: PytestUnknownMarkWarning: Unknown pytest.mark.flaky - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.flaky(reruns=3)

tests/litellm/llms/ollama/test_ollama_chat_transformation.py:16
  /home/user/abehsu/litellm_clone/tests/litellm/llms/ollama/test_ollama_chat_transformation.py:16: PytestCollectionWarning: cannot collect test class 'TestEvent' because it has a __init__ constructor (from: tests/litellm/llms/ollama/test_ollama_chat_transformation.py)
    class TestEvent(BaseModel):

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================================== short test summary info ==========================================
ERROR tests/litellm/enterprise/test_enterprise_routes.py
ERROR tests/litellm/litellm_core_utils/test_token_counter.py
ERROR tests/litellm/litellm_core_utils/test_token_counter_tool.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 3 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
======================================= 4 warnings, 3 errors in 4.24s =======================================
make: *** [Makefile:29: test-unit] Error 2
```



## Type

🐛 Bug Fix

## Changes



vercel bot commented May 15, 2025


@CLAassistant

CLAassistant commented May 15, 2025

CLA assistant check
All committers have signed the CLA.

@hsuyuming
Author

@ishaan-jaff I checked the pipeline error; it is caused by a module-not-found error. How can I fix this issue?

```
tests/litellm/enterprise/test_enterprise_routes.py:13: in <module>
    from litellm_enterprise.proxy.enterprise_routes import router
E   ModuleNotFoundError: No module named 'litellm_enterprise.proxy'
```

Contributor

@ishaan-jaff ishaan-jaff left a comment


LGTM, just 1 change

```python
    api_key: Optional[str] = None,
) -> dict:
    azure_ad_token_provider = litellm_params.get("azure_ad_token_provider")
```
Contributor


Can we make this new block of logic a simple helper in azure/common_utils.py?

Then we can re-use it for new endpoints like /image/edits etc.

Contributor


I mean the entire new section of code you added; that should be a simple helper util.
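A rough sketch of what such a shared helper in azure/common_utils.py might look like. The function name, the precedence order (explicit api_key, then a static azure_ad_token, then a token provider callable), and the returned kwargs are assumptions for illustration, not LiteLLM's actual implementation:

```python
from typing import Any, Callable, Dict, Optional

def resolve_azure_auth_params(  # hypothetical helper name
    litellm_params: Dict[str, Any],
    api_key: Optional[str] = None,
) -> Dict[str, Any]:
    """Resolve Azure auth settings into kwargs for an AzureOpenAI client.

    Precedence (assumed for this sketch): an explicit api_key wins, then a
    static azure_ad_token, then an azure_ad_token_provider callable that is
    invoked lazily by the client on each request.
    """
    auth_kwargs: Dict[str, Any] = {}
    azure_ad_token: Optional[str] = litellm_params.get("azure_ad_token")
    token_provider: Optional[Callable[[], str]] = litellm_params.get(
        "azure_ad_token_provider"
    )
    if api_key is not None:
        auth_kwargs["api_key"] = api_key
    elif azure_ad_token is not None:
        auth_kwargs["azure_ad_token"] = azure_ad_token
    elif token_provider is not None:
        auth_kwargs["azure_ad_token_provider"] = token_provider
    else:
        raise ValueError("No Azure credentials found in litellm_params")
    return auth_kwargs
```

Because every Azure endpoint config would call the same helper, new routes such as /image/edits could pick up all auth methods for free.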

@hsuyuming
Author

@ishaan-jaff I updated my code based on your suggestion; please review again when you get time. Thank you!

```
llms/azure/common_utils.py:416: error: Incompatible types in assignment (expression has type "AsyncClient | None", target has type "str | Callable[[], str] | None")  [assignment]
llms/azure/common_utils.py:418: error: Incompatible types in assignment (expression has type "Client | None", target has type "str | Callable[[], str] | None")  [assignment]
```
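Those mypy errors arise when one variable is reused for values of different types. A generic illustration of the fix; the class and function names here are stand-ins (not the actual common_utils.py code), defined locally so the sketch runs without httpx:

```python
from typing import Callable, Optional, Union

class Client:        # stand-in for httpx.Client
    pass

class AsyncClient:   # stand-in for httpx.AsyncClient
    pass

def build_params(use_async: bool) -> dict:
    # Before: one variable annotated "str | Callable[[], str] | None" was
    # later assigned a Client/AsyncClient, which mypy rejects.
    # After: give the token and the HTTP client separately typed variables.
    azure_ad_token: Optional[Union[str, Callable[[], str]]] = None
    http_client: Optional[Union[Client, AsyncClient]] = None
    http_client = AsyncClient() if use_async else Client()
    return {"azure_ad_token": azure_ad_token, "http_client": http_client}
```

Splitting the variables keeps each annotation accurate, so no cast or type: ignore is needed.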
@hsuyuming
Author

Fixed all of the CI issues.

@krrishdholakia
Contributor

@ishaan-jaff is this okay to merge?

Contributor

@ishaan-jaff ishaan-jaff left a comment


1 change, mostly looks good

Contributor

@ishaan-jaff ishaan-jaff left a comment


LGTM !

@ishaan-jaff ishaan-jaff changed the base branch from main to litellm_staging_azure_responses_api May 21, 2025 20:52
@ishaan-jaff ishaan-jaff merged commit eabe83f into BerriAI:litellm_staging_azure_responses_api May 21, 2025
6 checks passed

@hsuyuming
Author

hsuyuming commented May 22, 2025

@ishaan-jaff I created a pull request to your litellm_staging_azure_responses_api branch.
I fetched the latest code from origin, then rebased. I hope these steps are correct; if not, feel free to let me know which commands I should use. Thank you.
#11036
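The fetch-and-rebase workflow described above, sketched as shell commands. The staging branch name comes from this PR; the feature-branch name is a placeholder:

```shell
# Fetch the latest state of the upstream branches
git fetch origin

# Replay local commits on top of the staging branch this PR targets
git rebase origin/litellm_staging_azure_responses_api

# After resolving any conflicts, push the rebased branch
git push --force-with-lease origin my-feature-branch   # placeholder branch name
```

`--force-with-lease` is the safer force-push: it refuses to overwrite the remote branch if someone else pushed to it since your last fetch.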
