
Commit f5017a1

Python: updated onnx deps (#12251)
### Motivation and Context

Updated the onnx dependencies: onnxruntime-genai now supports Python 3.13 as well as all platforms, so the version- and platform-specific pins are no longer needed.

### Description

Replaces the restricted `onnxruntime-genai ~= 0.5` / `onnxruntime == 1.22.0` pins with a single `onnxruntime-genai ~= 0.7` pin and removes the now-unneeded import guards from the ONNX unit tests.

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄
1 parent ef8aa2a commit f5017a1

5 files changed: +3221 additions, -3242 deletions


python/pyproject.toml

Lines changed: 1 addition & 2 deletions

@@ -117,8 +117,7 @@ ollama = [
     "ollama ~= 0.4"
 ]
 onnx = [
-    "onnxruntime-genai ~= 0.5; python_version < '3.13' and platform_system != 'Windows'",
-    "onnxruntime == 1.22.0; platform_system == 'Windows'"
+    "onnxruntime-genai ~= 0.7"
 ]
 pandas = [
     "pandas ~= 2.2"

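For context on what the dropped constraints did: they are standard PEP 508 environment markers, which previously excluded Python 3.13 and Windows from the `onnxruntime-genai` pin. A minimal sketch of how such a marker evaluates, using the third-party `packaging` library (an illustration only, not something this PR adds):

```python
# Sketch only: evaluate the PEP 508 marker from the old pin to see why
# onnxruntime-genai was previously skipped on Python 3.13 and on Windows.
# The `packaging` library is an assumption of this illustration; the PR
# itself only edits pyproject.toml.
from packaging.markers import Marker

old_marker = Marker("python_version < '3.13' and platform_system != 'Windows'")

# Evaluates against the current interpreter and platform by default.
print(old_marker.evaluate())

# Override parts of the environment to test other combinations.
print(old_marker.evaluate({"python_version": "3.13", "platform_system": "Linux"}))  # False
```

With the marker gone, the `onnx` extra resolves `onnxruntime-genai ~= 0.7` on every supported platform and Python version.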
python/tests/unit/connectors/ai/onnx/services/test_onnx_chat_completion.py

Lines changed: 0 additions & 9 deletions

@@ -11,15 +11,6 @@
 from semantic_kernel.kernel import Kernel
 from tests.unit.connectors.ai.onnx.conftest import gen_ai_config, gen_ai_config_vision

-try:
-    import onnxruntime_genai  # noqa: F401
-
-    ready = True
-except ImportError:
-    ready = False
-
-pytestmark = pytest.mark.skipif(not ready, reason="ONNX Runtime is not installed.")
-

 @patch("builtins.open", new_callable=mock_open, read_data=json.dumps(gen_ai_config))
 @patch("onnxruntime_genai.Model")
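The block deleted above is a hand-rolled optional-dependency guard; with `onnxruntime-genai` now installable everywhere, it is no longer needed. For reference, a roughly equivalent guard can be written with pytest's built-in helper (shown only as a sketch, not something this PR adds):

```python
# Sketch of an equivalent module-level guard using pytest's built-in helper.
# The PR does not add this; it removes the guard entirely because the
# dependency is now expected to be installed in every test environment.
import pytest

# Skips the whole test module at collection time if the import fails.
onnxruntime_genai = pytest.importorskip(
    "onnxruntime_genai", reason="ONNX Runtime GenAI is not installed."
)
```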

python/tests/unit/connectors/ai/onnx/services/test_onnx_text_completion.py

Lines changed: 1 addition & 13 deletions
Original file line numberDiff line numberDiff line change
@@ -4,23 +4,11 @@
44

55
import pytest
66

7-
from semantic_kernel.connectors.ai.onnx import ( # noqa: E402
8-
OnnxGenAIPromptExecutionSettings,
9-
OnnxGenAITextCompletion,
10-
)
7+
from semantic_kernel.connectors.ai.onnx import OnnxGenAIPromptExecutionSettings, OnnxGenAITextCompletion # noqa: E402
118
from semantic_kernel.contents import TextContent
129
from semantic_kernel.exceptions import ServiceInitializationError
1310
from tests.unit.connectors.ai.onnx.conftest import gen_ai_config
1411

15-
try:
16-
import onnxruntime_genai # noqa: F401
17-
18-
ready = True
19-
except ImportError:
20-
ready = False
21-
22-
pytestmark = pytest.mark.skipif(not ready, reason="ONNX Runtime is not installed.")
23-
2412

2513
@patch("builtins.open", new_callable=mock_open, read_data=json.dumps(gen_ai_config))
2614
@patch("onnxruntime_genai.Model")
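The `@patch` decorators visible in the unchanged context lines are what keep these tests runnable without a real model on disk. A rough sketch of that pattern follows; the fake config dict, the test name, and the `ai_model_path` keyword are assumptions for illustration, not taken from this diff:

```python
# Illustrative sketch of the mocking pattern used by these tests: the model
# config file is faked via mock_open and onnxruntime_genai.Model is patched,
# so no ONNX model needs to exist on disk. The fake config, the test name,
# and the ai_model_path keyword are assumptions, not taken from this diff.
import json
from unittest.mock import MagicMock, mock_open, patch

from semantic_kernel.connectors.ai.onnx import OnnxGenAITextCompletion

FAKE_GEN_AI_CONFIG = {"model": {"type": "phi3"}}  # stand-in for conftest.gen_ai_config


@patch("builtins.open", new_callable=mock_open, read_data=json.dumps(FAKE_GEN_AI_CONFIG))
@patch("onnxruntime_genai.Model")
def test_service_constructs_without_real_model(mock_model: MagicMock, mock_file: MagicMock) -> None:
    # The bottom-most @patch supplies the first mock argument.
    service = OnnxGenAITextCompletion(ai_model_path="/fake/model/path")
    assert service is not None
```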

python/tests/unit/connectors/ai/onnx/services/test_onnx_utils.py

Lines changed: 2 additions & 6 deletions

@@ -1,10 +1,6 @@
 # Copyright (c) Microsoft. All rights reserved.
-from semantic_kernel.connectors.ai.onnx.utils import (
-    gemma_template,
-    llama_template,
-    phi3_template,
-    phi3v_template,
-)
+
+from semantic_kernel.connectors.ai.onnx.utils import gemma_template, llama_template, phi3_template, phi3v_template
 from semantic_kernel.contents import AuthorRole, ChatHistory, ImageContent, TextContent
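Judging from the test imports, these helpers render a `ChatHistory` into a model-specific prompt string. A rough usage sketch follows; the assumption that each helper takes a `ChatHistory` and returns a string is inferred from these tests, not verified here:

```python
# Illustrative only: build a small chat history and render it with one of the
# imported template helpers. That each helper accepts a ChatHistory and
# returns a prompt string is an assumption inferred from these tests.
from semantic_kernel.connectors.ai.onnx.utils import phi3_template
from semantic_kernel.contents import ChatHistory

history = ChatHistory()
history.add_system_message("You are a helpful assistant.")
history.add_user_message("Summarize the dependency change in this PR.")

print(phi3_template(history))
```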
