63 changes: 62 additions & 1 deletion CHANGELOG.md
@@ -1,6 +1,67 @@
# Release History

## 1.11.0
## 1.12.0

### Bug Fixes

* Registry: Fixed an issue where dictionary-type output columns were incorrectly converted to their string
  representation during structured-output deserialization. The original data type is now preserved.

### Behavior Changes

### New Features

* Registry: Added an OpenAI chat-completion-compatible signature option for `text-generation` models.

```python
from snowflake.ml.model import openai_signatures
import pandas as pd

mv = snowflake_registry.log_model(
    model=generator,
    model_name=...,
    ...,
    signatures=openai_signatures.OPENAI_CHAT_SIGNATURE,
)

# Build a pd.DataFrame of OpenAI chat.completions-style arguments, for example:
x_df = pd.DataFrame.from_records(
    [
        {
            "messages": [
                {"role": "system", "content": "Complete the sentence."},
                {
                    "role": "user",
                    "content": "A descendant of the Lost City of Atlantis, who swam to Earth while saying, ",
                },
            ],
            "max_completion_tokens": 250,
            "temperature": 0.9,
            "stop": None,
            "n": 3,
            "stream": False,
            "top_p": 1.0,
            "frequency_penalty": 0.1,
            "presence_penalty": 0.2,
        }
    ],
)

# OpenAI Chat Completion compatible output
output_df = mv.run(X=x_df)
```

* Model Monitoring: Added support for segment columns to enable filtered analysis.
* Added `segment_columns` parameter to `ModelMonitorSourceConfig` to specify columns for segmenting monitoring data
* Segment columns must be of STRING type and exist in the source table
* Added methods to dynamically manage segments:
* `add_segment_column()`: Add a new segment column to an existing monitor
* `drop_segment_column()`: Remove a segment column from an existing monitor
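
  A minimal sketch of how the segment-column options might be used. The table, column, and monitor names are
  illustrative, and the surrounding `ModelMonitorSourceConfig` arguments and monitor-retrieval call follow the
  existing monitoring API as currently documented, so details may differ:

  ```python
  from snowflake.ml.monitoring.entities.model_monitor_config import ModelMonitorSourceConfig

  # Segment columns must be STRING columns that already exist in the source table.
  # Table and column names below are hypothetical.
  source_config = ModelMonitorSourceConfig(
      source="MY_DB.MY_SCHEMA.INFERENCE_LOG",
      timestamp_column="INFERENCE_TIMESTAMP",
      id_columns=["REQUEST_ID"],
      prediction_score_columns=["PREDICTION"],
      actual_class_columns=["LABEL"],
      segment_columns=["REGION", "CUSTOMER_TIER"],  # new in this release
  )

  # ...create the monitor from source_config as usual...

  # Segments can also be managed on an existing monitor:
  monitor = registry.get_monitor(name="MY_MONITOR")  # `registry` is an existing Registry instance
  monitor.add_segment_column("PLAN_TYPE")        # add a new segment column
  monitor.drop_segment_column("CUSTOMER_TIER")   # remove an existing segment column
  ```
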
* Experiment Tracking (PrPr): Support for logging artifacts (files and directories) with `log_artifact`
* Experiment Tracking (PrPr): Support for listing artifacts in a run with `list_artifacts`
* Experiment Tracking (PrPr): Support for downloading artifacts in a run with `download_artifacts`
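
  A minimal sketch of the new artifact APIs. The method names come from this changelog, but the
  `ExperimentTracking` client, its constructor arguments, and the exact method signatures are assumptions
  (the feature is in private preview) and may differ:

  ```python
  from snowflake.ml.experiment import ExperimentTracking  # PrPr; module path may differ

  # `session` is an existing Snowpark Session.
  exp = ExperimentTracking(session, database_name="ML_DB", schema_name="EXPERIMENTS")
  exp.set_experiment("MY_EXPERIMENT")

  with exp.start_run(run_name="RUN_1"):
      # Log a single file or an entire directory as artifacts of the active run.
      exp.log_artifact("local/metrics.json")
      exp.log_artifact("local/plots/")

  # List and download artifacts from a run.
  for artifact in exp.list_artifacts(run_name="RUN_1"):
      print(artifact)
  exp.download_artifacts(run_name="RUN_1", target_path="downloaded_artifacts/")
  ```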

## 1.11.0 (08-12-2025)

### Bug Fixes

6 changes: 3 additions & 3 deletions bazel/environments/conda-env-all.yml
@@ -45,7 +45,7 @@ dependencies:
- pytest==7.4.0
- python-build==0.10.0
- pytimeparse==1.1.8
- pytorch==2.3.1
- pytorch==2.5.1
- pyyaml==6.0.1
- requests==2.31.0
- retrying==1.3.3
@@ -56,7 +56,7 @@ dependencies:
- sentence-transformers==2.7.0
- sentencepiece==0.1.99
- shap==0.46.0
- snowflake-connector-python==3.15.0
- snowflake-connector-python==3.16.0
- snowflake-snowpark-python==1.28.0
- snowflake.core==1.0.5
- sphinx==5.0.2
@@ -75,7 +75,7 @@ dependencies:
- types-toml==0.10.8.6
- typing-extensions==4.11.0
- werkzeug==2.3.8
- xgboost==2.1.1
- xgboost==2.1.4
- pip
- pip:
- --extra-index-url https://pypi.org/simple
2 changes: 1 addition & 1 deletion bazel/environments/conda-env-build.yml
@@ -17,4 +17,4 @@ dependencies:
- scikit-learn==1.5.1
- toml==0.10.2
- types-toml==0.10.8.6
- xgboost==2.1.1
- xgboost==2.1.4
4 changes: 2 additions & 2 deletions bazel/environments/conda-env-core.yml
@@ -48,7 +48,7 @@ dependencies:
- scikit-learn==1.5.1
- scipy==1.11.3
- shap==0.46.0
- snowflake-connector-python==3.15.0
- snowflake-connector-python==3.16.0
- snowflake-snowpark-python==1.28.0
- snowflake.core==1.0.5
- sphinx==5.0.2
@@ -62,7 +62,7 @@ dependencies:
- types-toml==0.10.8.6
- typing-extensions==4.11.0
- werkzeug==2.3.8
- xgboost==2.1.1
- xgboost==2.1.4
- pip
- pip:
- --extra-index-url https://pypi.org/simple
6 changes: 3 additions & 3 deletions bazel/environments/conda-env-keras.yml
@@ -41,7 +41,7 @@ dependencies:
- pytest==7.4.0
- python-build==0.10.0
- pytimeparse==1.1.8
- pytorch==2.3.1
- pytorch==2.5.1
- pyyaml==6.0.1
- requests==2.31.0
- retrying==1.3.3
@@ -50,7 +50,7 @@ dependencies:
- scikit-learn==1.5.1
- scipy==1.11.3
- shap==0.46.0
- snowflake-connector-python==3.15.0
- snowflake-connector-python==3.16.0
- snowflake-snowpark-python==1.28.0
- snowflake.core==1.0.5
- sphinx==5.0.2
@@ -66,7 +66,7 @@ dependencies:
- types-toml==0.10.8.6
- typing-extensions==4.11.0
- werkzeug==2.3.8
- xgboost==2.1.1
- xgboost==2.1.4
- pip
- pip:
- --extra-index-url https://pypi.org/simple
4 changes: 2 additions & 2 deletions bazel/environments/conda-env-ml.yml
@@ -52,7 +52,7 @@ dependencies:
- scikit-learn==1.5.1
- scipy==1.11.3
- shap==0.46.0
- snowflake-connector-python==3.15.0
- snowflake-connector-python==3.16.0
- snowflake-snowpark-python==1.28.0
- snowflake.core==1.0.5
- sphinx==5.0.2
@@ -67,7 +67,7 @@ dependencies:
- types-toml==0.10.8.6
- typing-extensions==4.11.0
- werkzeug==2.3.8
- xgboost==2.1.1
- xgboost==2.1.4
- pip
- pip:
- --extra-index-url https://pypi.org/simple
6 changes: 3 additions & 3 deletions bazel/environments/conda-env-torch.yml
@@ -41,7 +41,7 @@ dependencies:
- pytest==7.4.0
- python-build==0.10.0
- pytimeparse==1.1.8
- pytorch==2.3.1
- pytorch==2.5.1
- pyyaml==6.0.1
- requests==2.31.0
- retrying==1.3.3
@@ -52,7 +52,7 @@ dependencies:
- sentence-transformers==2.7.0
- sentencepiece==0.1.99
- shap==0.46.0
- snowflake-connector-python==3.15.0
- snowflake-connector-python==3.16.0
- snowflake-snowpark-python==1.28.0
- snowflake.core==1.0.5
- sphinx==5.0.2
@@ -69,7 +69,7 @@ dependencies:
- types-toml==0.10.8.6
- typing-extensions==4.11.0
- werkzeug==2.3.8
- xgboost==2.1.1
- xgboost==2.1.4
- pip
- pip:
- --extra-index-url https://pypi.org/simple
4 changes: 2 additions & 2 deletions bazel/environments/requirements_core.txt
@@ -42,7 +42,7 @@ s3fs==2024.6.1
scikit-learn==1.5.1
scipy==1.11.3
shap==0.46.0
snowflake-connector-python[pandas]==3.15.0
snowflake-connector-python[pandas]==3.16.0
snowflake-snowpark-python==1.28.0
snowflake.core==1.0.5
sphinx==5.0.2
@@ -57,4 +57,4 @@ types-requests==2.30.0.0
types-toml==0.10.8.6
typing-extensions==4.11.0
werkzeug==2.3.8
xgboost==2.1.1
xgboost==2.1.4
6 changes: 3 additions & 3 deletions bazel/environments/requirements_keras.txt
@@ -43,7 +43,7 @@ s3fs==2024.6.1
scikit-learn==1.5.1
scipy==1.11.3
shap==0.46.0
snowflake-connector-python[pandas]==3.15.0
snowflake-connector-python[pandas]==3.16.0
snowflake-snowpark-python==1.28.0
snowflake.core==1.0.5
sphinx==5.0.2
@@ -52,7 +52,7 @@ starlette==0.27.0
tensorflow==2.17.0
tf-keras==2.17.0
toml==0.10.2
torch==2.3.1
torch==2.5.1
torchdata==0.8.0
tqdm==4.67.1
types-PyYAML==6.0.12.12
@@ -62,4 +62,4 @@ types-requests==2.30.0.0
types-toml==0.10.8.6
typing-extensions==4.11.0
werkzeug==2.3.8
xgboost==2.1.1
xgboost==2.1.4
4 changes: 2 additions & 2 deletions bazel/environments/requirements_ml.txt
@@ -46,7 +46,7 @@ s3fs==2024.6.1
scikit-learn==1.5.1
scipy==1.11.3
shap==0.46.0
snowflake-connector-python[pandas]==3.15.0
snowflake-connector-python[pandas]==3.16.0
snowflake-snowpark-python==1.28.0
snowflake.core==1.0.5
sphinx==5.0.2
@@ -62,4 +62,4 @@ types-requests==2.30.0.0
types-toml==0.10.8.6
typing-extensions==4.11.0
werkzeug==2.3.8
xgboost==2.1.1
xgboost==2.1.4
6 changes: 3 additions & 3 deletions bazel/environments/requirements_torch.txt
@@ -46,15 +46,15 @@ scipy==1.11.3
sentence-transformers==2.7.0
sentencepiece==0.1.99
shap==0.46.0
snowflake-connector-python[pandas]==3.15.0
snowflake-connector-python[pandas]==3.16.0
snowflake-snowpark-python==1.28.0
snowflake.core==1.0.5
sphinx==5.0.2
sqlparse==0.4.4
starlette==0.27.0
tokenizers==0.15.1
toml==0.10.2
torch==2.3.1
torch==2.5.1
torchdata==0.8.0
tqdm==4.67.1
transformers==4.39.3
@@ -65,4 +65,4 @@ types-requests==2.30.0.0
types-toml==0.10.8.6
typing-extensions==4.11.0
werkzeug==2.3.8
xgboost==2.1.1
xgboost==2.1.4
12 changes: 11 additions & 1 deletion ci/RunBazelAction.sh
@@ -113,7 +113,7 @@ action_env=()
if [[ "${WITH_SPCS_IMAGE}" = true ]]; then
export RUN_GRYPE=false
source model_container_services_deployment/ci/build_and_push_images.sh
action_env=("--action_env=BUILDER_IMAGE_PATH=${BUILDER_IMAGE_PATH}" "--action_env=BASE_CPU_IMAGE_PATH=${BASE_CPU_IMAGE_PATH}" "--action_env=BASE_GPU_IMAGE_PATH=${BASE_GPU_IMAGE_PATH}" "--action_env=IMAGE_BUILD_SIDECAR_CPU_PATH=${IMAGE_BUILD_SIDECAR_CPU_PATH}" "--action_env=IMAGE_BUILD_SIDECAR_GPU_PATH=${IMAGE_BUILD_SIDECAR_GPU_PATH}" "--action_env=PROXY_IMAGE_PATH=${PROXY_IMAGE_PATH}" "--action_env=VLLM_IMAGE_PATH=${VLLM_IMAGE_PATH}")
action_env=(
"--action_env=BUILDER_IMAGE_PATH=${BUILDER_IMAGE_PATH}"
"--action_env=BASE_CPU_IMAGE_PATH=${BASE_CPU_IMAGE_PATH}"
"--action_env=BASE_GPU_IMAGE_PATH=${BASE_GPU_IMAGE_PATH}"
"--action_env=BASE_BATCH_CPU_IMAGE_PATH=${BASE_BATCH_CPU_IMAGE_PATH}"
"--action_env=BASE_BATCH_GPU_IMAGE_PATH=${BASE_BATCH_GPU_IMAGE_PATH}"
"--action_env=IMAGE_BUILD_SIDECAR_CPU_PATH=${IMAGE_BUILD_SIDECAR_CPU_PATH}"
"--action_env=IMAGE_BUILD_SIDECAR_GPU_PATH=${IMAGE_BUILD_SIDECAR_GPU_PATH}"
"--action_env=PROXY_IMAGE_PATH=${PROXY_IMAGE_PATH}"
"--action_env=VLLM_IMAGE_PATH=${VLLM_IMAGE_PATH}"
)
fi

working_dir=$(mktemp -d "/tmp/tmp_XXXXX")
8 changes: 4 additions & 4 deletions ci/conda_recipe/meta.yaml
@@ -17,7 +17,7 @@ build:
noarch: python
package:
name: snowflake-ml-python
version: 1.11.0
version: 1.12.0
requirements:
build:
- python
@@ -35,18 +35,18 @@ requirements:
- packaging>=20.9,<25
- pandas>=2.1.4,<3
- platformdirs<5
- pyarrow
- pyarrow<19.0.0
- pydantic>=2.8.2, <3
- pyjwt>=2.0.0, <3
- pytimeparse>=1.1.8,<2
- pyyaml>=6.0,<7
- requests
- retrying>=1.3.3,<2
- s3fs>=2024.6.1,<2026
- scikit-learn<1.6
- scikit-learn<1.7
- scipy>=1.9,<2
- shap>=0.46.0,<1
- snowflake-connector-python>=3.15.0,<4
- snowflake-connector-python>=3.16.0,<4
- snowflake-snowpark-python>=1.17.0,<2,!=1.26.0
- snowflake.core>=1.0.2,<2
- sqlparse>=0.4,<1
6 changes: 6 additions & 0 deletions ci/targets/exclude_from_merge_gate.txt
@@ -5,3 +5,9 @@
//tests/integ/snowflake/ml/registry/services:registry_huggingface_pipeline_model_deployment_test
//tests/integ/snowflake/ml/registry/services:registry_sklearn_model_deployment_test
//tests/integ/snowflake/ml/registry/services:registry_model_deployment_test
//snowflake/ml/model/_packager/model_task:model_task_utils_test
//snowflake/ml/model/_packager/model_handlers_test:lightgbm_test
//snowflake/ml/model/_packager/model_handlers_test:snowmlmodel_test
//tests/integ/snowflake/ml/registry/model:registry_target_platforms_test
//tests/integ/snowflake/ml/registry/model:registry_artifact_repository_test
//snowflake/ml/model/_packager/model_handlers_test:xgboost_test
2 changes: 0 additions & 2 deletions ci/targets/quarantine/prod3.txt
@@ -3,5 +3,3 @@
//tests/integ/snowflake/ml/modeling/manifold:spectral_embedding_test
//tests/integ/snowflake/ml/modeling/linear_model:logistic_regression_test
//tests/integ/snowflake/ml/registry/services:registry_huggingface_pipeline_model_deployment_test
//tests/integ/snowflake/ml/registry/services:registry_sentence_transformers_model_deployment_test
//tests/integ/snowflake/ml/jobs:jobs_integ_test
8 changes: 8 additions & 0 deletions codegen/sklearn_wrapper_generator.py
@@ -1095,6 +1095,9 @@ def generate(self) -> "SklearnWrapperGenerator":
elif self._is_cross_decomposition_module_obj:
# For cross decomposition, the default n_components need to be set into 1
self.test_estimator_input_args_list.append("n_components=1")
elif self.original_class_name in ["MiniBatchSparsePCA", "SparsePCA"]:
# SparsePCA and MiniBatchSparsePCA need n_components to match input features
self.test_estimator_input_args_list.append("n_components=len(cols)")

if self._is_heterogeneous_ensemble:
if self._is_regressor:
@@ -1151,6 +1154,11 @@ def generate(self) -> "SklearnWrapperGenerator":
if self._is_hist_gradient_boosting_regressor:
self.test_estimator_input_args_list.extend(["min_samples_leaf=1", "max_leaf_nodes=100"])

# EllipticEnvelope requires a minimum number of support samples for the MCD algorithm
# Setting support_fraction to 0.8 and assume_centered=True for better handling of test datasets
if self.original_class_name == "EllipticEnvelope":
self.test_estimator_input_args_list.extend(["support_fraction=0.8", "assume_centered=True"])

self.deps = (
"f'numpy=={np.__version__}', f'scikit-learn=={sklearn.__version__}', f'cloudpickle=={cp.__version__}'"
)
2 changes: 1 addition & 1 deletion codegen/sklearn_wrapper_template.py_template
@@ -54,7 +54,7 @@ DATAFRAME_TYPE = Union[DataFrame, pd.DataFrame]

INFER_SIGNATURE_MAX_ROWS = 100

SKLEARN_LOWER, SKLEARN_UPPER = ('1.4', '1.6')
SKLEARN_LOWER, SKLEARN_UPPER = ('1.4', '1.7')
# Modeling library estimators require a smaller sklearn version range.
if not version.Version(SKLEARN_LOWER) <= version.Version(sklearn.__version__) < version.Version(SKLEARN_UPPER):
raise Exception(