parent befbb42
author Kartik Gupta <88345179+kartikgupta-db@users.noreply.github.com> 1688609182 +0200
committer Spece <Michael.Spece@Chubb.Com> 1689629188 -0400

[DECO-1115] Add local implementation for `dbutils.widgets` (databricks#93)

* Added a new install group (`pip install 'databricks-sdk[notebook]'`).
This allows us to safely pin ipywidgets for local installs. DBR can
safely continue using `pip install databricks-sdk` or directly using the
default build from master without conflicting dependencies.
* The OSS implementation of widgets is imported only on first use (possible
only through the OSS implementation of dbutils, `RemoteDbUtils`).
* Add a wrapper for ipywidgets to enable interactive widgets when in
interactive **IPython** notebooks.

https://user-images.githubusercontent.com/88345179/236443693-1c804107-ba21-4296-ba40-2b1e8e062d16.mov

* Add a default widgets implementation that returns the default value when
not in an interactive environment.

https://user-images.githubusercontent.com/88345179/236443729-51185404-4d28-49c6-ade0-a665e154e092.mov
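The default-value-only behavior described above can be sketched in a few lines (illustrative only; the SDK's actual class lives in `databricks/sdk/_widgets` and is shown in the diff further down):

```python
import typing


class DefaultValueWidgets:
    """Sketch: outside an interactive notebook, widgets only store defaults."""

    def __init__(self) -> None:
        self._widgets: typing.Dict[str, str] = {}

    def text(self, name: str, default_value: str,
             label: typing.Optional[str] = None) -> None:
        # No UI is rendered; the default is the only value ever returned.
        self._widgets[name] = default_value

    def get(self, name: str) -> str:
        return self._widgets[name]

    def getArgument(self, name: str,
                    default_value: typing.Optional[str] = None):
        # Mirrors dbutils.widgets.getArgument: fall back if the widget is unset.
        try:
            return self.get(name)
        except KeyError:
            return default_value
```

So `w.text('env', 'dev')` followed by `w.get('env')` simply returns `'dev'`, with no interactive prompt.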

<!--
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Fix error message, ExportFormat -> ImportFormat (databricks#220)

The proper argument is `ImportFormat.AUTO`, not `ExportFormat.AUTO`.

Correct the error message when `ImportFormat` is not provided to
`workspace.upload`.
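A self-contained sketch of the corrected behavior (the `upload` signature and message text here are illustrative, not the SDK's exact code):

```python
from enum import Enum


class ImportFormat(Enum):
    AUTO = 'AUTO'
    SOURCE = 'SOURCE'


def upload(path: str, content: bytes, fmt: ImportFormat = None):
    if fmt is None:
        # The fix: the message now names ImportFormat.AUTO,
        # not the unrelated ExportFormat.AUTO.
        raise ValueError(
            f'import format is required for {path}; try ImportFormat.AUTO')
    return path, fmt
```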

Signed-off-by: Jessica Smith <8505845+NodeJSmith@users.noreply.github.com>

Regenerate Python SDK using recent OpenAPI Specification (databricks#229)

Spec commit sha: 17a3f7fe6 (7 July 2023)

Breaking Changes:
* Use CONSTANT_CASE for enum constants. Many enums already use constant
case in their definitions, but some constants (like the SCIM Patch schema
name) include symbols such as `:` and numbers, so the SDK cannot use the enum
value as the name.
* Replace Query type with AlertQuery in sql.Alert class.
* Removal of User.is_db_admin and User.profile_image_url.
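For illustration, here is why the member name cannot always match the value (the SCIM schema constant below follows the pattern described; the exact SDK definition may differ):

```python
from enum import Enum


class PatchSchema(Enum):
    # The wire value contains ':' and digits, so it cannot be a Python
    # identifier; the member name is derived as CONSTANT_CASE instead.
    URN_IETF_PARAMS_SCIM_API_MESSAGES_2_0_PATCH_OP = \
        'urn:ietf:params:scim:api:messages:2.0:PatchOp'


# Lookup by wire value still works even though name != value:
schema = PatchSchema('urn:ietf:params:scim:api:messages:2.0:PatchOp')
```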

Changes:
* Introduce CleanRooms API
* Introduce TablesAPI.update()
* Introduce Group.meta property
* Fix SCIM Patch implementation
* Introduce BaseRun.job_parameters and BaseRun.trigger_info
* Introduce CreateJob.parameters
* Fix spelling in file arrival trigger configuration
* Introduce GitSource.job_source
* Introduce RepairRun.rerun_dependent_tasks
* Introduce Resolved*Values classes, RunIf, and RunJobTask
* Introduce TaskNotificationSettings

Later follow-up:
* Names should split on Pascal-case word boundaries (see
CloudProviderNodeStatus). This is an OpenAPI code gen change that needs
to be made.

Make workspace client also return runtime dbutils when in dbr (databricks#210)

* `workspace_client.dbutils` always returned the OSS implementation of
dbutils. We want it to also use the DBR runtime implementation when running in DBR.

* [x] Manually tested in DBR
* [x] Tested locally

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
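The detection can be sketched as follows (simplified; the real template code appears in the `.codegen/__init__.py.tmpl` diff further down):

```python
def pick_dbutils_impl() -> str:
    """Return which dbutils implementation would be chosen (sketch only)."""
    try:
        # dbruntime is importable only inside a Databricks runtime (DBR).
        import dbruntime  # noqa: F401
    except ImportError:
        # Local/remote execution: fall back to the OSS RemoteDbUtils.
        return 'remote'
    return 'runtime'
```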

Use `.ConstantName` when defining target enum states for waiters (databricks#230)

Uses of enums in generated code need to be updated to use
`{{.ConstantName}}` instead of `{{.Content}}`.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization (databricks#234)

In databricks#230, enums were changed so that enum field names did not necessarily
match the enum value itself. However, the `_enum` helper method used
during deserialization of a response containing an enum was not updated
to handle this case. This PR corrects this method to check through the
values of the `__members__` of an enum, as opposed to the keys.

<!--
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization, take 2 (databricks#235)

We jumped the gun on databricks#234. This is the actual change that
fixes the integration tests.

- [x] The two failing integration tests (test_submitting_jobs and
test_proxy_dbfs_mounts) both pass on this PR.

Added toolchain configuration to `.codegen.json` (databricks#236)

- Added toolchain config for automated releases
- Added `CHANGELOG.md` template with OpenAPI SHA

Prep release changes

Make OpenAPI spec location configurable (databricks#237)

Introduce the `DATABRICKS_OPENAPI_SPEC` environment variable to hold the
filesystem location of the `all-internal.json` spec.
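In use, the generator would resolve the spec path roughly like this (the fallback value is an assumption for illustration):

```python
import os


def openapi_spec_path(default: str = 'all-internal.json') -> str:
    # DATABRICKS_OPENAPI_SPEC, when set, points at the all-internal.json spec.
    return os.environ.get('DATABRICKS_OPENAPI_SPEC', default)
```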

Rearrange imports in `databricks.sdk.runtime` to improve local editor experience (databricks#219)

<!-- Summary of your changes that are easy to understand -->

The current type hints mean that VS Code does not give useful syntax
highlighting or code completions.
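A common pattern for this kind of rearrangement is to confine heavy imports to type-checking time; the sketch below illustrates the approach (the PR's exact change may differ, and `get_dbutils` is a hypothetical accessor used only for illustration):

```python
import typing

if typing.TYPE_CHECKING:
    # Followed by editors (VS Code/Pylance) for completions, but never
    # executed at runtime, so it cannot fail or slow down imports.
    from databricks.sdk.dbutils import RemoteDbUtils


def get_dbutils() -> 'RemoteDbUtils':
    # String annotation: precise editor type without a runtime import.
    ...
```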

This is the current experience on `main`:
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/72d2c3eb-cc3a-4f95-9f09-7d43a8f2815e">
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/34b04de0-5996-4a2e-a0d2-b26a8c9d3da9">

With these changes this becomes:
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/99a91d82-f06a-4883-b131-7f96a82edd80">
<img width="818" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/ce684fd0-f550-4afe-bc46-1187b6dd4b49">

<!--
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

kartikgupta-db authored and Spece committed Jul 17, 2023
1 parent befbb42 commit 3f9bfb7
Showing 33 changed files with 2,905 additions and 1,498 deletions.
20 changes: 19 additions & 1 deletion .codegen.json
@@ -8,5 +8,23 @@
},
"samples": {
".codegen/example.py.tmpl": "examples/{{.Service.SnakeName}}/{{.Method.SnakeName}}_{{.SnakeName}}.py"
},
"version": {
"databricks/sdk/version.py": "__version__ = '$VERSION'"
},
"toolchain": {
"required": ["python3"],
"pre_setup": [
"python3 -m venv .databricks"
],
"prepend_path": ".databricks/bin",
"setup": [
"pip install '.[dev]'"
],
"post_generate": [
"pytest -m 'not integration' --cov=databricks --cov-report html tests",
"pip install .",
"python docs/gen-client-docs.py"
]
}
}
}
18 changes: 17 additions & 1 deletion .codegen/__init__.py.tmpl
@@ -17,6 +17,22 @@ from databricks.sdk.service.{{.Package.Name}} import {{.PascalName}}API{{end}}
{{- getOrDefault $mixins $genApi $genApi -}}
{{- end -}}

def _make_dbutils(config: client.Config):
    # We try to directly check if we are in runtime, instead of
    # trying to import from databricks.sdk.runtime. This is to prevent
    # remote dbutils from being created without the config, which is both
    # expensive (will need to check all credential providers) and can
    # throw errors (when no env vars are set).
    try:
        from dbruntime import UserNamespaceInitializer
    except ImportError:
        return dbutils.RemoteDbUtils(config)

    # We are in runtime, so we can use the runtime dbutils
    from databricks.sdk.runtime import dbutils as runtime_dbutils
    return runtime_dbutils


class WorkspaceClient:
def __init__(self, *{{range $args}}, {{.}}: str = None{{end}},
debug_truncate_bytes: int = None,
@@ -33,7 +33,7 @@ class WorkspaceClient:
product=product,
product_version=product_version)
self.config = config.copy()
self.dbutils = dbutils.RemoteDbUtils(self.config)
self.dbutils = _make_dbutils(self.config)
self.api_client = client.ApiClient(self.config)
self.files = FilesMixin(self.api_client)
{{- range .Services}}{{if not .IsAccounts}}
55 changes: 55 additions & 0 deletions .codegen/changelog.md.tmpl
@@ -0,0 +1,55 @@
# Version changelog

## {{.Version}}

{{range .Changes -}}
* {{.}}.
{{end}}{{- if .ApiChanges}}
API Changes:
{{range .ApiChanges}}
* {{.Action}} {{template "what" .}}{{if .Extra}} {{.Extra}}{{with .Other}} {{template "what" .}}{{end}}{{end}}.
{{- end}}

OpenAPI SHA: {{.Sha}}, Date: {{.Changed}}
{{- end}}{{if .DependencyUpdates}}
Dependency updates:
{{range .DependencyUpdates}}
* {{.}}.
{{- end -}}
{{end}}

## {{.PrevVersion}}

{{- define "what" -}}
{{if eq .X "package" -}}
`databricks.sdk.service.{{.Package.Name}}` package
{{- else if eq .X "service" -}}
{{template "service" .Service}}
{{- else if eq .X "method" -}}
`{{.Method.SnakeName}}()` method for {{template "service" .Method.Service}}
{{- else if eq .X "entity" -}}
{{template "entity" .Entity}} dataclass
{{- else if eq .X "field" -}}
`{{.Field.SnakeName}}` field for {{template "entity" .Field.Of}}
{{- end}}
{{- end -}}

{{- define "service" -}}
[{{if .IsAccounts}}a{{else}}w{{end}}.{{.SnakeName}}](https://databricks-sdk-py.readthedocs.io/en/latest/{{if .IsAccounts}}account{{else}}workspace{{end}}/{{.SnakeName}}.html) {{if .IsAccounts}}account{{else}}workspace{{end}}-level service
{{- end -}}

{{- define "entity" -}}
{{- if not . }}any /* ERROR */
{{- else if .IsEmpty}}`any`
{{- else if .PascalName}}`databricks.sdk.service.{{.Package.Name}}.{{.PascalName}}`
{{- else if .IsAny}}`any`
{{- else if .IsString}}`str`
{{- else if .IsBool}}`bool`
{{- else if .IsInt64}}`int`
{{- else if .IsFloat64}}`float`
{{- else if .IsInt}}`int`
{{- else if .ArrayValue }}list[{{template "entity" .ArrayValue}}]
{{- else if .MapValue }}dict[str,{{template "entity" .MapValue}}]
{{- else}}`databricks.sdk.service.{{.Package.Name}}.{{.PascalName}}`
{{- end -}}
{{- end -}}
4 changes: 2 additions & 2 deletions .codegen/example.py.tmpl
@@ -43,7 +43,7 @@ import time, base64, os
{{- else if eq .Type "lookup" -}}
{{template "expr" .X}}.{{.Field.SnakeName}}
{{- else if eq .Type "enum" -}}
{{.Package}}.{{.Entity.PascalName}}.{{.Content}}{{if eq .Content "None"}}_{{end}}
{{.Package}}.{{.Entity.PascalName}}.{{.ConstantName}}
{{- else if eq .Type "variable" -}}
{{if eq .SnakeName "true"}}True
{{- else if eq .SnakeName "false"}}False
@@ -109,4 +109,4 @@ f'/Users/{w.current_user.me().user_name}/sdk-{time.time_ns()}'
{{- else -}}
{{.SnakeName}}({{range $i, $x := .Args}}{{if $i}}, {{end}}{{template "expr" .}}{{end}})
{{- end -}}
{{- end}}
{{- end}}
12 changes: 6 additions & 6 deletions .codegen/service.py.tmpl
@@ -40,7 +40,7 @@ class {{.PascalName}}{{if eq "List" .PascalName}}Request{{end}}:{{if .Description}}
{{else if .Enum}}class {{.PascalName}}(Enum):
{{if .Description}}"""{{.Comment " " 100 | trimSuffix "\"" }}"""{{end}}
{{range .Enum }}
{{.Content}}{{if eq .Content "None"}}_{{end}} = '{{.Content}}'{{end}}{{end}}
{{.ConstantName}} = '{{.Content}}'{{end}}{{end}}
{{end}}
{{- define "from_dict_type" -}}
{{- if not .Entity }}None
@@ -113,8 +113,8 @@ class {{.Name}}API:{{if .Description}}
def {{.SnakeName}}(self{{range .Binding}}, {{.PollField.SnakeName}}: {{template "type-nq" .PollField.Entity}}{{end}},
timeout=timedelta(minutes={{.Timeout}}), callback: Optional[Callable[[{{.Poll.Response.PascalName}}], None]] = None) -> {{.Poll.Response.PascalName}}:
deadline = time.time() + timeout.total_seconds()
target_states = ({{range .Success}}{{.Entity.PascalName}}.{{.Content}}, {{end}}){{if .Failure}}
failure_states = ({{range .Failure}}{{.Entity.PascalName}}.{{.Content}}, {{end}}){{end}}
target_states = ({{range .Success}}{{.Entity.PascalName}}.{{.ConstantName}}, {{end}}){{if .Failure}}
failure_states = ({{range .Failure}}{{.Entity.PascalName}}.{{.ConstantName}}, {{end}}){{end}}
status_message = 'polling...'
attempt = 1
while time.time() < deadline:
@@ -218,7 +218,7 @@ class {{.Name}}API:{{if .Description}}

{{define "method-call-paginated" -}}
{{if .Pagination.MultiRequest}}
{{if .Pagination.NeedsOffsetDedupe -}}
{{if .NeedsOffsetDedupe -}}
# deduplicate items that may have been added during iteration
seen = set()
{{- end}}{{if and .Pagination.Offset (not (eq .Path "/api/2.0/clusters/events")) }}
@@ -228,8 +228,8 @@ class {{.Name}}API:{{if .Description}}
if '{{.Pagination.Results.Name}}' not in json or not json['{{.Pagination.Results.Name}}']:
return
for v in json['{{.Pagination.Results.Name}}']:
{{if .Pagination.NeedsOffsetDedupe -}}
i = v['{{.Pagination.Entity.IdentifierField.Name}}']
{{if .NeedsOffsetDedupe -}}
i = v['{{.IdentifierField.Name}}']
if i in seen:
continue
seen.add(i)
21 changes: 19 additions & 2 deletions databricks/sdk/__init__.py


72 changes: 72 additions & 0 deletions databricks/sdk/_widgets/__init__.py
@@ -0,0 +1,72 @@
import logging
import typing
import warnings
from abc import ABC, abstractmethod


class WidgetUtils(ABC):

    def get(self, name: str):
        return self._get(name)

    @abstractmethod
    def _get(self, name: str) -> str:
        pass

    def getArgument(self, name: str, default_value: typing.Optional[str] = None):
        try:
            return self.get(name)
        except Exception:
            return default_value

    def remove(self, name: str):
        self._remove(name)

    @abstractmethod
    def _remove(self, name: str):
        pass

    def removeAll(self):
        self._remove_all()

    @abstractmethod
    def _remove_all(self):
        pass


try:
    # We only use ipywidgets if we are in an interactive notebook shell; otherwise we raise an
    # error and fall back to default_widgets. Users will have IPython in their notebooks
    # (Jupyter), because we do not support any other notebook backends, so its absence also
    # means falling back to default_widgets.
    from IPython.core.getipython import get_ipython

    # Detect if we are in an interactive notebook by iterating over the mro of the current
    # ipython instance, to find ZMQInteractiveShell (jupyter). When used from a REPL or file,
    # this check fails, since the mro only contains TerminalInteractiveShell.
    if len(list(filter(lambda i: i.__name__ == 'ZMQInteractiveShell', get_ipython().__class__.__mro__))) == 0:
        logging.debug("Not in an interactive notebook. Skipping ipywidgets implementation for dbutils.")
        raise EnvironmentError("Not in an interactive notebook.")

    # For import errors in IPyWidgetUtil, we provide a warning message, prompting users to
    # install the correct installation group of the sdk.
    try:
        from .ipywidgets_utils import IPyWidgetUtil

        widget_impl = IPyWidgetUtil
        logging.debug("Using ipywidgets implementation for dbutils.")

    except ImportError as e:
        # Since we are certain that we are in an interactive notebook, we can make assumptions
        # about formatting and make the warning nicer for the user.
        warnings.warn(
            "\nTo use databricks widgets interactively in your notebook, please install databricks sdk using:\n"
            "\tpip install 'databricks-sdk[notebook]'\n"
            "Falling back to default_value_only implementation for databricks widgets.")
        logging.debug(f"{e.msg}. Skipping ipywidgets implementation for dbutils.")
        raise e

except Exception:
    from .default_widgets_utils import DefaultValueOnlyWidgetUtils

    widget_impl = DefaultValueOnlyWidgetUtils
    logging.debug("Using default_value_only implementation for dbutils.")
42 changes: 42 additions & 0 deletions databricks/sdk/_widgets/default_widgets_utils.py
@@ -0,0 +1,42 @@
import typing

from . import WidgetUtils


class DefaultValueOnlyWidgetUtils(WidgetUtils):

    def __init__(self) -> None:
        self._widgets: typing.Dict[str, str] = {}

    def text(self, name: str, defaultValue: str, label: typing.Optional[str] = None):
        self._widgets[name] = defaultValue

    def dropdown(self,
                 name: str,
                 defaultValue: str,
                 choices: typing.List[str],
                 label: typing.Optional[str] = None):
        self._widgets[name] = defaultValue

    def combobox(self,
                 name: str,
                 defaultValue: str,
                 choices: typing.List[str],
                 label: typing.Optional[str] = None):
        self._widgets[name] = defaultValue

    def multiselect(self,
                    name: str,
                    defaultValue: str,
                    choices: typing.List[str],
                    label: typing.Optional[str] = None):
        self._widgets[name] = defaultValue

    def _get(self, name: str) -> str:
        return self._widgets[name]

    def _remove(self, name: str):
        del self._widgets[name]

    def _remove_all(self):
        self._widgets = {}
