Adds `databricks_volume` as data source #3211
Conversation
Please add docs as well
Codecov Report

Attention: Patch coverage is

```diff
@@            Coverage Diff             @@
##             main    #3211      +/-   ##
==========================================
+ Coverage   82.55%   82.56%   +0.01%
==========================================
  Files         186      188       +2
  Lines       19131    19177      +46
==========================================
+ Hits        15794    15834      +40
- Misses       2411     2415       +4
- Partials      926      928       +2
```
Added docs, tests and resolved review comments.
Formatting fixed.
Small refactor to take advantage of #3207, avoiding having to add new functions. Tests and docs added.
@alexott the requested changes were addressed. Please let me know if you see anything else outstanding.
In general looks good, just one comment
Update: Merged the latest changes from main and made adjustments for the latest version of the Go SDK.
relevant integration test has passed
@nkvuong @mgyucht I think that we need to discuss the unified approach to the data sources implementation - the
We discussed this in the Slack thread (mentioning here for visibility), but since most of the data sources use a nested structure, I think we should use that here as well for consistency.
The changes were made to comply with the approach of having data sources as nested structures.
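To illustrate what the nested-structure approach means for consumers, reading a volume property would go through a single nested attribute rather than top-level fields. This is a sketch only: the nested attribute name (`volume_info`) and the field shown are assumptions mirroring other Unity Catalog data sources, not confirmed by this PR.

```hcl
data "databricks_volume" "this" {
  full_name = "main.default.my_volume" # placeholder volume name
}

# Hypothetical: with the nested structure, volume properties live under one
# nested attribute (assumed here to be `volume_info`).
output "volume_storage_location" {
  value = data.databricks_volume.this.volume_info.storage_location
}
```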
Small changes are still required.
Integration tests are failing... I think that it's ok to remove that check to match the test in the `databricks_table` data source: https://github.com/databricks/terraform-provider-databricks/blob/main/internal/acceptance/data_table_test.go#L11
Integration test passed
Thank you for the contribution, @karolusz!
### Internal Changes
* Add Release tag ([#3748](#3748)).
* Improve Changelog by grouping changes ([#3747](#3747)).
* Upgrade Go SDK to v0.43.2 ([#3750](#3750)).

### Other Changes
* Add `databricks_schema` data source ([#3732](#3732)).
* Add new APIErrorBody struct and update deps ([#3745](#3745)).
* Added support for binding storage credentials and external locations to specific workspaces ([#3678](#3678)).
* Adds `databricks_volume` as data source ([#3211](#3211)).
* Change TF registry ownership ([#3736](#3736)).
* Exporter: Emit directories during the listing only if they are explicitly configured in `-listing` ([#3673](#3673)).
* Exporter: export libraries specified as `requirements.txt` ([#3649](#3649)).
* Exporter: fix generation of `run_as` blocks in `databricks_job` ([#3724](#3724)).
* Exporter: use Go SDK structs for `databricks_job` resource ([#3727](#3727)).
* Fix invalid priviledges in grants.md ([#3716](#3716)).
* Make the schedule.pause_status field read-only ([#3692](#3692)).
* Refactored `databricks_cluster(s)` data sources to Go SDK ([#3685](#3685)).
* Renamed `databricks_catalog_workspace_binding` to `databricks_workspace_binding` ([#3703](#3703)).
* Run goreleaser action in snapshot mode from merge queue ([#3646](#3646)).
* Update cluster.md: add data_security_mode parameters `NONE` and `NO_ISOLATION` ([#3740](#3740)).
* Upgrade databricks-sdk-go ([#3743](#3743)).
* Remove references to basic auth ([#3720](#3720)).
## 1.49.0

### New Features and Improvements
* Added `databricks_dashboard` resource ([#3729](#3729)).
* Added `databricks_schema` data source ([#3732](#3732)).
* Added support for binding storage credentials and external locations to specific workspaces ([#3678](#3678)).
* Added `databricks_volume` as data source ([#3211](#3211)).
* Make the `schedule.pause_status` field read-only ([#3692](#3692)).
* Renamed `databricks_catalog_workspace_binding` to `databricks_workspace_binding` ([#3703](#3703)).
* Make `cluster_name_contains` optional in `databricks_clusters` data source ([#3760](#3760)).
* Tolerate OAuth errors in databricks_mws_workspaces when managing tokens ([#3761](#3761)).
* Permissions for `databricks_dashboard` resource ([#3762](#3762)).

### Exporter
* Emit directories during the listing only if they are explicitly configured in `-listing` ([#3673](#3673)).
* Export libraries specified as `requirements.txt` ([#3649](#3649)).
* Fix generation of `run_as` blocks in `databricks_job` ([#3724](#3724)).
* Use Go SDK structs for `databricks_job` resource ([#3727](#3727)).
* Clarify use of `-listing` and `-services` options ([#3755](#3755)).
* Improve code generation for SQL Endpoints ([#3764](#3764)).

### Documentation
* Fix invalid priviledges in grants.md ([#3716](#3716)).
* Update cluster.md: add data_security_mode parameters `NONE` and `NO_ISOLATION` ([#3740](#3740)).
* Remove references to basic auth ([#3720](#3720)).
* Update resources diagram ([#3765](#3765)).

### Internal Changes
* Add Release tag ([#3748](#3748)).
* Improve Changelog by grouping changes ([#3747](#3747)).
* Change TF registry ownership ([#3736](#3736)).
* Refactored `databricks_cluster(s)` data sources to Go SDK ([#3685](#3685)).
* Upgrade databricks-sdk-go ([#3743](#3743)).
* Run goreleaser action in snapshot mode from merge queue ([#3646](#3646)).
* Make `dashboard_name` random in integration tests for `databricks_dashboard` resource ([#3763](#3763)).
* Clear stale go.sum values ([#3768](#3768)).
* Add "Owner" tag to test cluster in acceptance test ([#3771](#3771)).
* Fix integration test for restrict workspace admins setting ([#3772](#3772)).
* Add "Owner" tag to test SQL endpoint in acceptance test ([#3774](#3774)).
* Move PR message validation to a separate workflow ([#3777](#3777)).
* Trigger the validate workflow in the merge queue ([#3782](#3782)).
* Update properties for managed SQL table on latest DBR ([#3784](#3784)).
* Add "Owner" tag to test SQL endpoint in acceptance test ([#3785](#3785)).
Changes

Adds `databricks_volume` data source. The data source takes the volume's `full_name` or `catalog_name`, `schema_name`, `name` as user input.
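As a usage sketch, the two input modes described above would look like the following (volume names and values are placeholders, not from the PR):

```hcl
# Look up a volume by its full three-level name.
data "databricks_volume" "by_full_name" {
  full_name = "main.default.my_volume"
}

# Or, equivalently, by its individual components.
data "databricks_volume" "by_parts" {
  catalog_name = "main"
  schema_name  = "default"
  name         = "my_volume"
}
```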
I tried to use the schema customization from `common.customizable_schema.go` when defining the data source's parameters. The following changes were made in `common/resource.go`, as neither the functions wrapping `genericDatabricksData` nor `genericDatabricksData` itself allowed passing a customization function:
- `genericDatabricksData` now takes a schema customization function for customizing `otherFields`, which is overlaid on top of the SDK schema.
- A new function `WorkspaceDataWithCustomParams` was added that wraps `genericDatabricksData` for cases where a schema customization function is desired.
- `WorkspaceData`, `WorkspaceDataWithParams`, `AccountData`, `AccountDataWithParams` were adjusted to pass `NoCustomize` to the updated `genericDatabricksData`.

The above changes were taken out because of #3207.
Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [x] relevant acceptance tests in `internal/acceptance`