# Promote Plugin Framework Share Resource to Production (#4846)
**tanmay-db** left a comment:
Hi @rauchy, thanks for the PR — could you please add a test covering the migration between SDKv2 and the Plugin Framework?
Since the resource is used by a lot of customers, we should make sure there are no behavior changes when migrating to the plugin framework.
https://developer.hashicorp.com/terraform/plugin/framework/migrating/testing#testing-migration
> During migration, we recommend writing tests to verify that switching from SDKv2 to the Framework has not affected your provider's behavior. These tests use identical configuration. The first test step applies a plan and generates state with the SDKv2 version of the provider. The second test step generates a plan with the Framework version of the provider and verifies that the plan is a no-op, indicating that migrating to the framework has not altered behaviour.
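Following that guide, such a migration test might look like the sketch below, built on `terraform-plugin-testing`. The config contents, the pinned version constraint, and the `testAccProtoV6ProviderFactories` helper name are assumptions for illustration, not the provider's actual test code:

```go
func TestAccShare_MigrateFromSDKv2(t *testing.T) {
	// Identical configuration is used for both steps.
	config := `
resource "databricks_share" "this" {
  name = "migration-test-share"
}`
	resource.Test(t, resource.TestCase{
		Steps: []resource.TestStep{
			{
				// Step 1: apply with the last SDKv2-backed release
				// pulled from the registry.
				ExternalProviders: map[string]resource.ExternalProvider{
					"databricks": {
						Source:            "databricks/databricks",
						VersionConstraint: "1.84.0", // assumed last SDKv2-backed version
					},
				},
				Config: config,
			},
			{
				// Step 2: plan with the plugin-framework build under
				// test and verify the plan is a no-op.
				ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, // assumed helper
				Config:                   config,
				ConfigPlanChecks: resource.ConfigPlanChecks{
					PreApply: []plancheck.PlanCheck{
						plancheck.ExpectEmptyPlan(),
					},
				},
			},
		},
	})
}
```

Running such a test requires `TF_ACC=1` and live workspace credentials, so it only executes in the acceptance-test environment.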
Good idea @tanmay-db! Added in 21cd6ae.
**NEXT_CHANGELOG.md** (outdated)

> * Added output attribute `endpoint_url` in `databricks_model_serving` ([#4877](https://github.com/databricks/terraform-provider-databricks/pull/4877)).
> * Deprecate `egg` library type in `databricks_cluster`, `databricks_job`, and `databricks_library` ([#4881](https://github.com/databricks/terraform-provider-databricks/pull/4881)).
> * Added output attribute `endpoint_url` in `databricks_model_serving` ([#4877](https://github.com/databricks/terraform-provider-databricks/pull/4877)).

This is a duplicate of line 9.
**exporter/importables.go** (outdated)

> `switch obj.DataObjectType {`
>
> `// Get objects array from the resource data`
> `objectsList := r.Data.Get("object").([]any)`

`r.Data` will still be using the SDKv2 implementation under the hood, so I'm not sure there is a benefit to using `.Get` to fetch raw data instead of calling `common.DataToStructPointer`.
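For context, the suggested alternative decodes the whole resource into the Go SDK's typed struct via the provider's reflection helper instead of reading raw SDKv2 values. A rough sketch — the `shareSchema` variable name is an assumption, and the fields follow the Go SDK's `sharing.ShareInfo`:

```go
// Decode the resource data into the typed struct rather than
// pulling the raw "object" list with r.Data.Get.
var share sharing.ShareInfo
common.DataToStructPointer(r.Data, shareSchema, &share) // shareSchema: assumed schema map
for _, obj := range share.Objects {
	switch obj.DataObjectType {
	// ... handle each shared data object type
	}
}
```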
If integration tests don't run automatically, an authorized user can run them manually by following the instructions below. Checks will be approved automatically on success.
## Release v1.86.0

### Breaking Changes

* Do not set `run_as` from `run_as_user_name` in DLT pipelines. This fixes an issue where the value for `run_as` was unintentionally cached in the Terraform state. More details and the workaround are specified in the PR ([#4886](#4886)).

### New Features and Improvements

* Don't redeploy `databricks_sql_table` for tables with struct subcolumns ([#4001](#4001)).
* Added ability to add `comment` when defining a new `databricks_share` ([#4802](#4802)).
* Added output attribute `endpoint_url` in `databricks_model_serving` ([#4877](#4877)).
* Deprecate `egg` library type in `databricks_cluster`, `databricks_job`, and `databricks_library` ([#4881](#4881)).
* Support `databricks_service_principal_secret` on workspace level ([#4896](#4896)).
* Added resources and data sources for `databricks_clean_room_asset`, `databricks_clean_room_auto_approval_rule` and `databricks_clean_room_asset_revisions_clean_room_asset` ([#4907](#4907)).

### Bug Fixes

* Corrected accidentally removed `SpID` field from `databricks_service_principal` ([#4868](#4868)).
* Corrected optional fields in `databricks_mws_ncc_private_endpoint_rule` ([#4856](#4856)).
* Fix handling of `force` option in `databricks_git_credential` ([#4873](#4873)).
* Restricted create or replace statement to managed tables in `databricks_sql_table` ([#4874](#4874)).
* Mitigate issue due to internal caching in `databricks_secret_acl` by retrying until ACLs are applied with the right permission ([#4885](#4885)).
* Fix schema mismatch bug in `databricks_functions` data source ([#4902](#4902)).
* Set `suppressDiff` on `string_shared_as` in the legacy `databricks_share` resource ([#4904](#4904)).

### Documentation

* Updated `share` documentation to be more in line with Terraform styling ([#4802](#4802)).
* Refreshed `databricks_job` documentation ([#4861](#4861)).
* Document `environment` block in `databricks_pipeline` ([#4878](#4878)).
* Updated documentation for `databricks_disable_legacy_dbfs_setting` resource ([#4870](#4870)).
* Add deprecation notice to `databricks_dbfs_file` and `databricks_mount` ([#4876](#4876)).
* Updated documentation for `databricks_disable_legacy_features_setting` resource ([#4884](#4884)).
* Improve docs for `databricks_compliance_security_profile_setting` ([#4880](#4880)).
* Improve instructions for the Terraform Exporter ([#4892](#4892)).
* Improve documentation for service principal data sources ([#4900](#4900)).
* Add warning about disabling legacy features and default catalog ([#4905](#4905)).
* Improve documentation for grants resources ([#4906](#4906)).

### Exporter

* Added support for exporting of workspaces and related resources ([#4899](#4899)).

### Internal Changes

* Promote Plugin Framework Share Resource to Production ([#4846](#4846)).
* Update Go SDK to v0.79.0 ([#4907](#4907)).
…s default (#4931)

## Changes

This PR:

- reverts #4846.
- Adds a note under breaking changes to highlight the issue.
- Moves the next release to a patch.

**Context**

The `databricks_share` resource exists in two implementations: SDKv2 and Plugin Framework. In #4846, the plugin framework implementation (`databricks_share_pluginframework`) was made the default (`databricks_share`), but this surfaced an underlying issue in the plugin framework implementation, leading to a panic: #4913

```
Stack trace from the terraform-provider-databricks_v1.86.0 plugin:

panic: runtime error: index out of range [0] with length 0

goroutine 68 [running]:
github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing.(*ShareResource).syncEffectiveFields(_, {_, _}, {{{0x2, {0xc0007f6b40, 0x22}}, {0x2, 0x196f67034f0}, {0x2, {0xc0007f6b70, ...}}, ...}}, ...)
	/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing/resource_share.go:449 +0x48e
github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing.(*ShareResource).Read(0xc000527240, {0x1c50398?, 0xc000a31890?}, {{{{0x1c5b670, 0xc000a4c150}, {0x16fd920, 0xc000a41e30}}, {0x1c61390, 0xc0009b1ef0}}, 0x0, ...}, ...)
	/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing/resource_share.go:281 +0x765
```

This leaves the default share resource with an unstable implementation. A panic can leave bad state in Terraform, since the apply is stopped midway. This change will be followed up by a patch release.

- No changes for users on version 1.85.0 and below, who will upgrade to a patch release including this fix.
- Limit the impacted users by moving back to SDKv2. Impacted users hitting the panic seem to have been able to mitigate it by using an older provider version that uses SDKv2.

However, I am getting a diff to create the share again. It seems SDKv2 is not able to use the state created by the plugin framework. This needs further investigation. To limit the impact of the issue, we are rolling back to the previous stable version.

## Tests

Same tests as when SDKv2 was the default.
## Changes

This PR promotes the plugin framework implementation of the `databricks_share` resource to be the default implementation, replacing the SDKv2 version. `databricks_share` now resolves to the plugin framework implementation, with an opt-out mechanism available for those who wish to continue using the legacy SDKv2 version.

## Rollback Instructions
If you encounter issues with the new plugin framework implementation, you can temporarily revert to the SDK v2 implementation by setting an environment variable:
`export USE_SDK_V2_RESOURCES=databricks_share`

For data sources, use:

`export USE_SDK_V2_DATA_SOURCES=databricks_share,databricks_shares`
## Tests

Updated all test files to use `databricks_share` instead of `databricks_share_pluginframework`.

Checklist:

- `make test` run locally
- relevant change in `docs/` folder
- covered with integration tests in `internal/acceptance`
- has entry in `NEXT_CHANGELOG.md` file