[BUG] KeyError: 'sparkConf' occurs when running a Databricks task without spark_conf #3263
Tracking issue
Closes flyteorg/flyte#6471
Why are the changes needed?
Using the Databricks connector without setting the optional spark_conf field causes KeyError: 'sparkConf'.
What changes were proposed in this pull request?
If sparkConf is empty, use {} instead.
How was this patch tested?
Added unit tests for DatabricksV2 with and without spark_conf.
Setup process
Screenshots
Check all the applicable boxes
Related PRs
Docs link
Summary by Bito
This pull request resolves a KeyError in Databricks tasks by using an empty dictionary when the optional spark_conf field is absent. It also includes unit tests for the new DatabricksV2 configuration to ensure functionality in both scenarios.
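The fix described above amounts to replacing a direct dictionary lookup with a defaulted one. A minimal sketch of the pattern, using hypothetical function and key names (the actual connector code structure is not shown in this PR description):

```python
# Hypothetical illustration of the fix: when a task is defined without
# spark_conf, the serialized custom config has no "sparkConf" key at all.

def build_databricks_job(custom: dict) -> dict:
    """Build a job payload, tolerating a missing 'sparkConf' key."""
    # Before the fix: custom["sparkConf"] raised KeyError for tasks
    # defined without spark_conf.
    spark_conf = custom.get("sparkConf", {})  # fall back to an empty dict
    return {
        "databricks_conf": custom.get("databricksConf", {}),
        "spark_conf": spark_conf,
    }

# A task without spark_conf no longer raises:
job = build_databricks_job({"databricksConf": {"num_workers": 1}})
assert job["spark_conf"] == {}
```

Using `dict.get` with a `{}` default keeps the downstream payload shape stable in both scenarios, which is what the added unit tests exercise.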