Merge main into dev/sdk-mod #936
Merged
Conversation
NO_CHANGELOG=true

## What changes are proposed in this pull request?
Sync automated tagging files. This includes a bug fix where the config for the version value location was being loaded from the wrong file.

## How is this tested?
N/A

Co-authored-by: Renaud Hartert <[email protected]>
## What changes are proposed in this pull request?
This PR cleans up the `setup.cfg` file by removing unnecessary YAPF formatting configuration. Since the project now uses Black for formatting, these YAPF-specific settings were redundant and potentially confusing.

## How is this tested?
Existing tests.

NO_CHANGELOG=true
## What changes are proposed in this pull request?
Update the Python SDK with the latest OpenAPI spec.

## How is this tested?
Existing tests.
## What changes are proposed in this pull request?
Add support for async token refresh:
* Add an experimental setting to enable async token refresh.
* Regenerate the SDK with the latest templates, which use the new DataPlane token source.
* Remove the old DataPlane code.

This allows users to use DataPlane endpoints without a performance penalty during token refresh.

## How is this tested?
Added an integration test.
## Release v0.46.0

### New Features and Improvements

* [Experimental] Add support for async token refresh ([#916](#916)). This can be enabled by setting the following environment variable:

  ```
  export DATABRICKS_ENABLE_EXPERIMENTAL_ASYNC_TOKEN_REFRESH=1
  ```

  This feature and its setting are experimental and may be removed in future releases.

### API Changes

* Added [w.forecasting](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/ml/forecasting.html) workspace-level service.
* Added `statement_id` field for `databricks.sdk.service.dashboards.GenieQueryAttachment`.
* Added `could_not_get_model_deployments_exception` enum value for `databricks.sdk.service.dashboards.MessageErrorType`.
* [Breaking] Removed `jwks_uri` field for `databricks.sdk.service.oauth2.OidcFederationPolicy`.
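For reference, a minimal sketch (not taken from the release notes) of opting into the experimental flag from a Python script. It assumes the environment variable must be set before the client reads its configuration; the client construction relies on whatever auth resolution is configured and is only illustrative:

```py
import os

# Assumption: the flag is read when the SDK loads its configuration,
# so set it before constructing the client.
os.environ["DATABRICKS_ENABLE_EXPERIMENTAL_ASYNC_TOKEN_REFRESH"] = "1"

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # uses the usual env-var/profile auth resolution
print(w.current_user.me())
```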
## What changes are proposed in this pull request?
This PR updates the version constant that the automated tagging workflow failed to bump.

## How is this tested?
N/A

NO_CHANGELOG=true
Updates the requirements on [ipython](https://github.com/ipython/ipython) to permit the latest version.

Commits:
* [`d64897c`](https://github.com/ipython/ipython/commit/d64897cf070ac6f8d75d84fd31f7ddccc1c7adfc) release 9.0.1
* [`77be835`](https://github.com/ipython/ipython/commit/77be8351c92adcb6fc00b77e5fa98bed2ec8929f) Partial readd of termcolors to please ipyparallel. ([#14814](https://redirect.github.com/ipython/ipython/issues/14814))
* [`343d43d`](https://github.com/ipython/ipython/commit/343d43da35d587cd95856e2f5582e6eac53e3b57) Try to fix [#4811](https://redirect.github.com/ipython/ipython/issues/4811) (crash ipdb) ([#14813](https://redirect.github.com/ipython/ipython/issues/14813))
* [`b28af9b`](https://github.com/ipython/ipython/commit/b28af9b132d2f7173b3af00b58ee02f5b45309a2) Partial readd of termcolors to please ipyparallel.
* [`b82061c`](https://github.com/ipython/ipython/commit/b82061c2d6a4273f1b9f6245b442a5c4d637a27d) Try to fix [#4811](https://redirect.github.com/ipython/ipython/issues/4811).
* [`d627d00`](https://github.com/ipython/ipython/commit/d627d00585e9bba697caf8155ebcb7fe70e0a181) 9.x autoreload fix ([#14807](https://redirect.github.com/ipython/ipython/issues/14807))
* [`ec6f9f7`](https://github.com/ipython/ipython/commit/ec6f9f72bb64cb8854e5657b6b84fce85423e52f) fix: add `__init__` so dedupereload is properly included in build
* [`f5661a0`](https://github.com/ipython/ipython/commit/f5661a01d16a76e32bbc66791c8957e091759a53) back to dev
* [`7bcd52a`](https://github.com/ipython/ipython/commit/7bcd52a30d88716a868f1ce7eb5a32c8b6692125) release 9.0.0
* [`9e067fb`](https://github.com/ipython/ipython/commit/9e067fb012bd43d5500b2d04544f754bb1839da9) update what's new version 9 + windows fixes ([#14805](https://redirect.github.com/ipython/ipython/issues/14805))
* Additional commits viewable in the [compare view](https://github.com/ipython/ipython/compare/8.0.0...9.0.1)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options (trigger by commenting on this PR):
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

NO_CHANGELOG=true

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Renaud Hartert <[email protected]>
## What changes are proposed in this pull request?
Update codegen to match the new style.

## How is this tested?
Not tested.

NO_CHANGELOG=true
…ser` credentials strategy (#931)

## What changes are proposed in this pull request?
Today, the `external-browser` credentials strategy allows users to dynamically fetch an OAuth token without depending on external tools like the Databricks CLI. However, the requested scope is hard-coded to always be `all-apis`. As a result, after successfully authenticating, the authorization server will return only an access token and no refresh token. The access token will expire after an hour, and attempts to refresh the token will fail.

This PR adds the `offline_access` scope to the default requested scopes during this flow. This matches the request made by the CLI in the `databricks auth login` flow. The resulting token includes a refresh token.

## How is this tested?
I manually tested this by deleting the token cache at `~/.config/databricks-sdk-py/oauth` and then running the following script:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(host="https://adb-2548836972759138.18.azuredatabricks.net", auth_type="external-browser")
print(w.current_user.me())
```

The resulting token is saved to the OAuth cache and includes the refresh token:

```
{"token": {"access_token": "eyJraWQiOiIy...", "token_type": "Bearer", "expiry": "2025-03-20T11:00:25.242011", "refresh_token": "doau6c1a..."}}
```

To force a refresh, I manually updated the expiry time to be in the past and reran the script. The token was refreshed and the script succeeded:

```
DEBUG:databricks.sdk.oauth:Retrieving token for databricks-cli
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): adb-2548836972759138.18.azuredatabricks.net:443
DEBUG:urllib3.connectionpool:https://adb-2548836972759138.18.azuredatabricks.net:443 "POST /oidc/v1/token HTTP/1.1" 200 None
```
## What changes are proposed in this pull request?
Update the Python SDK to the latest commit of the Databricks REST API.

## How is this tested?
Existing tests.
## Release v0.47.0

### Bug Fixes

* Ensure that refresh tokens are returned when using the `external-browser` credentials strategy.

### API Changes

* Added `abfss`, `dbfs`, `error_message`, `execution_duration_seconds`, `file`, `gcs`, `s3`, `status`, `volumes` and `workspace` fields for `databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails`.
* [Breaking] Added `forecast_granularity` field for `databricks.sdk.service.ml.CreateForecastingExperimentRequest`.
* Added `jwks_uri` field for `databricks.sdk.service.oauth2.OidcFederationPolicy`.
* Added `fallback_config` field for `databricks.sdk.service.serving.AiGatewayConfig`.
* Added `custom_provider_config` field for `databricks.sdk.service.serving.ExternalModel`.
* Added `fallback_config` field for `databricks.sdk.service.serving.PutAiGatewayRequest`.
* Added `fallback_config` field for `databricks.sdk.service.serving.PutAiGatewayResponse`.
* Added `aliases`, `comment`, `data_type`, `dependency_list`, `full_data_type`, `id`, `input_params`, `name`, `properties`, `routine_definition`, `schema`, `securable_kind`, `share`, `share_id`, `storage_location` and `tags` fields for `databricks.sdk.service.sharing.DeltaSharingFunction`.
* Added `access_token_failure`, `allocation_timeout`, `allocation_timeout_node_daemon_not_ready`, `allocation_timeout_no_healthy_clusters`, `allocation_timeout_no_matched_clusters`, `allocation_timeout_no_ready_clusters`, `allocation_timeout_no_unallocated_clusters`, `allocation_timeout_no_warmed_up_clusters`, `aws_inaccessible_kms_key_failure`, `aws_instance_profile_update_failure`, `aws_invalid_key_pair`, `aws_invalid_kms_key_state`, `aws_resource_quota_exceeded`, `azure_packed_deployment_partial_failure`, `bootstrap_timeout_due_to_misconfig`, `budget_policy_limit_enforcement_activated`, `budget_policy_resolution_failure`, `cloud_account_setup_failure`, `cloud_operation_cancelled`, `cloud_provider_instance_not_launched`, `cloud_provider_launch_failure_due_to_misconfig`, `cloud_provider_resource_stockout_due_to_misconfig`, `cluster_operation_throttled`, `cluster_operation_timeout`, `control_plane_request_failure_due_to_misconfig`, `data_access_config_changed`, `disaster_recovery_replication`, `driver_eviction`, `driver_launch_timeout`, `driver_node_unreachable`, `driver_out_of_disk`, `driver_out_of_memory`, `driver_pod_creation_failure`, `driver_unexpected_failure`, `dynamic_spark_conf_size_exceeded`, `eos_spark_image`, `executor_pod_unscheduled`, `gcp_api_rate_quota_exceeded`, `gcp_forbidden`, `gcp_iam_timeout`, `gcp_inaccessible_kms_key_failure`, `gcp_insufficient_capacity`, `gcp_ip_space_exhausted`, `gcp_kms_key_permission_denied`, `gcp_not_found`, `gcp_resource_quota_exceeded`, `gcp_service_account_access_denied`, `gcp_service_account_not_found`, `gcp_subnet_not_ready`, `gcp_trusted_image_projects_violated`, `gke_based_cluster_termination`, `init_container_not_finished`, `instance_pool_max_capacity_reached`, `instance_pool_not_found`, `instance_unreachable_due_to_misconfig`, `internal_capacity_failure`, `invalid_aws_parameter`, `invalid_instance_placement_protocol`, `invalid_worker_image_failure`, `in_penalty_box`, `lazy_allocation_timeout`, `maintenance_mode`, `netvisor_setup_timeout`, `no_matched_k8s`, `no_matched_k8s_testing_tag`, `pod_assignment_failure`, `pod_scheduling_failure`, `resource_usage_blocked`, `secret_creation_failure`, `serverless_long_running_terminated`, `spark_image_download_throttled`, `spark_image_not_found`, `ssh_bootstrap_failure`, `storage_download_failure_due_to_misconfig`, `storage_download_failure_slow`, `storage_download_failure_throttled`, `unexpected_pod_recreation`, `user_initiated_vm_termination` and `workspace_update` enum values for `databricks.sdk.service.compute.TerminationReasonCode`.
* Added `generated_sql_query_too_long_exception` and `missing_sql_query_exception` enum values for `databricks.sdk.service.dashboards.MessageErrorType`.
* Added `balanced` enum value for `databricks.sdk.service.jobs.PerformanceTarget`.
* Added `listing_resource` enum value for `databricks.sdk.service.marketplace.FileParentType`.
* Added `app` enum value for `databricks.sdk.service.marketplace.MarketplaceFileType`.
* Added `custom` enum value for `databricks.sdk.service.serving.ExternalModelProvider`.
* [Breaking] Changed `create_experiment()` method for [w.forecasting](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/ml/forecasting.html) workspace-level service with new required argument order.
* Changed `instance_type_id` field for `databricks.sdk.service.compute.NodeInstanceType` to be required.
* Changed `category` field for `databricks.sdk.service.compute.NodeType` to be required.
* [Breaking] Changed `functions` field for `databricks.sdk.service.sharing.ListProviderShareAssetsResponse` to type `databricks.sdk.service.sharing.DeltaSharingFunctionList` dataclass.
* [Breaking] Changed waiter for [ClustersAPI.create](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.create) method.
* [Breaking] Changed waiter for [ClustersAPI.delete](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.delete) method.
* [Breaking] Changed waiter for [ClustersAPI.edit](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.edit) method.
* [Breaking] Changed waiter for [ClustersAPI.get](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.get) method.
* [Breaking] Changed waiter for [ClustersAPI.resize](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.resize) method.
* [Breaking] Changed waiter for [ClustersAPI.restart](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.restart) method.
* [Breaking] Changed waiter for [ClustersAPI.start](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.start) method.
* [Breaking] Changed waiter for [ClustersAPI.update](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.update) method.
* [Breaking] Removed `execution_details` and `script` fields for `databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails`.
* [Breaking] Removed `supports_elastic_disk` field for `databricks.sdk.service.compute.NodeType`.
* [Breaking] Removed `data_granularity_quantity` and `data_granularity_unit` fields for `databricks.sdk.service.ml.CreateForecastingExperimentRequest`.
* [Breaking] Removed `aliases`, `comment`, `data_type`, `dependency_list`, `full_data_type`, `id`, `input_params`, `name`, `properties`, `routine_definition`, `schema`, `securable_kind`, `share`, `share_id`, `storage_location` and `tags` fields for `databricks.sdk.service.sharing.Function`.
## What changes are proposed in this pull request?
Integration tests for the Python SDK are run once for each environment (a grid covering cloud, account/workspace, and with or without UC). All tests are executed by the test runner in each environment, even though most tests only work in a specific context (e.g. you can't run a test that depends on a catalog in a workspace with no catalog).

Each test uses pytest fixtures to express its dependencies. One of these is the Databricks client to use for the test. There are four options, depending on whether the test is meant to be run at the workspace or account level and with or without UC: `w` for workspace-level without UC, `ucws` for workspace-level with UC, `a` for account-level without UC, and `ucacct` for account-level with UC. Any test in the `tests/integration` directory that uses one of these clients automatically gets a marker called `integration` added to it, and `make integration` runs these tests.

However, tests using the `ucacct` client are currently skipped by pytest because they are not annotated with the `integration` marker. This is fixed by adding `ucacct` to the list of fixtures whose presence indicates that a test is an integration test (see the sketch after this section).

## How is this tested?
- [ ] Run integration tests, and verify that `ucacct` tests are executed after this change.

NO_CHANGELOG=true
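A minimal sketch (not the SDK's actual conftest) of how fixture-based auto-marking like this can be done in a `conftest.py`; the hook and marker API are standard pytest, and the fixture names mirror those described above:

```py
import pytest

# Fixtures whose presence marks a test as an integration test. The bug fixed
# by this PR is equivalent to `ucacct` having been missing from this set.
_CLIENT_FIXTURES = {"w", "ucws", "a", "ucacct"}


def pytest_collection_modifyitems(config, items):
    for item in items:
        # `fixturenames` lists every fixture a test requests, directly or indirectly.
        if _CLIENT_FIXTURES & set(getattr(item, "fixturenames", [])):
            item.add_marker(pytest.mark.integration)
```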
## What changes are proposed in this pull request?
This PR merges main into dev/sdk-mod.

## How is this tested?
Existing tests.