@@ -19,6 +19,9 @@ github.com/databricks/cli/bundle/config/resources.App:
   "description":
     "description": |-
       The description of the app.
+  "id":
+    "description": |-
+      The unique identifier of the app.
   "name":
     "description": |-
       The name of the app. The name must contain only lowercase alphanumeric characters and hyphens.
@@ -67,7 +70,7 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
   "cluster_log_conf":
     "description": |-
       The configuration for delivering spark logs to a long-term storage destination.
-      Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+      Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
       for one cluster. If the conf is given, the logs will be delivered to the destination every
       `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
       the destination of executor logs is `$destination/$clusterId/executor`.
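For context, a minimal sketch of how the newly documented Unity Catalog volumes destination would look in a bundle cluster definition; the resource name, spark version, node type, and volume path are hypothetical placeholders, not values from this commit:

```yaml
# Hedged sketch: a bundle cluster shipping its logs to a UC volume.
# All names and the /Volumes path below are illustrative placeholders.
resources:
  clusters:
    my_cluster:
      spark_version: 15.4.x-scala2.12
      node_type_id: i3.xlarge
      num_workers: 2
      cluster_log_conf:
        volumes:
          destination: /Volumes/main/default/logs/cluster_log
```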
@@ -1009,6 +1012,10 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterLogConf:
       `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }`
       Cluster iam role is used to access s3, please make sure the cluster iam role in
       `instance_profile_arn` has permission to write data to the s3 destination.
+  "volumes":
+    "description": |-
+      destination needs to be provided. e.g.
+      `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
 github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
   "apply_policy_default_values":
     "description": |-
@@ -1034,7 +1041,7 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
   "cluster_log_conf":
     "description": |-
       The configuration for delivering spark logs to a long-term storage destination.
-      Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+      Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
       for one cluster. If the conf is given, the logs will be delivered to the destination every
       `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
       the destination of executor logs is `$destination/$clusterId/executor`.
@@ -1428,7 +1435,7 @@ github.com/databricks/databricks-sdk-go/service/compute.S3StorageInfo:
 github.com/databricks/databricks-sdk-go/service/compute.VolumesStorageInfo:
   "destination":
     "description": |-
-      Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+      Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 github.com/databricks/databricks-sdk-go/service/compute.WorkloadType:
   "clients":
     "description": |2-
@@ -2985,7 +2992,7 @@ github.com/databricks/databricks-sdk-go/service/serving.ExternalModel:
       PaLM Config. Only required if the provider is 'palm'.
   "provider":
     "description": |-
-      The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', and 'palm'.
+      The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', 'palm', and 'custom'.
   "task":
     "description": |-
       The task type of the external model.
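A sketch of how an external model using one of the listed providers fits into a bundle serving-endpoint resource; the endpoint name, entity name, model name, and secret reference are hypothetical placeholders:

```yaml
# Hedged sketch: a model serving endpoint fronting an external
# Anthropic model. Names and the secret scope/key are illustrative.
resources:
  model_serving_endpoints:
    my_external_endpoint:
      name: my-external-endpoint
      config:
        served_entities:
          - name: claude-entity
            external_model:
              name: claude-3-5-sonnet-20240620
              provider: anthropic
              task: llm/v1/chat
              anthropic_config:
                anthropic_api_key: "{{secrets/my_scope/anthropic_key}}"
```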