dependency check, set maxJDKversion to 17. (due to hadoop upgrade) #33
Triggered via push November 13, 2024 15:44
Status Cancelled
Total duration 1h 26m 43s
Artifacts 10

build_main.yml

on: push
Run / Base image build · 50s
Run / Breaking change detection with Buf (branch-3.4) · 2m 38s
Run / Scala 2.13 build with SBT · 18m 2s
Run / Run TPC-DS queries with SF=1 · 50m 8s
Run / Run Docker integration tests · 31m 6s
Run / Run Spark on Kubernetes Integration test · 59m 37s
Matrix: Run / build
Matrix: Run / java-11-17
Run / Build modules: sparkr · 14s
Run / Linters, licenses, dependencies and documentation generation · 20s
Matrix: Run / pyspark
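The run title notes that the dependency check now caps the supported JDK at 17 because of the Hadoop upgrade, which matches the java-11-17 matrix above. As a rough sketch, such a cap could be expressed in build_main.yml like this (job and key names here are illustrative, not copied from the actual workflow):

  # Hypothetical excerpt: build only on JDKs up to 17, since the
  # upgraded Hadoop dependency is not yet validated on newer JDKs.
  jobs:
    java-build:
      strategy:
        matrix:
          java: [11, 17]   # effectively maxJDKversion = 17
      steps:
        - uses: actions/checkout@v3
        - uses: actions/setup-java@v3
          with:
            distribution: temurin
            java-version: ${{ matrix.java }}
        - run: ./build/mvn -DskipTests package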

Annotations

47 errors and 37 warnings
Run / Base image build · The process '/usr/bin/docker' failed with exit code 1
Run / Build modules: pyspark-errors · Value cannot be null. (Parameter 'ContainerId') (×2); Docker pull failed with exit code 1
Run / Build modules: pyspark-pandas-slow · Docker pull failed with exit code 1; Value cannot be null. (Parameter 'ContainerId') (×2)
Run / Build modules: pyspark-pandas-connect · Docker pull failed with exit code 1; Value cannot be null. (Parameter 'ContainerId') (×2)
Run / Build modules: pyspark-sql, pyspark-mllib, pyspark-resource, pyspark-testing · Value cannot be null. (Parameter 'ContainerId') (×2); Docker pull failed with exit code 1
Run / Build modules: pyspark-pandas · Docker pull failed with exit code 1; Value cannot be null. (Parameter 'ContainerId') (×2)
Run / Build modules: sparkr · Value cannot be null. (Parameter 'ContainerId'); Docker pull failed with exit code 1
Run / Build modules: pyspark-connect · Value cannot be null. (Parameter 'ContainerId') (×2); Docker pull failed with exit code 1
Run / Build modules: pyspark-pandas-slow-connect · Value cannot be null. (Parameter 'ContainerId') (×2); Docker pull failed with exit code 1
Run / Linters, licenses, dependencies and documentation generation · Docker pull failed with exit code 1
Run / Build modules: pyspark-core, pyspark-streaming, pyspark-ml · Docker pull failed with exit code 1; Value cannot be null. (Parameter 'ContainerId') (×2)
Run / Run Spark on Kubernetes Integration test:
Set() did not contain "decomtest-f3da4893264a9e56-exec-1".
Set() did not contain "decomtest-80fbb693264b7b7e-exec-1".
sleep interrupted (×2)
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$544/1630211971@48b93123 rejected from java.util.concurrent.ThreadPoolExecutor@1969b01d[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 362]
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$544/1630211971@23dc249 rejected from java.util.concurrent.ThreadPoolExecutor@1969b01d[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 363]
Set() did not contain "decomtest-28eefe93265e1e73-exec-1".
Set() did not contain "decomtest-92c52993265f023e-exec-1".
Set() did not contain "decomtest-8cadaa9326629cdc-exec-1".
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-55ff262281da4fda86a5af80e9b2ac8d-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-55ff262281da4fda86a5af80e9b2ac8d-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})
RandomForestRegressorSuite.training with sample weights: org/apache/spark/ml/regression/RandomForestRegressorSuite#L187
org.apache.spark.SparkException: Job 285 cancelled because SparkContext was shut down

RandomForestRegressorSuite.read/write: org/apache/spark/ml/regression/RandomForestRegressorSuite#L221
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
  org.apache.spark.SparkContext.<init>(SparkContext.scala:141)
  org.apache.spark.ml.util.MLTest.createSparkSession(MLTest.scala:51)
  org.apache.spark.ml.util.MLTest.createSparkSession$(MLTest.scala:50)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.createSparkSession(RandomForestRegressorSuite.scala:36)
  org.apache.spark.sql.test.SharedSparkSessionBase.initializeSession(SharedSparkSession.scala:116)
  org.apache.spark.sql.test.SharedSparkSessionBase.initializeSession$(SharedSparkSession.scala:114)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.initializeSession(RandomForestRegressorSuite.scala:36)
  org.apache.spark.sql.test.SharedSparkSessionBase.beforeAll(SharedSparkSession.scala:124)
  org.apache.spark.sql.test.SharedSparkSessionBase.beforeAll$(SharedSparkSession.scala:123)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.org$apache$spark$sql$test$SharedSparkSession$$super$beforeAll(RandomForestRegressorSuite.scala:36)
  org.apache.spark.sql.test.SharedSparkSession.beforeAll(SharedSparkSession.scala:46)
  org.apache.spark.sql.test.SharedSparkSession.beforeAll$(SharedSparkSession.scala:44)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.org$apache$spark$ml$util$TempDirectory$$super$beforeAll(RandomForestRegressorSuite.scala:36)
  org.apache.spark.ml.util.TempDirectory.beforeAll(TempDirectory.scala:39)
  org.apache.spark.ml.util.TempDirectory.beforeAll$(TempDirectory.scala:38)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.org$apache$spark$ml$util$MLTest$$super$beforeAll(RandomForestRegressorSuite.scala:36)
  org.apache.spark.ml.util.MLTest.beforeAll(MLTest.scala:55)
  org.apache.spark.ml.util.MLTest.beforeAll$(MLTest.scala:54)
  org.apache.spark.ml.regression.RandomForestRegressorSuite.beforeAll(RandomForestRegressorSuite.scala:46)
  org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
The currently active SparkContext was created at the same call site (identical stack as above).

RandomForestRegressorSuite.SPARK-33398: Load RandomForestRegressionModel prior to Spark 3.0: org/apache/spark/ml/regression/RandomForestRegressorSuite#L233
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext (same creation stacks as the read/write failure above).
Docker pull retries. Docker pull failed with exit code 1 and was retried after backing off (first and second backoff per job):
Run / Build modules: pyspark-errors · 5.925 s, 1.545 s
Run / Build modules: pyspark-pandas-slow · 1.141 s, 3.971 s
Run / Build modules: pyspark-pandas-connect · 4.109 s, 3.255 s
Run / Build modules: pyspark-sql, pyspark-mllib, pyspark-resource, pyspark-testing · 2.28 s, 3.24 s
Run / Build modules: pyspark-pandas · 6.255 s, 2.103 s
Run / Build modules: sparkr · 8.214 s, 1.241 s
Run / Build modules: pyspark-connect · 6.473 s, 7.502 s
Run / Build modules: pyspark-pandas-slow-connect · 8.685 s, 7.65 s
Run / Linters, licenses, dependencies and documentation generation · 9.226 s, 8.158 s
Run / Build modules: pyspark-core, pyspark-streaming, pyspark-ml · 8.762 s, 9.494 s

Deprecated Node.js runtime. The following jobs use actions on a deprecated Node.js version and will be forced to run on node20 (https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/):
Run / Check changes · actions/checkout@v3
Run / Base image build · docker/login-action@v2, actions/checkout@v3, docker/setup-qemu-action@v2
Run / Breaking change detection with Buf (branch-3.4) · actions/checkout@v3
Run / Scala 2.13 build with SBT · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3
Run / Run Docker integration tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
Run / Java 11 build with Maven · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3
Run / Java 17 build with Maven · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3
Run / Build modules: hive - slow tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
Run / Run TPC-DS queries with SF=1 · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
Run / Build modules: sql - slow tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/setup-python@v4, actions/upload-artifact@v3
Run / Build modules: catalyst, hive-thriftserver · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
Run / Build modules: core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
Run / Build modules: sql - other tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/setup-python@v4, actions/upload-artifact@v3
Run / Run Spark on Kubernetes Integration test · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3
Run / Build modules: sql - extended tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/setup-python@v4, actions/upload-artifact@v3
Run / Build modules: hive - other tests · actions/checkout@v3, actions/cache@v3, actions/setup-java@v3, actions/upload-artifact@v3
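All of the node20 notices trace back to v2/v3-era action pins. The usual remedy is bumping the pins in build_main.yml to the node20-based majors; a sketch of the relevant step updates (the targets are the current node20 majors of each action, not this repository's actual diff):

  steps:
    - uses: actions/checkout@v4          # was @v3 (node16)
    - uses: actions/cache@v4             # was @v3
    - uses: actions/setup-java@v4        # was @v3
    - uses: actions/setup-python@v5      # was @v4
    - uses: docker/login-action@v3       # was @v2
    - uses: docker/setup-qemu-action@v3  # was @v2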
Deprecation notice: v1, v2, and v3 of the artifact actions
The following artifacts were uploaded using a version of actions/upload-artifact that is scheduled for deprecation: the ten test-results-* artifacts listed under Artifacts below. Please update your workflow to use v4 of the artifact actions. Learn more: https://github.blog/changelog/2024-04-16-deprecation-notice-v3-of-the-artifact-actions/
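The artifact warning likewise disappears once uploads move to actions/upload-artifact@v4. One caveat: v4 rejects duplicate artifact names within a run, so per-module names like those below remain necessary. A sketch (the path and the matrix variable are placeholders, not taken from this workflow):

  - uses: actions/upload-artifact@v4   # was @v3
    with:
      # v4 requires a unique name per upload within the run
      name: test-results-${{ matrix.modules }}--8-hadoop3-hive2.3
      path: "**/target/test-reports/*.xml"   # placeholder path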

Artifacts

Produced during runtime
Name · Size · Status
test-results-catalyst, hive-thriftserver--8-hadoop3-hive2.3 · 3.04 MB · Expired
test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--8-hadoop3-hive2.3 · 2.8 MB · Expired
test-results-docker-integration--8-hadoop3-hive2.3 · 157 KB · Expired
test-results-hive-- other tests-8-hadoop3-hive2.3 · 1.11 MB · Expired
test-results-hive-- slow tests-8-hadoop3-hive2.3 · 948 KB · Expired
test-results-sql-- extended tests-8-hadoop3-hive2.3 · 3.35 MB · Expired
test-results-sql-- other tests-8-hadoop3-hive2.3 · 4.65 MB · Expired
test-results-sql-- slow tests-8-hadoop3-hive2.3 · 3.16 MB · Expired
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, mesos, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--8-hadoop3-hive2.3 · 2.15 MB · Expired
test-results-tpcds--8-hadoop3-hive2.3 · 22.6 KB · Expired