
Use condition instead of errorClass #1260

Triggered via push: September 10, 2024 15:49
Status: Failure
Total duration: 1h 39m 12s
Artifacts: 18

build_main.yml

on: push
Run / Check changes (55s)
Run / Base image build (53s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 0s)
Run / Run TPC-DS queries with SF=1 (1h 37m)
Run / Run Docker integration tests (1h 26m)
Run / Run Spark on Kubernetes Integration test (1h 0m)
Run / Run Spark UI tests (18s)
Matrix: Run / build
Run / Build modules: sparkr (26m 49s)
Run / Linters, licenses, and dependencies (16m 40s)
Run / Documentation generation (46m 24s)
Matrix: Run / pyspark

Annotations

12 errors and 8 warnings
Run / Linters, licenses, and dependencies
  Process completed with exit code 1.
Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-fcb57e91dcb97dc7-exec-1". (see the first sketch below the annotations)
  HashSet() did not contain "decomtest-138cef91dcba6443-exec-1".
  sleep interrupted
  sleep interrupted
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$625/0x00007fb1bc4f9f68@4232d495 rejected from java.util.concurrent.ThreadPoolExecutor@10fda11b[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 371] (see the second sketch below the annotations)
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$625/0x00007fb1bc4f9f68@36790b97 rejected from java.util.concurrent.ThreadPoolExecutor@10fda11b[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 370]
  HashSet() did not contain "decomtest-170f4891dccd3d79-exec-1".
  HashSet() did not contain "decomtest-9009c691dcce25d0-exec-1".
  HashSet() did not contain "decomtest-39b64891dcd1ce60-exec-1".
  Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-69d97639d57e48d2aa3ad7e910cdde12-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-69d97639d57e48d2aa3ad7e910cdde12-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={}).
Run / Run Docker integration tests
  Process completed with exit code 18.
Run / Build modules: pyspark-pandas
Run / Build modules: pyspark-pandas-connect-part1
Run / Build modules: pyspark-pandas-connect-part2
Run / Build modules: pyspark-pandas-connect-part3
Run / Build modules: pyspark-pandas-connect-part0
Run / Build modules: pyspark-pandas-slow
Run / Build modules: pyspark-sql, pyspark-resource, pyspark-testing
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming
  Each of the eight jobs above reported: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
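
About the repeated 'HashSet() did not contain "decomtest-...-exec-1"' errors above: that message shape matches a plain ScalaTest assert on a set-membership check, which prints the receiving collection and the missing element when it fails. A minimal sketch under that assumption; the suite, test name, and pod-tracking set below are hypothetical and are not Spark's actual Kubernetes integration test code:

  import org.scalatest.funsuite.AnyFunSuite
  import scala.collection.mutable

  // Hypothetical sketch, not the real Spark Kubernetes integration suite.
  class DecommissionSketchSuite extends AnyFunSuite {
    test("executor pod name was recorded before decommissioning") {
      // Assumed stand-in for the pod names the real test collects from the cluster.
      val observedExecutorPods = mutable.HashSet.empty[String]

      // While the set is still empty, ScalaTest's assert macro fails with a message
      // of the same shape as the annotations above:
      //   HashSet() did not contain "decomtest-fcb57e91dcb97dc7-exec-1"
      assert(observedExecutorPods.contains("decomtest-fcb57e91dcb97dc7-exec-1"))
    }
  }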
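
The two 'Task ... rejected from java.util.concurrent.ThreadPoolExecutor@...[Shutting down, ...]' errors carry the message format of java.util.concurrent's default AbortPolicy, which is triggered when new work is handed to an executor that has already been shut down (here, the executor backing the fabric8 Kubernetes client's SerialExecutor). A self-contained sketch of how that message arises, using a bare thread pool rather than the fabric8 client:

  import java.util.concurrent.{Executors, RejectedExecutionException, TimeUnit}

  object RejectedTaskSketch {
    def main(args: Array[String]): Unit = {
      val pool = Executors.newFixedThreadPool(1)

      // Keep the single worker busy so the pool stays in the "Shutting down"
      // state after shutdown() is called.
      pool.execute(() => Thread.sleep(2000))
      pool.shutdown()

      try {
        // New work is now rejected by the default AbortPolicy with a message like:
        //   Task <runnable> rejected from java.util.concurrent.ThreadPoolExecutor@...
        //   [Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
        pool.execute(() => ())
      } catch {
        case e: RejectedExecutionException => println(e.getMessage)
      }

      pool.awaitTermination(5, TimeUnit.SECONDS)
    }
  }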

Artifacts

Produced during runtime (all expired):
MaxGekk~spark~XK9ZYA.dockerbuild: 21.9 KB
site: 60.8 MB
test-results-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3: 630 KB
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant--17-hadoop3-hive2.3: 792 KB
test-results-docker-integration--17-hadoop3-hive2.3: 42.2 KB
test-results-hive-- other tests-17-hadoop3-hive2.3: 230 KB
test-results-hive-- slow tests-17-hadoop3-hive2.3: 218 KB
test-results-mllib-local, mllib, graphx--17-hadoop3-hive2.3: 469 KB
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11: 169 KB
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.11: 190 KB
test-results-sparkr--17-hadoop3-hive2.3: 17.2 KB
test-results-sql-- extended tests-17-hadoop3-hive2.3: 1.07 MB
test-results-sql-- other tests-17-hadoop3-hive2.3: 1.27 MB
test-results-sql-- slow tests-17-hadoop3-hive2.3: 1.07 MB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--17-hadoop3-hive2.3: 363 KB
test-results-tpcds--17-hadoop3-hive2.3: 4.88 KB
test-results-yarn--17-hadoop3-hive2.3: 42.3 KB
unit-tests-log-docker-integration--17-hadoop3-hive2.3: 42 MB