This repository has been archived by the owner on Jan 9, 2020. It is now read-only.
I am trying to test a local Spark script. Whenever I submit it from my local Mac to a minikube cluster, I get the error below.
```shell
$SPARK_HOME/bin/spark-submit \
  --deploy-mode cluster \
  --master k8s://https://192.168.99.100:8443 \
  --kubernetes-namespace spark \
  --conf spark.executor.instances=1 \
  --conf spark.executor.memory=512m \
  --conf spark.driver.memory=512m \
  --conf spark.app.name=spark-pi \
  --conf spark.executor.cores=0.2 \
  --conf spark.driver.cores=0.2 \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver-py:v2.2.0-kubernetes-0.5.0 \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor-py:v2.2.0-kubernetes-0.5.0 \
  --jars local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar \
  --py-files schools.py \
  schools.py
```
I see the error below in the Kubernetes dashboard for the driver pod:

```
Image: kubespark/spark-driver-py:v2.2.0-kubernetes-0.5.0
Environment variables
Commands: -
Args: -
```
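As a debugging step beyond what the dashboard shows, the driver pod's events and logs can be inspected with kubectl. This is a sketch, not from the original report: `spark` is the namespace used in the submit command above, and the pod name `spark-pi-driver` is illustrative (use the actual name from `kubectl get pods`):

```shell
# List pods in the namespace the job was submitted to
kubectl get pods -n spark

# Show scheduling events and container status for the driver
# (pod name is hypothetical; substitute the real one)
kubectl describe pod spark-pi-driver -n spark

# Stream the driver's logs to see the underlying failure
kubectl logs -f spark-pi-driver -n spark
```

The `describe` output is usually the most informative for image-pull or scheduling failures, while `logs` surfaces errors from the Spark driver process itself.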