I'm trying to run the out-of-the-box Pi example that ships with the Spark operator.

  • Image source: apache/spark-py
  • Spark operator version: 2.0.2
  • Spark version: 3.5.2
  • Started the Spark app with kubectl apply -f pi.yaml:

apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pi
  namespace: spark
spec:
  type: Python
  pythonVersion: "3"
  mode: cluster
  image: apache/spark-py
  imagePullPolicy: IfNotPresent
  mainApplicationFile: local://opt/spark/examples/src/main/python/pi.py
  sparkVersion: 3.5.2
  sparkConf:
    spark.kubernetes.file.upload.path: /opt/spark/jars
  # Driver Pod - responsible for kicking off the job
  driver:
    labels:
      version: 3.5.2
    cores: 1
    memory: 5120m
    serviceAccount: spark-service-account
  # Executor Pod
  executor:
    labels:
      version: 3.5.2
    instances: 1
    cores: 1
    memory: 5120m

Here is the quirky part: when I look at the sparkapps, there's no status at all. I can describe the SparkApplication, but it just shows the standard fields from the YAML file, and there are no events. I don't even know where to begin debugging this.

user@host ~/machine_learning/helm % k get sparkapp
NAME   STATUS   ATTEMPTS   START   FINISH   AGE
pi                                          19m
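
These are the only debugging steps I can think of so far. The operator namespace and deployment name below are guesses based on a default Helm install, so adjust them to whatever your release is actually called:

# events in the namespace the SparkApplication was created in
kubectl get events -n spark --sort-by=.lastTimestamp

# find the operator's deployment (name/namespace depend on the Helm release)
kubectl get deploy -n spark-operator

# tail the operator logs to see whether it ever picks up the "pi" app
kubectl logs -n spark-operator deploy/<operator-deployment> --tail=100

One thing I'm not sure about is whether the operator is even watching the spark namespace, since the Helm chart only reconciles SparkApplications in the namespaces it is configured to watch.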
