spark-operator on kubernetes, starting the pi example but the sparkapplication is nothing?
I'm trying to run the out-of-the-box pi example that ships with the Spark operator.
- Image source: apache/spark-py
- Spark operator version: 2.0.2
- Spark version: 3.5.2
- Started the Spark app with kubectl apply -f pi.yaml, where pi.yaml is:
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pi
  namespace: spark
spec:
  type: Python
  pythonVersion: "3"
  mode: cluster
  image: apache/spark-py
  imagePullPolicy: IfNotPresent
  mainApplicationFile: local://opt/spark/examples/src/main/python/pi.py
  sparkVersion: 3.5.2
  sparkConf:
    spark.kubernetes.file.upload.path: /opt/spark/jars
  # Driver Pod - responsible for kicking off the job
  driver:
    labels:
      version: 3.5.2
    cores: 1
    memory: 5120m
    serviceAccount: spark-service-account
  # Executor Pod
  executor:
    labels:
      version: 3.5.2
    instances: 1
    cores: 1
    memory: 5120m
Here is the quirky part: when I look at the SparkApplication, there's nothing. I can describe it, but that just echoes back the spec from the YAML file, and there are no events at all. I don't even know where to begin debugging this.
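For reference, this is how I'm inspecting it (the resource name and namespace come from the manifest above):

kubectl describe sparkapplication pi -n spark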
user@host ~/machine_learning/helm % k get sparkapp
NAME   STATUS   ATTEMPTS   START   FINISH   AGE
pi                                          19m
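My guess is that the next place to look is the operator's own logs and the events in the app's namespace, something like the following. Note the spark-operator namespace and the controller pod name are guesses based on a default Helm install; substitute whatever names the actual release created:

# Find the operator pods (namespace is an assumption; may differ per install)
kubectl get pods -n spark-operator
# Tail the controller logs for reconcile errors (substitute the real pod name)
kubectl logs -n spark-operator <spark-operator-controller-pod> --tail=200
# Check for events in the namespace the app was submitted to
kubectl get events -n spark --sort-by=.lastTimestamp

Is that the right direction, or is there a better way to see why the operator never updates the SparkApplication status?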