How does Spark decide how many executors need to be created?
I have created a Spark standalone cluster using the following commands.
For master
cd %SPARK_HOME%
bin\spark-class.cmd org.apache.spark.deploy.master.Master
For worker
cd %SPARK_HOME%
bin\spark-class.cmd org.apache.spark.deploy.worker.Worker -c 4 -m 2G spark://192.168.0.111:7077
Please find the UI for the master and the assigned worker below.
[Image: Spark master UI showing the registered worker]
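As a cross-check of the screenshot, the standalone master also exposes its state as JSON at the /json path of its web UI. Below is a minimal sketch that reads it; it assumes the master web UI is on the default port 8080 at 192.168.0.111, which matches my setup but may differ in yours.

import json
from urllib.request import urlopen

# Assumption: the standalone master web UI runs on the default port 8080.
MASTER_STATE_URL = "http://192.168.0.111:8080/json"

with urlopen(MASTER_STATE_URL) as resp:
    state = json.load(resp)

# Each registered worker reports the cores and memory (MB) it advertised,
# i.e. the -c 4 -m 2G values passed on the command line above.
for worker in state.get("workers", []):
    print(worker["id"], "cores:", worker["cores"], "memory(MB):", worker["memory"])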
And I am running my application with the following Spark config:

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("pyspark") \
    .master("spark://192.168.0.111:7077") \
    .config("spark.eventLog.enabled", "true") \
    .config("spark.history.ui.port", "19000") \
    .config("spark.executor.instances", "1") \
    .config("spark.executor.cores", "2") \
    .config("spark.memory.fraction", "0.8") \
    .config("spark.dynamicAllocation.enabled", "false") \
    .config("spark.history.fs.logDirectory", r"file:///C:/Users/asus/Downloads/Code/Spark-logs") \
    .getOrCreate()
But when I submit my application, in the Spark UI I can see that 2 executors are created with 2 cores each, as shown below. In my application I have one write-CSV action.
[Image: Spark UI stage view]
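To confirm the executor count outside of the web UI, one option is Spark's monitoring REST API on the driver. A minimal sketch, assuming the application UI is on the default port 4040 on the machine where the driver runs:

import json
from urllib.request import urlopen

# Assumption: the application UI is served by the driver on the default port 4040.
BASE = "http://localhost:4040/api/v1"

with urlopen(f"{BASE}/applications") as resp:
    app_id = json.load(resp)[0]["id"]

with urlopen(f"{BASE}/applications/{app_id}/executors") as resp:
    executors = json.load(resp)

# The listing includes a "driver" entry; everything else is an executor.
for e in executors:
    print(e["id"], "cores:", e["totalCores"])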
I have disabled spark.dynamicAllocation.enabled in the Spark config, and even after that my Spark application is created with 2 executors.
Can someone explain why it is creating 2 executors instead of 1, and which Spark properties decide how many executors are needed?
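As far as I understand, in standalone mode with dynamic allocation off the master hands the application every core the workers offer (up to spark.cores.max) and slices them into executors of spark.executor.cores each, while spark.executor.instances is only honoured by YARN/Kubernetes; that would explain 4 worker cores / 2 cores per executor = 2 executors. Under that assumption, the sketch below caps the total cores so only a single 2-core executor can be launched. This is only a sketch of the idea, not a confirmed fix:

from pyspark.sql import SparkSession

# Sketch: with 2 cores per executor and a 2-core total cap (spark.cores.max),
# the standalone master can only launch one executor for this application.
spark = SparkSession.builder \
    .appName("pyspark") \
    .master("spark://192.168.0.111:7077") \
    .config("spark.executor.cores", "2") \
    .config("spark.cores.max", "2") \
    .config("spark.dynamicAllocation.enabled", "false") \
    .getOrCreate()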