Welcome to OStack Knowledge Sharing Community for programmers and developers - Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
204 views
in Technique by (71.8m points)

pyspark - configuration for 1 worker node cluster in spark

I am running Spark standalone on a 1-node cluster (1 driver, 1 worker node). The driver node has 8 cores and 16 GB RAM; the worker node has 8 cores and 24 GB RAM.

Is it possible to make better use of the driver node as well, for example by running an executor on it? Can somebody please help me configure the parameters for this setup?

Thanks and Regards,
Sudip

question from: https://stackoverflow.com/questions/66063508/configuration-for-1-worker-node-cluster-in-spark
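While no answer has been posted yet, a standalone cluster like the one described is usually tuned through `conf/spark-env.sh` on the worker and resource flags on `spark-submit`. The sketch below is illustrative only: the memory and core figures are assumptions for an 8-core/24 GB worker (leaving headroom for the OS and daemons), and `<master-host>` and `your_app.py` are placeholders:

```shell
# conf/spark-env.sh on the worker node (8 cores, 24 GB RAM):
# advertise slightly less than the physical total so the OS
# and the Spark daemons keep some headroom.
SPARK_WORKER_CORES=8
SPARK_WORKER_MEMORY=20g

# Submit against the standalone master; executor sizes here are
# illustrative (two executors of 4 cores / 9 GB each would fit).
spark-submit \
  --master spark://<master-host>:7077 \
  --driver-memory 4g \
  --executor-cores 4 \
  --executor-memory 9g \
  --total-executor-cores 8 \
  your_app.py
```

To also leverage the driver machine's 8 cores and 16 GB, one option in standalone mode is to start a second worker process on that machine with `sbin/start-worker.sh spark://<master-host>:7077` (with its own, smaller `SPARK_WORKER_MEMORY`), so the master can schedule executors on both nodes.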


1 Answer

0 votes
by (71.8m points)
Waiting for answers

