I am new to Apache Spark, and I just learned that Spark supports three types of cluster managers:
- Standalone - meaning Spark manages its own cluster
- YARN - using Hadoop's YARN resource manager
- Mesos - Apache's dedicated resource manager project
I think I should try Standalone first, but in the future I will need to build a large cluster (hundreds of instances).
Which cluster type should I choose?
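For context, here is what I understand so far: the cluster manager is chosen via the `--master` URL passed to `spark-submit`, so the application code itself stays the same across all three. The host names, ports, and `my_app.py` below are placeholders, not a real deployment:

```shell
# Standalone: point at the standalone master's spark:// URL
# (host and port are placeholders; 7077 is the default master port)
spark-submit --master spark://master-host:7077 my_app.py

# YARN: the cluster location is read from HADOOP_CONF_DIR,
# so no host is given in the URL
spark-submit --master yarn --deploy-mode cluster my_app.py

# Mesos: point at the Mesos master (5050 is the default port)
spark-submit --master mesos://mesos-host:5050 my_app.py

# Local mode (no cluster at all) is handy for first experiments
spark-submit --master "local[4]" my_app.py
```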