Make sure your Spark dependency matches the Scala version you are compiling with. This error is common when using the Scala 2.12 series with a Spark artifact built for Scala 2.11.
You can try using the Scala 2.11 series with Spark, i.e.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
As you can see, the dependency spark-core_2.11 is associated with Scala version 2.11.
That's why it's safer (more compatible) to use %% and avoid hardcoding the Scala version in Spark dependencies. Let the build tool resolve the required Scala version for you automatically, as follows:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
With %%, sbt automatically appends the correct Scala binary suffix (e.g. _2.11) to the artifact name based on your project's scalaVersion.
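For completeness, a minimal build.sbt sketch putting the pieces together (the project name and Scala patch version here are only illustrative):

name := "spark-example"
scalaVersion := "2.11.12"
// %% appends the Scala binary version, so this resolves to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"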