
java - local class incompatible exception when running Spark standalone from an IDE

I am starting to test Spark. I installed Spark on my local machine and am running a local standalone cluster with a single worker. When I try to execute my job from my IDE, I set the SparkConf as follows:

import java.nio.file.Paths;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;

final SparkConf conf = new SparkConf().setAppName("testSparkfromJava").setMaster("spark://XXXXXXXXXX:7077");
final JavaSparkContext sc = new JavaSparkContext(conf);
final JavaRDD<String> distFile = sc.textFile(Paths.get("").toAbsolutePath().toString() + "/dataSpark/datastores.json");

I got this exception:

java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
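For context, this is the plain Java serialization version check failing: one side serialized org.apache.spark.rpc.netty.RequestMessage with one Spark build, and the other side's build of the same class has a different serialVersionUID, so deserialization is rejected. A minimal sketch of the mechanism (the class below is illustrative, not Spark's):

import java.io.Serializable;

// Every Serializable class carries a serialVersionUID, either declared
// explicitly or derived at runtime from the class structure. On
// deserialization the JVM compares the UID stored in the byte stream with
// the UID of the local class; any difference raises
// java.io.InvalidClassException: ... local class incompatible.
class RequestLikeMessage implements Serializable {
    private static final long serialVersionUID = 1L; // must match on both JVMs
    String payload;
}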

1 Answer


This "local class incompatible" error can come from several version mismatches between the IDE's classpath and the cluster (a diagnostic sketch follows the list):

  • Hadoop version;
  • Spark version;
  • Scala version;
  • ...
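Before chasing a specific library, it helps to print the versions the driver JVM actually uses and compare them with what the standalone master reports on its web UI (usually port 8080). A small diagnostic sketch; the class name VersionCheck is mine, and it uses a local master so it runs without the cluster:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class VersionCheck {
    public static void main(String[] args) {
        // Versions on the driver side; compare against the master's web UI.
        System.out.println("Java:  " + System.getProperty("java.version"));
        System.out.println("Scala: " + scala.util.Properties.versionNumberString());
        SparkConf conf = new SparkConf().setAppName("versionCheck").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            System.out.println("Spark: " + sc.version());
        }
    }
}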

For me, it was the Scala version: I was using 2.11.x in my IDE, but the official doc says:

Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.1 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).

and the x there cannot be smaller than 3 if you are using the latest Java (1.8); otherwise you get this error. In practice that means depending on the _2.10 Spark artifacts (for example spark-core_2.10) instead of the _2.11 ones. Hope it helps!
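If you want the mismatch to fail fast instead of surfacing as a cryptic RPC serialization error, you can guard the driver at startup. A hedged sketch, assuming the cluster runs Spark 1.6.1 built for Scala 2.10 (the class name ScalaVersionGuard is mine):

// Call ScalaVersionGuard.check() before creating the SparkContext.
public final class ScalaVersionGuard {
    public static void check() {
        String v = scala.util.Properties.versionNumberString(); // e.g. "2.10.6"
        if (!v.startsWith("2.10.")) {
            throw new IllegalStateException("Driver runs Scala " + v
                    + ", but this Spark 1.6.1 cluster expects Scala 2.10.x"
                    + " (2.10.3 or newer on Java 8).");
        }
    }
}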


