scala – Spark 2.1.0 incompatible Jackson version 2.7.6
I am trying to run a simple Spark example in IntelliJ, but I get an error like this:
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:819)
    at spark.test$.main(test.scala:19)
    at spark.test.main(test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.7.6
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:730)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)

I tried to update my Jackson dependencies, but it does not seem to work. I did this:

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

But the same error message still appears. Can someone help me fix it?

Here is the Spark example code:

package spark

import org.apache.spark.{SparkConf, SparkContext}

object test {
  def main(args: Array[String]): Unit = {
    if (args.length < 1) {
      System.err.println("Usage: <file>")
      System.exit(1)
    }

    val conf = new SparkConf()
    val sc = new SparkContext("local", "wordcount", conf)
    val line = sc.textFile(args(0))

    line.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).collect().foreach(println)

    sc.stop()
  }
}

Here is my build.sbt:

name := "testSpark2"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.10" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"

Solution
Spark 2.1.0 already includes com.fasterxml.jackson.core as a transitive dependency, so we do not need to include it in libraryDependencies at all.
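If you want to confirm which Jackson versions actually end up on the classpath, sbt's built-in evicted task reports which of the conflicting versions was picked, and you can also print the versions at runtime. Here is a minimal sketch of the runtime check (the JacksonVersions object is my own illustration; PackageVersion is the version marker class that ships inside each Jackson jar):

// Prints the Jackson versions that are actually on the classpath.
// Run this from the same project so it sees the resolved dependencies.
object JacksonVersions {
  def main(args: Array[String]): Unit = {
    println("jackson-core:     " + com.fasterxml.jackson.core.json.PackageVersion.VERSION)
    println("jackson-databind: " + com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION)
  }
}

If the two lines print different versions (for example 2.8.7 and 2.6.5), the overrides described below have not taken effect yet.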
But if you want to use a different version of the com.fasterxml.jackson.core dependencies, then you have to override them, like this:

name := "testSpark2"
version := "1.0"
scalaVersion := "2.11.8"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"

So change your build.sbt as shown above and it will work as expected (a quick way to verify the fix without launching Spark is sketched below). I hope this helps!
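The "Incompatible Jackson version" check is performed by jackson-module-scala itself when the module is registered on an ObjectMapper, which is exactly the registerModule call visible in the stack trace. So you can verify that the override took effect without starting Spark at all. A minimal sketch (the JacksonCompatCheck object is my own illustration):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object JacksonCompatCheck {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // This is the same call that fails inside Spark's RDDOperationScope:
    // it throws JsonMappingException("Incompatible Jackson version: ...")
    // when the jackson-module-scala and jackson-databind versions diverge.
    mapper.registerModule(DefaultScalaModule)
    println("Jackson Scala module registered successfully")
  }
}

If this runs cleanly, the word-count example should get past the textFile call as well.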