
scala – Running Spark in IntelliJ IDEA: HttpServletResponse ClassNotFoundException

I'm trying to run Spark with Scala from inside IntelliJ IDEA:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/kamil/Apps/spark-1.2.1-bin/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

Running it with spark-submit works fine. Running it from the IDE results in the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
    at org.apache.spark.HttpServer.org$apache$spark$HttpServer$$doStart(HttpServer.scala:74)
    at org.apache.spark.HttpServer$$anonfun$1.apply(HttpServer.scala:61)
    at org.apache.spark.HttpServer$$anonfun$1.apply(HttpServer.scala:61)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1765)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1756)
    at org.apache.spark.HttpServer.start(HttpServer.scala:61)
    at org.apache.spark.HttpFileServer.initialize(HttpFileServer.scala:46)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:320)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:240)
    at SimpleApp$.main(SimpleApp.scala:8)
    at SimpleApp.main(SimpleApp.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: javax.servlet.http.HttpServletResponse
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 18 more

SimpleApp.scala:8 is the line that instantiates the Spark context. As someone suggested, I have already added:

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"

but it didn't help. Do you have any ideas? Thanks in advance.
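The build definition itself isn't shown in the question. For context, a typical build.sbt for a Spark 1.2.1 project with this suggestion applied might look like the sketch below; the project name, Scala version, and the "provided" qualifier on spark-core are assumptions for illustration. Marking spark-core as "provided" is common because spark-submit supplies Spark at run time, but it is also what makes IntelliJ import Spark's jars (including the servlet classes they pull in transitively) with Provided scope.

// build.sbt -- minimal sketch, versions matching the spark-1.2.1-bin install above
name := "simple-project"

version := "1.0"

scalaVersion := "2.10.4"

// common in Spark projects: keep Spark out of the packaged jar for spark-submit,
// at the price of the IDE treating it as a Provided dependency
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

// the dependency suggested above
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"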

Solution

I just solved this myself. You need to change the module settings.

Context menu -> Open Module Settings -> Dependencies

Change the "Scope" of the missing jar from "Provided" to "Compile".
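If the project is managed by sbt, the build-level counterpart of this IDE change is to declare the dependency in the default compile scope (i.e. without "provided") and re-import the project, so that javax.servlet.http.HttpServletResponse ends up on the run classpath. A minimal sketch, assuming the build.sbt layout above:

// build.sbt -- dependency section after the change; compile scope is the default
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"

Note that a manual change in the module settings can be overwritten the next time IntelliJ re-imports the sbt or Maven project, so fixing the scope in the build file tends to be more durable; newer IntelliJ versions alternatively offer an "Include dependencies with 'Provided' scope" option in the run configuration.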
