scala – Unresolved dependencies path for an SBT project in IntelliJ
Published: 2020-12-16 19:24:01 | Category: Security | Source: compiled from the web
I am using IntelliJ to develop a Spark application, and I have been following a guide on how to get IntelliJ to work well with SBT projects.
Since my whole team uses IntelliJ, we can modify build.sbt, but we get this unresolved-dependency error when importing the SBT project:

Error: Error while importing SBT project:

    [info] Resolving org.apache.thrift#libfb303;0.9.2 ...
    [info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
    [info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
    [info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
    [info] Resolving org.scala-lang#jline;2.10.6 ...
    [info] Resolving org.fusesource.jansi#jansi;1.4 ...
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn] ::          UNRESOLVED DEPENDENCIES         ::
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn] :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
    [warn] ::::::::::::::::::::::::::::::::::::::::::::::
    [warn]
    [warn] Note: Unresolved dependencies path:
    [warn]     sparrow-to-orc:sparrow-to-orc_2.10:0.1
    [warn]       +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
    [trace] Stack trace suppressed: run 'last mainRunner/*:ssExtractDependencies' for the full output.
    [trace] Stack trace suppressed: run 'last mainRunner/*:update' for the full output.
    [error] (mainRunner/*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
    [error] (mainRunner/*:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
    [error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM

This is my build.sbt:

    name := "sparrow-to-orc"

    version := "0.1"

    scalaVersion := "2.11.8"

    lazy val sparkDependencies = Seq(
      "org.apache.spark" %% "spark-core" % "2.1.0",
      "org.apache.spark" %% "spark-sql" % "2.1.0",
      "org.apache.spark" %% "spark-hive" % "2.1.0",
      "org.apache.spark" %% "spark-streaming" % "2.1.0"
    )

    libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
    libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

    libraryDependencies ++= sparkDependencies.map(_ % "provided")

    lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
      libraryDependencies ++= sparkDependencies.map(_ % "compile")
    )

    assemblyMergeStrategy in assembly := {
      case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.last
      case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
      case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
      case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
      case PathList("org", "apache", xs @ _*) => MergeStrategy.last
      case PathList("com", "google", xs @ _*) => MergeStrategy.last
      case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
      case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
      case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
      case "about.html" => MergeStrategy.rename
      case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
      case "META-INF/mailcap" => MergeStrategy.last
      case "META-INF/mimetypes.default" => MergeStrategy.last
      case "plugin.properties" => MergeStrategy.last
      case "log4j.properties" => MergeStrategy.last
      case "overview.html" => MergeStrategy.last
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }

    run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

If I remove this block, the project imports fine:

    lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
      libraryDependencies ++= sparkDependencies.map(_ % "compile")
    )

But then I would not be able to run the application inside IntelliJ, because the Spark dependencies would not be on the classpath.

Solution
I ran into the same problem. The solution is to set the Scala version of the mainRunner project to the same one declared at the top of the build.sbt file:
    lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
      libraryDependencies ++= sparkDependencies.map(_ % "compile"),
      scalaVersion := "2.11.8"
    )

Good luck!
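The reason this works: without an explicit `scalaVersion`, the mainRunner subproject falls back to sbt's default Scala version (a 2.10.x release on the sbt 0.13 line used here), so its dependency on the root project is resolved as the cross-versioned artifact `sparrow-to-orc_2.10` — while the root project, built with Scala 2.11.8, only produces `sparrow-to-orc_2.11`. Hence "not found". Rather than repeating the version in every project, it can be declared once at the build level; the following is a sketch assuming the sbt 0.13-era syntax from the question:

```scala
// build.sbt -- sketch: one Scala version shared by all projects in the build.
// `in ThisBuild` scopes the setting to the whole build, so both the root
// project and mainRunner inherit 2.11.8 and resolve each other as *_2.11.
scalaVersion in ThisBuild := "2.11.8"

lazy val mainRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    // No per-project scalaVersion needed; only the dependency scope differs.
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )
```

On sbt 1.x the equivalent spelling is `ThisBuild / scalaVersion := "2.11.8"`. Either way, a single declaration removes the possibility of the root and mainRunner drifting onto different binary versions.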