
scala – Unresolved dependencies path for an SBT project in IntelliJ

I am developing a Spark application with IntelliJ, and I am following a guide on how to make IntelliJ work well with SBT projects.

Since my whole team uses IntelliJ, we can just modify build.sbt, but we get this unresolved dependency error:

Error while importing SBT project:

[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn]        +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM

Here is my build.sbt:

name := "sparrow-to-orc"

version := "0.1"

scalaVersion := "2.11.8"

lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0","org.apache.spark" %% "spark-sql" % "2.1.0","org.apache.spark" %% "spark-hive" % "2.1.0","org.apache.spark" %% "spark-streaming" % "2.1.0"
)

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

libraryDependencies ++= sparkDependencies.map(_ % "provided")

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance",xs @ _*) => MergeStrategy.last
  case PathList("javax","inject","servlet","activation",xs @ _*) => MergeStrategy.last
  case PathList("org","apache",xs @ _*) => MergeStrategy.last
  case PathList("com","google","esotericsoftware","codahale","yammer",xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
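
For reference, `<<=` is sbt 0.13 syntax and was removed in sbt 1.x; there, the same trick of putting "provided" dependencies back on the run classpath would look roughly like this (a sketch using the sbt 1.x slash syntax):

// sbt 1.x equivalent of the <<= line above:
// run with the full compile classpath, including "provided" dependencies.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated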

If I leave out these lines, then the program builds and runs fine:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

But then I am not able to run the application from IntelliJ, because the Spark dependencies would not be on the classpath.
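
To make the trade-off concrete: the root project marks Spark as "provided" so the assembly jar stays slim for cluster deployment, while mainRunner re-adds the same artifacts at "compile" scope so an in-IDE run can bootstrap Spark itself. A minimal sketch of such a locally runnable entry point (the object name and body are hypothetical, not from the original project):

// Hypothetical entry point: with Spark on mainRunner's compile classpath,
// IntelliJ can launch this directly against a local master.
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sparrow-to-orc")
      .master("local[*]") // run Spark in-process, no cluster required
      .getOrCreate()
    // ... application logic goes here ...
    spark.stop()
  }
}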

Solution

I have come across the same problem. The solution is to set the Scala version of the mainRunner project to the same one declared at the top of the build.sbt file. Without an explicit scalaVersion, mainRunner falls back to sbt's default Scala 2.10, which is why the log shows it resolving sparrow-to-orc_2.10 while the root project publishes a _2.11 artifact:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile"),scalaVersion := "2.11.8"
)
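
A slightly more defensive variant (a sketch, not part of the original answer) defines the Scala version once and references it from both projects, so they cannot drift apart again:

// Hypothetical refactoring: one shared constant for the Scala version.
lazy val sharedScalaVersion = "2.11.8"

scalaVersion := sharedScalaVersion // root project

lazy val mainRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    scalaVersion := sharedScalaVersion,
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )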

Good luck!
