scala – saveToCassandra could not find implicit value for parameter rwf
I am trying to save a dataset to a Cassandra database using Spark with Scala, but I get an exception when running the code:
I am following this tutorial: http://rustyrazorblade.com/2015/01/introduction-to-spark-cassandra/

error: could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[FoodToUserIndex]
       food_index.saveToCassandra("tutorial","food_to_user_index")
                  ^

The .scala file:

def main(args: Array[String]): Unit = {
  val conf = new SparkConf(true)
    .set("spark.cassandra.connection.host", "localhost")
    .set("spark.executor.memory", "1g")
    .set("spark.cassandra.connection.native.port", "9042")
  val sc = new SparkContext(conf)

  case class FoodToUserIndex(food: String, user: String)

  val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user").select("favorite_food", "name")
  val food_index = user_table.map(r => new FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
  food_index.saveToCassandra("tutorial", "food_to_user_index")
}

build.sbt:

name := "intro_to_spark"
version := "1.0"
scalaVersion := "2.11.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"

If I change the Scala version to 2.10 and the connector version to 1.1.0, it works fine, but I need to use Scala 2.11:

scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

Solution
Moving the case class FoodToUserIndex(food: String, user: String) outside of the main function should solve the problem. saveToCassandra needs an implicit RowWriterFactory for the row type, and that implicit apparently cannot be resolved when the class is defined locally inside a method, which is why lifting it to the top level helps.
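A minimal sketch of the fixed file, built from the question's own code; only the object name DemoApp is made up here:

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

// Defined at the top level, not inside main, so the connector can
// derive the implicit RowWriterFactory[FoodToUserIndex] that
// saveToCassandra requires.
case class FoodToUserIndex(food: String, user: String)

object DemoApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "localhost")
      .set("spark.executor.memory", "1g")
      .set("spark.cassandra.connection.native.port", "9042")
    val sc = new SparkContext(conf)

    val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user")
      .select("favorite_food", "name")
    val food_index = user_table.map(r =>
      FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
    food_index.saveToCassandra("tutorial", "food_to_user_index")
  }
}

Placing the case class in any stable, top-level scope (a package or an object) rather than inside main is the key change; the rest of the code is unmodified.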