scala – Why won't this Spark example code load in spark-shell?
Published: 2020-12-16 19:16:33 | Category: Security | Source: compiled from the web
The following example code is from the book Advanced Analytics with Spark. When I load it into spark-shell (version 1.4.1), it gives the following errors, indicating that it cannot find StatCounter:

```
import org.apache.spark.util.StatCounter
<console>:9: error: not found: type StatCounter
       val stats: StatCounter = new StatCounter()
                  ^
<console>:9: error: not found: type StatCounter
       val stats: StatCounter = new StatCounter()
                                    ^
<console>:23: error: not found: type NAStatCounter
       def apply(x: Double) = new NAStatCounter().add(x)
```

If I simply type the same thing directly into spark-shell, there is no problem:

```
scala> import org.apache.spark.util.StatCounter
import org.apache.spark.util.StatCounter

scala> val statsCounter: StatCounter = new StatCounter()
statsCounter: org.apache.spark.util.StatCounter = (count: 0, mean: 0.000000, stdev: NaN, max: -Infinity, min: Infinity)
```

The problem seems to be related to the `:load` command in spark-shell.

Here is the code:

```scala
import org.apache.spark.util.StatCounter

class NAStatCounter extends Serializable {
  val stats: StatCounter = new StatCounter()
  var missing: Long = 0

  def add(x: Double): NAStatCounter = {
    if (java.lang.Double.isNaN(x)) {
      missing += 1
    } else {
      stats.merge(x)
    }
    this
  }

  def merge(other: NAStatCounter): NAStatCounter = {
    stats.merge(other.stats)
    missing += other.missing
    this
  }

  override def toString = {
    "stats: " + stats.toString + " NaN: " + missing
  }
}

object NAStatCounter extends Serializable {
  def apply(x: Double) = new NAStatCounter().add(x)
}
```

Solution
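An alternative workaround not mentioned in the answer below (an assumption worth checking against your REPL version) is the REPL's `:paste` mode. Unlike `:load`, which interprets the file incrementally, `:paste` compiles the whole pasted block as a single unit, so the `import` and the class definitions that use it are resolved together:

```scala
scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.util.StatCounter

class NAStatCounter extends Serializable {
  val stats: StatCounter = new StatCounter()
  var missing: Long = 0
  // ... rest of the class and companion object from the question ...
}

// Press Ctrl-D here to compile everything at once
```

Pasting the entire file contents this way keeps the original code unchanged.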
I had exactly the same problem as you.

I solved it by changing

```scala
val stats: StatCounter = new StatCounter()
```

into

```scala
val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()
```

The reason is probably that the system does not know the path to StatCounter.
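Applying that change throughout gives the following sketch of the full listing (only the `StatCounter` references change; everything else is exactly as in the question). Because every reference is fully qualified, no top-level `import` is needed, so each definition resolves on its own even when `:load` evaluates the file piece by piece:

```scala
class NAStatCounter extends Serializable {
  // Fully qualified so the type resolves without a separate import line
  val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()
  var missing: Long = 0

  def add(x: Double): NAStatCounter = {
    if (java.lang.Double.isNaN(x)) {
      missing += 1          // count NaNs separately from the stats
    } else {
      stats.merge(x)        // accumulate count/mean/stdev/min/max
    }
    this
  }

  def merge(other: NAStatCounter): NAStatCounter = {
    stats.merge(other.stats)
    missing += other.missing
    this
  }

  override def toString = "stats: " + stats.toString + " NaN: " + missing
}

object NAStatCounter extends Serializable {
  def apply(x: Double) = new NAStatCounter().add(x)
}
```

With this version, `:load` should succeed in spark-shell 1.4.1, since no line depends on an import evaluated in an earlier REPL step.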