java – Running a sample Flink program in local mode
Published: 2020-12-14 05:51:34 · Category: Java · Source: web
I am trying to execute a sample program in Apache Flink in local mode.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class WordCountExample {
    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = env.fromElements(
            "Who's there?",
            "I think I hear them. Stand, ho! Who's there?");

        //DataSet<String> text1 = env.readTextFile(args[0]);

        DataSet<Tuple2<String, Integer>> wordCounts = text
            .flatMap(new LineSplitter())
            .groupBy(0)
            .sum(1);

        // In the DataSet API, print() triggers execution itself; an
        // additional env.execute() call afterwards would fail with
        // "No new data sinks have been defined".
        wordCounts.print();
    }

    public static class LineSplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
            for (String word : line.split(" ")) {
                out.collect(new Tuple2<String, Integer>(word, 1));
            }
        }
    }
}

It gives me this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/InputFormat
    at WordCountExample.main(WordCountExample.java:10)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.InputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 1 more

What exactly am I doing wrong? I am also using the correct jars.

Solution
Adding the three Flink jar files to your project as dependencies is not enough, because they have further transitive dependencies, for example on Hadoop.
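With Maven, those transitive dependencies are resolved automatically. As an illustration (a sketch, not the official quickstart pom: the exact artifact names and Scala suffixes vary between Flink releases, and ${flink.version} is a placeholder you must set), the dependency section might look like:

```xml
<dependencies>
  <!-- Core DataSet/Java API -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <!-- Needed to run programs locally, e.g. from the IDE -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${flink.version}</version>
  </dependency>
</dependencies>
```

Maven then pulls in the Hadoop (and other) jars these artifacts depend on, which is what the manually assembled classpath was missing.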
The easiest way to get a working setup for developing (and locally executing) Flink programs is to follow the quickstart guide, which uses a Maven archetype to configure a Maven project. This Maven project can then be imported into your IDE.
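The archetype from the quickstart guide can be invoked roughly as follows (substitute the Flink version you are targeting for the placeholder):

```shell
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=<flink-version>
```

This generates a project skeleton with a pom.xml whose dependencies already match that Flink version, so local execution works out of the box.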