
java – Missing application resource

Published: 2020-12-15 02:18:42 | Category: Java | Source: web
I wrote a Spark program in Java; the code is as follows:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
  public static void main(String[] args) {
      SparkConf conf = new SparkConf().setAppName("wordCount").setMaster("local");
      JavaSparkContext sc = new JavaSparkContext(conf);
      JavaRDD<String> input = sc.textFile("/bigdata/softwares/spark-2.1.0-bin-hadoop2.7/testdata/a.txt");

      // Count the lines containing "yes" and the lines containing "ywq".
      Long bCount = input.filter(new Function<String, Boolean>() {
         public Boolean call(String s) { return s.contains("yes"); }
      }).count();
      Long cCount = input.filter(new Function<String, Boolean>() {
         public Boolean call(String s) { return s.contains("ywq"); }
      }).count();
      // Total number of lines, so the "all:" label actually prints a value.
      Long allCount = input.count();
      System.out.println("yes:" + bCount + "  ywq:" + cCount + "  all:" + allCount);
      sc.stop();
  }
}
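For clarity, the filter/count logic above can be reproduced without Spark at all, using plain `java.util.stream` on an in-memory list. This is only an illustration of what the job computes; the sample lines are made up and stand in for the contents of `a.txt`:

```java
import java.util.List;

public class FilterCountDemo {
    public static void main(String[] args) {
        // Hypothetical sample lines standing in for the file contents.
        List<String> lines = List.of("yes it works", "ywq was here", "nothing", "yes and ywq");

        // Same logic as the Spark job: count lines matching each substring.
        long bCount = lines.stream().filter(s -> s.contains("yes")).count();
        long cCount = lines.stream().filter(s -> s.contains("ywq")).count();
        long allCount = lines.size();

        System.out.println("yes:" + bCount + "  ywq:" + cCount + "  all:" + allCount);
        // → yes:2  ywq:2  all:4
    }
}
```

The Spark version does the same thing, except the filtering and counting are distributed across partitions of the RDD instead of running on a local list.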

The pom.xml is as follows:

<dependencies>
    <dependency> <!-- Spark dependency -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

The following error occurs: Maven packages everything into a jar, but running it reports the error below. I have just started learning — can anyone explain? Thanks.

[screenshot: spark-submit reports "Error: Missing application resource."]

Solution

You also have to specify the main class when invoking spark-submit:

spark-submit --class <your.package>.SimpleApp testjar/spark-0.0.1-SNAPSHOT.jar
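As a side note (this is my addition, not part of the original answer): instead of passing `--class` every time, the maven-shade-plugin can write a `Main-Class` entry into the shaded jar's manifest via a `ManifestResourceTransformer`, which lets spark-submit locate the entry point on its own. A sketch, assuming the hypothetical package `your.package`:

```xml
<!-- Hypothetical addition inside the maven-shade-plugin <plugin> block:
     writes Main-Class into the shaded jar's MANIFEST.MF. -->
<configuration>
    <transformers>
        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>your.package.SimpleApp</mainClass>
        </transformer>
    </transformers>
</configuration>
```

With the manifest entry in place, `spark-submit testjar/spark-0.0.1-SNAPSHOT.jar` should work without the `--class` flag.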
