Spark program fails to compile with the following errors:
[INFO] Compiling 2 source files to E:\Develop\IDEAWorkspace\spark\target\classes at 1567004370534
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR]        ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR]        ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR]     val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR]                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR]     val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR]                                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR]     val sc: SparkContext = new SparkContext(sparkConf)
[ERROR]             ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR]     val sc: SparkContext = new SparkContext(sparkConf)
[ERROR]                                ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:18: error: not found: type RDD
[ERROR]     val data: RDD[String] = sc.textFile("E:\\Study\\BigData\\heima\\stage5\\2spark����\\words.txt")
[ERROR]               ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:20: error: not found: type RDD
[ERROR]     val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR]                ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:22: error: not found: type RDD
[ERROR]     val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR]                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:24: error: not found: type RDD
[ERROR]     val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR]                 ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:27: error: not found: type RDD
[ERROR]     val ascResult: RDD[(String, Int)] = result.sortBy(_._2,false) //����
[ERROR]                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR]        ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR]        ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR]     val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR]                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR]     val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR]                                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR]     val sc: SparkContext = new SparkContext(sparkConf)
[ERROR]             ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR]     val sc: SparkContext = new SparkContext(sparkConf)
[ERROR]                                ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:18: error: not found: type RDD
[ERROR]     val data: RDD[String] = sc.textFile(args(0))
[ERROR]               ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:20: error: not found: type RDD
[ERROR]     val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR]                ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:22: error: not found: type RDD
[ERROR]     val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR]                    ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:24: error: not found: type RDD
[ERROR]     val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR]                 ^
[ERROR] 21 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Cause: the problem was with the Maven local repository — most likely its path was too long and too deeply nested. The repository contents themselves were fine: after copying the same repository to the E:\Study\BigData\ directory, everything compiled normally. Because Maven could not resolve the Spark jars from the broken repository location, the compiler could not find the org.apache.spark packages, which is why every import and Spark type above fails with "object apache is not a member of package org" or "not found: type ...".
Solution:
The Spark project's Maven local repository was originally: E:\Study\BigData\heima\stage5\1scala\scala3\spark课程需要的maven仓库\SparkRepository
After changing it to E:\Study\BigData\repository, the build succeeded.
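The repository location is configured via the <localRepository> element in Maven's settings.xml (the "User settings file" that IntelliJ IDEA's Maven settings point at). A minimal sketch of the change, assuming the short path above:

```xml
<!-- settings.xml: point Maven at a short local-repository path.
     Very long or deeply nested Windows paths can break dependency
     resolution, which surfaces at compile time as
     "object apache is not a member of package org". -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <localRepository>E:\Study\BigData\repository</localRepository>
</settings>
```

After editing the file, reimport the Maven project in IDEA (or run mvn compile again) so the new repository path takes effect.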
Reposted from: https://www.cnblogs.com/mediocreWorld/p/11427088.html