Several ways to implement WordCount in Spark

Method 1: map + reduceByKey
Method 2: countByValue instead of map + reduceByKey
Method 3: aggregateByKey or foldByKey
Method 4: groupByKey + map
Method 5: native Scala implementation of WordCount
Method 6: combineByKey
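Before walking through each method in full, here is a hedged preview sketch (not the article's own code) of the core transformation behind each approach, assuming a hypothetical `words: RDD[String]` already produced by splitting the input lines:

```scala
import org.apache.spark.rdd.RDD

// Preview sketch only; `words` is an assumed RDD[String] of individual words.
def wordCountVariants(words: RDD[String]): Unit = {
  // Method 1: classic map + reduceByKey
  words.map((_, 1)).reduceByKey(_ + _)

  // Method 2: countByValue is an action; it returns a Map[String, Long] to the driver
  words.countByValue()

  // Method 3: aggregateByKey or foldByKey, each seeded with a zero value of 0
  words.map((_, 1)).aggregateByKey(0)(_ + _, _ + _)
  words.map((_, 1)).foldByKey(0)(_ + _)

  // Method 4: groupByKey, then sum the Iterable of 1s for each word
  words.map((_, 1)).groupByKey().map { case (w, ones) => (w, ones.sum) }

  // Method 5 (not shown here) uses plain Scala collections, e.g. groupBy + mapValues,
  // with no Spark at all.

  // Method 6: combineByKey with explicit createCombiner / mergeValue / mergeCombiners
  words.map((_, 1)).combineByKey(
    (v: Int) => v,
    (c: Int, v: Int) => c + v,
    (c1: Int, c2: Int) => c1 + c2
  )
}
```

The methods differ mainly in where aggregation happens: reduceByKey, aggregateByKey, foldByKey, and combineByKey combine values map-side before shuffling, while groupByKey shuffles every (word, 1) pair and countByValue collects the result to the driver.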
Method 1: map + reduceByKey

package com.cw.bigdata.spark.wordcount

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

/**
 * WordCount implementation, first approach: map + reduceByKey
 *
 * @author 陈小哥cw
 * @date 2020/7/9 9:59
 */
object WordCount1