1. Spark RDD Operators: map
I. The SparkUtils utility class

import org.apache.spark.{SparkConf, SparkContext}

object SparkUtils {

  /** Default master URL */
  val DEFAULT_MASTER = "local[*]"

  /** Get a SparkContext whose master defaults to local[*] */
  def getSparkContext(appName: String): SparkContext = {
    // Build a SparkConf with the default master and the given application name,
    // then create the context from it (the method body was cut off in the source;
    // this is the conventional completion)
    val conf = new SparkConf().setMaster(DEFAULT_MASTER).setAppName(appName)
    new SparkContext(conf)
  }
}
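A minimal sketch of using this utility with the map operator, the transformation this article covers. The object name MapExample and the sample data are illustrative; SparkUtils is assumed to be the utility object defined above.

```scala
import org.apache.spark.SparkContext

object MapExample {
  def main(args: Array[String]): Unit = {
    // Obtain a local SparkContext through the utility above
    val sc: SparkContext = SparkUtils.getSparkContext("MapExample")

    // map applies a function to every element and returns a new RDD;
    // it is lazy, so nothing runs until an action like collect is called
    val rdd = sc.parallelize(Seq(1, 2, 3, 4))
    val doubled = rdd.map(_ * 2)

    println(doubled.collect().mkString(", "))  // prints: 2, 4, 6, 8

    sc.stop()
  }
}
```

Because map is a one-to-one transformation, the result RDD has exactly as many elements as the input and preserves partitioning.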