Overview
While packaging a Scala program with sbt, I ran into the following errors:
[error] /home/hadoop/sparkapp/src/main/scala/RecommendApp.scala:25:144: value toDF is not a member of org.apache.spark.rdd.RDD[Movie]
[error] val moviesDF = moviesRDD.map(x => Movie (x.split("::")(0).toInt,x.split("::")(1).replaceAll("[0-9()]","").trim,x.split("::")(2).trim)).toDF
[error] ^
[error] /home/hadoop/sparkapp/src/main/scala/SimpleApp.scala:33:61: value toDF is not a member of org.apache.spark.rdd.RDD[Rating]
[error] val ratingsDF = ratingsRDD.map(parseRating).toDF
[error] ^
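For context (this explanation is mine, not the original author's): toDF is not declared on RDD at all. It is supplied by an implicit conversion that only comes into scope after import spark.implicits._, and implicits is a member of a concrete SparkSession value, so that value has to exist before the import. A minimal standalone sketch of the working pattern, assuming Spark 2.x with spark-sql on the classpath (the object and case class names below are made up for illustration):

import org.apache.spark.sql.SparkSession

object ToDfMinimal {
  // the case class sits at the object level so Spark can derive an Encoder for it
  case class Point(x: Int, y: Int)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("toDF minimal").master("local[*]").getOrCreate()
    import spark.implicits._   // must come after the `spark` value exists

    val df = spark.sparkContext
      .parallelize(Seq(Point(1, 2), Point(3, 4)))   // RDD[Point]
      .toDF()                                       // resolves only because of the import above
    df.show()
    spark.stop()
  }
}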
With my course instructor's guidance, two points turned out to matter when fixing this error:
1. Enable the implicit conversions inside the main function: create the SparkSession object there and import its implicits from that instance, rather than trying to enable them before the object definition (the import needs an actual SparkSession value to exist first).
2. Declare the case classes before the main function, at the object level rather than inside main.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession
// import spark.implicits._   // wrong: no SparkSession value named `spark` exists here yet

object RecommendApp {
  // case classes go before main, at the object level
  case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
  case class Movie(movie_id: Int, movie_name: String, movie_type: String)

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val spark = SparkSession.builder().getOrCreate()
    import spark.implicits._   // enable implicit conversions from this SparkSession

    // wrong: declaring the case classes inside main
    // case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
    // case class Movie(movie_id: Int, movie_name: String, movie_type: String)
  }
}
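With this layout, the two toDF calls from the error output compile. A sketch of how they might look inside main after the import (the data file paths and the parseRating helper are my assumptions; the original post does not show them):

// inside main, after `import spark.implicits._`
// hypothetical MovieLens-style input paths; adjust to your own data
val moviesRDD  = sc.textFile("file:///home/hadoop/sparkapp/data/movies.dat")
val ratingsRDD = sc.textFile("file:///home/hadoop/sparkapp/data/ratings.dat")

// hypothetical parser matching the parseRating call in the error message
def parseRating(line: String): Rating = {
  val f = line.split("::")
  Rating(f(0).toInt, f(1).toInt, f(2).toFloat, f(3).toLong)
}

val moviesDF = moviesRDD
  .map(x => Movie(x.split("::")(0).toInt,
                  x.split("::")(1).replaceAll("[0-9()]", "").trim,
                  x.split("::")(2).trim))
  .toDF()
val ratingsDF = ratingsRDD.map(parseRating).toDF()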
Finally
That is the whole story of this Spark pitfall: value toDF is not a member of org.apache.spark.rdd.RDD. I hope it helps you resolve the same error in your own program.