SparkSQL pitfall
Today, while using SparkSQL, I ran into the following error:

It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName'.

According to material I found online, this happens because the SparkSession is wrapped inside an actor, so each actor holds its own dedicated SparkSession. Some of the SQL statements save data to Hive and HDFS, so once one session overwrites the underlying files, the table metadata cached by the other sessions becomes stale and triggers this error until the table cache is refreshed.
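The fix the error message itself suggests is to invalidate the cached metadata before reading the table again, either through the catalog API or the SQL `REFRESH TABLE` statement. Below is a minimal sketch of that approach; the application name and the table name `my_db.my_table` are placeholders for illustration, not names from my actual job.

```scala
import org.apache.spark.sql.SparkSession

object RefreshTableExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("refresh-table-example") // placeholder app name
      .enableHiveSupport()
      .getOrCreate()

    // After another session/actor has overwritten the underlying files on HDFS,
    // drop the stale cached metadata for the table before reading it again.
    spark.catalog.refreshTable("my_db.my_table") // placeholder table name

    // Equivalent SQL form, matching the hint in the error message:
    spark.sql("REFRESH TABLE my_db.my_table")

    // Subsequent reads now pick up the new file listing instead of the stale cache.
    val df = spark.table("my_db.my_table")
    df.show()

    spark.stop()
  }
}
```

Calling the refresh right after any write that replaces files (for example `INSERT OVERWRITE`), or right before the read in the session that consumes the table, avoids the stale-cache failure when several SparkSessions touch the same Hive/HDFS data.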