Abstract: There are two ways to create a context in Spark SQL: SQLContext and HiveContext.
scala> import org.apache.spark.sql._
scala> val sqlContext = new SQLContext(sc)
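The two constructions mentioned in the abstract can be sketched side by side. This is a minimal sketch for Spark 1.x, assuming it runs inside a spark-shell session where the SparkContext `sc` is already defined:

```scala
// Assumes a Spark 1.x spark-shell, where `sc` (SparkContext) is predefined.
import org.apache.spark.sql._
import org.apache.spark.sql.hive.HiveContext

// Way 1: plain SQLContext — basic Spark SQL support.
val sqlContext = new SQLContext(sc)

// Way 2: HiveContext — a superset of SQLContext that adds HiveQL
// syntax and access to a Hive metastore.
val hiveContext = new HiveContext(sc)

// Either context can then execute SQL, e.g.:
// sqlContext.sql("SELECT 1").show()
```

In Spark 2.x and later, both entry points are superseded by `SparkSession`, which unifies them.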
posted @ 2016-03-17 16:23 回家的流浪者