Fixing the error where toDF cannot be called on an RDD in Spark
Add the following imports:
// Import the implicit conversions; otherwise toDF cannot be called on an RDD
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[4]").getOrCreate
import spark.implicits._
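With spark.implicits._ in scope, an RDD of tuples (or case classes) gains the toDF method, so the conversion can be done directly. A minimal sketch; the sample data and the column names "name" and "age" are illustrative:

// Build a small RDD of tuples and convert it with toDF,
// naming the columns explicitly
val peopleDF = spark.sparkContext
  .parallelize(Seq(("Alice", 29), ("Bob", 31)))
  .toDF("name", "age")

// The schema is inferred from the tuple element types
peopleDF.printSchema()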
Alternatively, use the createDataFrame method instead of toDF to convert an RDD into a DataFrame. The code is as follows:
// Define a StructType schema and convert the RDD to a DataFrame
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

val schema = StructType(Seq(
  StructField("name", StringType, true),
  StructField("age", IntegerType, true)
))

val rowRDD = sparkSession.sparkContext
  .textFile("/tmp/people.txt", 2)
  .map(x => x.split(","))
  .map(x => Row(x(0), x(1).trim().toInt))

sparkSession.createDataFrame(rowRDD, schema)
