Abstract:
# Sorting can be done by calling the sortByKey() function directly
from pyspark import SparkContext
sc = SparkContext('local', 'Sort')
list = ["7", "4", "8", "2", "5"]
textFile = sc.parallelize(list
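The excerpt is cut off after `sc.parallelize(list`; presumably the full post keys each element and calls `sortByKey()`. A plain-Python sketch of the same computation (no Spark installation required; the numeric keying is an assumption, not from the excerpt):

```python
data = ["7", "4", "8", "2", "5"]

# In Spark this would roughly be:
#   sc.parallelize(data).map(lambda x: (int(x), x)).sortByKey().collect()
pairs = [(int(x), x) for x in data]      # key each string by its numeric value (assumed)
result = [v for _, v in sorted(pairs)]   # sortByKey() equivalent: sort by the key
print(result)  # -> ['2', '4', '5', '7', '8']
```

`sortByKey()` in Spark sorts an RDD of (key, value) pairs by key across partitions; `sorted()` over the local pair list reproduces that ordering for a single-machine illustration.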
posted @ 2018-08-01 12:15
Bean_zheng
Abstract:
# Spark with Python
# import the pyspark library
from pyspark import SparkContext
# configure the SparkContext
sc = SparkContext('local', 'wordcount')
# create a new RDD by loading a local file
textFile = sc.te
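The excerpt breaks off at `sc.te` (presumably `sc.textFile(...)`), so the rest of the wordcount is not shown. A plain-Python sketch of the computation a Spark wordcount performs, with an assumed in-memory string standing in for the local file:

```python
from collections import Counter

# Stand-in for the contents of the local file loaded via sc.textFile (assumed text)
text = "hello spark hello world"

# The Spark pipeline would roughly be:
#   textFile.flatMap(lambda line: line.split(" "))
#           .map(lambda word: (word, 1))
#           .reduceByKey(lambda a, b: a + b)
# Counter performs the same split-map-reduce over a local string.
counts = Counter(text.split())
print(dict(counts))  # -> {'hello': 2, 'spark': 1, 'world': 1}
```

The `flatMap`/`map`/`reduceByKey` chain is the canonical Spark wordcount shape; `Counter` collapses those three stages into one call for a single-machine illustration.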
posted @ 2018-08-01 11:01
Bean_zheng
