Summary:
1. Classification of Spark operators. 2. Two ways to create an RDD. 3. Advanced Spark Python operators: 1. mapPartitions; 2. mapPartitionsWithIndex, which is similar to mapPartitions but also provides the function with an integer value representing the index of the partition.
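The summary only names the techniques, so here is a minimal PySpark sketch, assuming a local SparkContext and small in-memory data (the HDFS path shown is hypothetical), illustrating the two ways to create an RDD and the two partition-level operators:

```python
from pyspark import SparkContext

sc = SparkContext("local[2]", "operator-demo")

# Way 1: create an RDD from an in-memory collection, split into 2 partitions.
rdd = sc.parallelize([1, 2, 3, 4, 5, 6], numSlices=2)
# Way 2: create an RDD from an external file (path is illustrative only):
# rdd = sc.textFile("hdfs:///path/to/input.txt")

# mapPartitions: the function receives an iterator over one whole partition
# and returns an iterable, so per-partition setup cost is paid only once.
def sum_partition(it):
    yield sum(it)

print(rdd.mapPartitions(sum_partition).collect())        # e.g. [6, 15]

# mapPartitionsWithIndex: like mapPartitions, but the function also receives
# the partition index as its first argument.
def tag_partition(index, it):
    return ((index, x) for x in it)

print(rdd.mapPartitionsWithIndex(tag_partition).collect())
# e.g. [(0, 1), (0, 2), (0, 3), (1, 4), (1, 5), (1, 6)]

sc.stop()
```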
posted @ 2017-03-10 12:48 willian_zhang