Summary:
pyspark RDD data persistence — from pyspark import SparkContext, SparkConf; conf = SparkConf().setAppName("miniProject").setMaster("loc... Read more
posted @ 2022-08-19 22:54
luoganttcc
Views (10)
Comments (0)
Recommended (0)
Summary:
pyspark connecting to MySQL — 1: download mysql-connector and place it under jars; 2: configure the EXTRA_SPARK_CLASSPATH environment variable in spark-env.sh; 3: export SPARK_CLASSPATH=/opt/s... Read more
posted @ 2022-08-19 22:54
luoganttcc
Views (8)
Comments (0)
Recommended (0)
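Spelled out, the classpath steps in this summary would look roughly like the following sketch; the jar filename and the use of $SPARK_HOME are assumptions, since the summary truncates the actual path at /opt/s...:

```shell
# 1: download the MySQL JDBC driver and drop it under Spark's jars directory
# (jar version is illustrative)
cp mysql-connector-java-8.0.28.jar "$SPARK_HOME/jars/"

# 2/3: in conf/spark-env.sh, point SPARK_CLASSPATH at the driver jar
export SPARK_CLASSPATH="$SPARK_HOME/jars/mysql-connector-java-8.0.28.jar"
```

After this setup, pyspark can load the MySQL table through Spark's JDBC data source.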
Summary:
import happybase  # connect; connection = happybase.Connection('localhost'); c... Read more
posted @ 2022-08-19 22:54
luoganttcc
Views (17)
Comments (0)
Recommended (0)
Summary:
Operating HBase from Python — import happybase  # connect; connection = happybase.Connection('localhost'); connection.open()  # create a table; con... Read more
posted @ 2022-08-19 22:54
luoganttcc
Views (8)
Comments (0)
Recommended (0)
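The happybase calls in this summary can be assembled into a short sketch; it requires a running HBase Thrift server on localhost, and the table and column-family names are illustrative placeholders:

```python
import happybase

# Connect to the HBase Thrift server (assumed to be listening on localhost:9090)
connection = happybase.Connection('localhost')
connection.open()

# Create a table with a single column family (names are illustrative)
connection.create_table('demo_table', {'cf': dict()})

# Write one cell and read the row back; happybase keys and values are bytes
table = connection.table('demo_table')
table.put(b'row1', {b'cf:col': b'value'})
row = table.row(b'row1')
connection.close()
```

`happybase.Connection` speaks the HBase Thrift protocol, so the Thrift service must be started on the HBase side before this will connect.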
Summary:
Submitting a job to a Spark standalone cluster — documentation link — # switch to the Spark install directory and run the command below; 192.168.0.10 is the master's IP, examples/src/main/python/pi.py is the Python file's ... Read more
posted @ 2022-08-19 22:54
luoganttcc
Views (6)
Comments (0)
Recommended (0)
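Based on the truncated summary, the submit command would look roughly like this; the IP and example script come from the summary, while port 7077 is an assumption (Spark's standalone-master default):

```shell
# Run from the Spark install directory; 192.168.0.10 is the master's IP
./bin/spark-submit \
  --master spark://192.168.0.10:7077 \
  examples/src/main/python/pi.py
```

The `spark://host:port` URL tells spark-submit to schedule the job on the standalone cluster rather than running it locally.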
