Converting a VMware VM to a Hyper-V VM

The main task is converting the disk format: use StarWind V2V Image Converter, select the VMware disk (VMDK), and convert it to the Hyper-V disk format.

Because the original VM runs Hadoop, Spark, Hive, MySQL, Tomcat, HBase, and Docker, several configurations have to be updated after the move (example snippets follow the list):

1. Update the IP address inside the converted disk: edit the IP settings in /etc/network/interfaces, then restart the interface with sudo ifconfig eth0 down and sudo ifconfig eth0 up.

2. Update the IPs of the Hadoop hosts (master, slaves, etc.): sudo vim /etc/hosts

3. Update the IP in Spark's environment config: vim spark-1.6.0-bin-hadoop2.6/conf/spark-env.sh

4. Update the MySQL IP in Hive's config: sudo vim hive-site.xml
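
For reference, the edits in steps 1-4 usually look like the following; all 192.168.137.x addresses and the slave hostnames are placeholders for whatever the new Hyper-V virtual switch assigns (only the hostname "hadoop" comes from the commands below):

/etc/network/interfaces (static address example):
auto eth0
iface eth0 inet static
    address 192.168.137.10
    netmask 255.255.255.0
    gateway 192.168.137.1

/etc/hosts (cluster hostnames; slave names are examples):
192.168.137.10  hadoop
192.168.137.11  slave1
192.168.137.12  slave2

spark-env.sh (whichever variable holds the old address, e.g.):
export SPARK_MASTER_IP=192.168.137.10

hive-site.xml (point the metastore connection at the new MySQL address):
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.137.10:3306/hive</value>
</property>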

~/hadoop/sbin/start-dfs.sh
~/hadoop/sbin/start-yarn.sh
~/spark-1.6.0-bin-hadoop2.6/sbin/start-history-server.sh hdfs://hadoop:9000/user/spark/eventlog
~/hive-2.1.0/bin/hive  --service metastore &
The above start DFS, YARN, the Spark history server, and the Hive metastore.
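
To confirm the daemons came back up with the new addresses, a quick check (assuming the same ~/hadoop layout as above):

jps
~/hadoop/bin/hdfs dfsadmin -report

jps lists the running HDFS/YARN Java processes, and dfsadmin -report shows whether the DataNodes registered with the new IPs.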

---------------------------------------------------------------

Launching Spark applications with SparkLauncher

java -jar ~/demo/mysparklauncher.jar yarn ~/spark-1.6.0-bin-hadoop2.6/ wordcount  ~/demo/myspark.jar hdfs://hadoop:9000/user/hdfs/log_kpi/log.txt

java -jar ~/demo/mysparklauncher.jar yarn ~/spark-1.6.0-bin-hadoop2.6/ OnlineBlackListFilter  ~/demo/myspark.jar 

java -jar ~/demo/mysparklauncher.jar yarn ~/spark-1.6.0-bin-hadoop2.6/ com.myspark.test.wordcount ~/demo/spark.jar
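
mysparklauncher.jar itself is not shown in these notes; a minimal sketch of such a wrapper built on org.apache.spark.launcher.SparkLauncher, assuming the argument order used above (master, SPARK_HOME, main class, application jar, then application arguments), could look like this:

import java.util.Arrays;
import org.apache.spark.launcher.SparkLauncher;

public class MySparkLauncher {
    public static void main(String[] args) throws Exception {
        // args[0] = master ("yarn"), args[1] = SPARK_HOME, args[2] = main class,
        // args[3] = application jar, args[4..] = arguments passed to the application
        SparkLauncher launcher = new SparkLauncher()
                .setMaster(args[0])
                .setSparkHome(args[1])
                .setMainClass(args[2])
                .setAppResource(args[3]);
        if (args.length > 4) {
            launcher.addAppArgs(Arrays.copyOfRange(args, 4, args.length));
        }
        Process spark = launcher.launch();  // spawns spark-submit under the given SPARK_HOME
        spark.waitFor();                    // block until the submitted application finishes
    }
}

launch() starts spark-submit as a child process from the given SPARK_HOME, which is why the second argument in the commands above points at the Spark installation directory.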

-----------------------
In the Spark history server, the application ID prefix indicates how the job was run:
application_1** — applications submitted to YARN
app-2016** — applications run on the standalone master/worker
local-14** — applications run in local mode
