Summary: 1. Prepare the data file employee.txt:
1001,Gong Shaocheng,1
1002,Li Dachao,1
1003,Qiu Xin,1
1004,Cheng Jiangzhong,2
1005,Wo Binggang,3
Put the data into HDFS: [root@jfp3-1 spark-studio]# hdf...
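The data-preparation step above can be sketched as a short shell session. The HDFS target directory is an assumption, since the original command is truncated:

```shell
# Recreate the sample data file from the post (id,name,department-id).
cat > employee.txt <<'EOF'
1001,Gong Shaocheng,1
1002,Li Dachao,1
1003,Qiu Xin,1
1004,Cheng Jiangzhong,2
1005,Wo Binggang,3
EOF

# Copy it into HDFS (requires a running cluster; the target directory
# /user/root/spark-studio is illustrative, not from the post):
# hdfs dfs -mkdir -p /user/root/spark-studio
# hdfs dfs -put employee.txt /user/root/spark-studio/
```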
Summary: 1. Install the JDK RPM on all three nodes. 2. Install the HADOOP-1.2.1 RPM on all three nodes. The directory layout after an RPM install differs somewhat from a .tar.gz install, and there is no need to set the HADOOP_HOME environment variable afterwards.
[root@server-914 usr]# whereis hadoop
hadoop: /usr/bin/hadoop /etc/hadoop /usr/etc/hadoop /usr/include/hadoop /usr/share/hadoop
The executable is /usr/bin/hadoop; the configuration files that used to live under conf/ are under /etc/hadoop; /usr/etc/hadoop points to /etc/hadoo...
Summary: 1. Read through http://spark.incubator.apache.org/docs/latest/spark-standalone.html. 2. Install Spark to /opt/spark on every machine. 3. Start the Spark master on the first machine:
[root@jfp3-1 latest]# ./sbin/start-master.sh
Check the log in the logs directory:
[root@jfp3-1 latest]# tail -100f logs/spark-root-org.apache.spark.deploy.master.Master-1-jfp3-1.out
Spark Command: /usr/ja...
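After the master is up, the standalone docs have each worker node attach to it by master URL. A minimal sketch, assuming the default master port 7077 (the post does not show the port, and the exact worker-start invocation varied across early Spark versions):

```shell
# Build the standalone master URL from the host the post starts the
# master on (jfp3-1); port 7077 is the standalone-mode default and is
# an assumption here.
MASTER_HOST=jfp3-1
SPARK_MASTER_URL="spark://${MASTER_HOST}:7077"
echo "$SPARK_MASTER_URL"

# On each worker node, a worker would then be attached to the master
# (requires a running cluster, so left commented out):
# /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker "$SPARK_MASTER_URL"
```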
Summary: 1. Download and install Scala (version 2.10.3); set the SCALA_HOME and PATH environment variables. 2. Download the Spark 0.9.0 source code and extract it to /root/Downloads/spark-0.9.0-incubating:
http://www.apache.org/dyn/closer.cgi/incubator/spark/spark-0.9.0-incubating/spark-0.9.0-incubating.tgz
Note that a pre-built package is also available; see: http://www.apache.org/dyn/closer.cgi/incubator/spark/spark-0.9.0-incubatin...
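Step 1's environment variables can be set as below; the install path /opt/scala-2.10.3 is an assumption, since the post does not give one:

```shell
# Assumed install location; adjust to wherever scala-2.10.3 was unpacked.
export SCALA_HOME=/opt/scala-2.10.3
# Prepend the Scala bin directory so scala/scalac resolve on the PATH.
export PATH="$SCALA_HOME/bin:$PATH"
```

These lines would typically go into /etc/profile or ~/.bashrc so they survive a new login shell.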
Summary: 1. Install mvn. 2. Download the source code. 3. Build with mvn package. During the build, clojars.org was unreachable; I mirrored clojars.org through a private repository and changed the dependency address in pom.xml. Rerunning mvn package surfaced another problem:
java.lang.RuntimeException: Pipe to subprocess seems to be broken! No output read.
Shell Process Exception:
/tmp/a3a2aead-499f-4f93-8390-0650f2d75d0f/supervisor/stormdist/test-1-139...
Summary: 0. Set up an FTP server and create a yum repository. 1. Install Java on every node and set the environment variables. 2. Install ZooKeeper on all three nodes. 3. While installing ZeroMQ, running ./configure failed with:
configure: error: no acceptable C compiler found in $PATH
Fixed by running: yum install gcc-c++
configure: error: cannot link with -luuid, install uuid-dev.
Fixed by running: yum install libuuid-devel
4. While installing jzmq, running autogen.sh failed with: au...
Summary: Overview: four virtual machines on one physical host. The Hadoop cluster has 4 nodes, 1 Master and 3 Slaves, with IPs distributed as:
10.10.96.33 hadoop1 (Master)
10.10.96.59 hadoop2 (Slave)
10.10.96.65 hadoop3 (Slave)
10.10.96.64 hadoop4 (Slave)
The operating system is Red Hat Enterprise Linux Server release 6.4, GNU/Linux 2.6.32. The Master machine runs the NameNode and JobTracker roles, managing the distributed data and dispatching the decomposed tasks; the 3 S...
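A layout like the one above is usually mirrored into /etc/hosts on every node so the machines can resolve each other by hostname; a sketch, with the entries taken from the summary:

```
10.10.96.33 hadoop1
10.10.96.59 hadoop2
10.10.96.65 hadoop3
10.10.96.64 hadoop4
```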
Summary: 1. Create a test table in HBase:
hbase(main):003:0> create 'test_hive_over_hbase','f'
0 row(s) in 2.5810 seconds
hbase(main):004:0> put 'test_hive_over_hbase','1001','f:DATA','2012|shaochen'
0 row(s) in 0.2010 seconds
hbase(main):005:0> put 'test_hive_over_hbase...
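The truncated summary stops at the HBase side; querying this table from Hive is normally done by mapping it through the HBaseStorageHandler. A sketch, assuming the hive CLI and the hive-hbase-handler jar are available (the Hive column names here are illustrative, not from the post):

```shell
# Map the existing HBase table into Hive so it can be queried with SQL.
# ':key' binds the Hive 'key' column to the HBase row key; 'f:DATA'
# binds the 'data' column to the column family/qualifier used above.
# Requires running Hive and HBase services, so shown as a sketch.
hive -e "
CREATE EXTERNAL TABLE test_hive_over_hbase(key STRING, data STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:DATA')
TBLPROPERTIES ('hbase.table.name' = 'test_hive_over_hbase');
"
```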
Summary: Pseudo-distributed single-node install; running the pi example failed:
[root@server-518 ~]# ./bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar pi 5 10
Error output:
Number of Maps = 5
Samples per Map = 10
13/12/10 11:04:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where...
Summary: 1. Download and install Java 7:
[root@server-518 ~]# rpm -ivh jdk-7u40-linux-x64.rpm
Preparing...                ########################################### [100%]
   1:jdk                    ########################################### [100%]
Unpacking JAR files... rt.jar... jsse.jar... charsets.jar...
Summary: 1. On the second node, repeat steps 1-5 of http://www.cnblogs.com/littlesuccess/p/3361497.html. 2. On the first node, change the replication factor in hdfs-site.xml to 3:
[root@server-305 ~]# vim /opt/hadoop/etc/hadoop/hdfs-site.xml
dfs.replication 3
3. On the first node, set the YARN ResourceManager address in yarn-site.xml:
[root@server-306 hadoop]# vi yarn-site.xml
yarn.resour...
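The replication change in step 2 corresponds to a property block like the following inside the <configuration> element of hdfs-site.xml (a sketch of the standard Hadoop property format):

```xml
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```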
Summary: 1. Download HBase from hbase.apache.org and install it to /home/shgong/hbase. 2. Set up the Java home: 2.1. Change the working directory to /home/shgong/hbase/conf/. 2.2. Edit hbase-env.sh:
# export JAVA_HOME=/usr/java/jdk1.6.0/
export JAVA_HOME=/home/shgong/jdk/
3. Set up the HBase data file directory: 3.1. Make a directory: mk...
Summary: Problem: for date strings in Hive (yyyy-mm-dd HH:MM:SS.ffffffff), a DateFormatException is always thrown.
Analysis:
Resolution:
Summary: Problem:
java.lang.NumberFormatException
    at java.math.BigDecimal.&lt;init&gt;(BigDecimal.java:459)
    at java.math.BigDecimal.&lt;init&gt;(BigDecimal.java:728)
    at CDR_D_DETAIL_LUC.__loadFromFields(CDR_D_DETAIL_LUC.java:9803)
    at CDR_D_DETAIL_LUC.parse(CDR_D_DETAIL_LUC.java:9630)
    at...
Summary: -----------------install hadoop----------------------
1. Download Java 6: jdk-6u34-linux-i586.bin
2. Install Java:
   chmod 777 jdk-6u34-linux-i586.bin
   ./jdk-6u34-linux-i586.bin
3. Download and install Hadoop.
4. Set environment variables:
   sudo gedit /etc/environment
   export JAVA_HOME=/home/shgong/dev/jdk1...
Summary: Problem: there is an incompatibility issue when importing widgets data from MySQL with Sqoop.
Solution: switch to Hadoop 1.0.1, the version the Hadoop guide says its code has been tested against.