February 2017 Archive
Abstract: 1: Install the virtual machines and the operating system. 2: Change the time on Master and Slave so that the two machines agree. Concrete steps (as the root user): 1) yum install -y ntpdate 2) cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime cp ...
Abstract: StringBuffer sb = new StringBuffer(); sb.append(str[1] + "#XT#"); sb.append(str[2] + "#XT#"); sb.append("\n"); byte date[] = sb.toString().getBytes(); InputStream in = new ByteArrayInputStream(date); in...
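The excerpt above joins fields with a "#XT#" delimiter and exposes the result as an InputStream. Below is a minimal self-contained sketch of the same idea; the input array is illustrative, since the post's str is not shown in full.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class RecordToStream {
    public static void main(String[] args) throws IOException {
        // Illustrative input row; in the post, str presumably comes from splitting a line.
        String[] str = {"id-0", "field-1", "field-2"};

        // Join the fields with the #XT# delimiter used in the excerpt.
        StringBuffer sb = new StringBuffer();
        sb.append(str[1]).append("#XT#");
        sb.append(str[2]).append("#XT#");
        sb.append("\n");

        // Expose the assembled record as an InputStream, e.g. for an upload API.
        byte[] bytes = sb.toString().getBytes("UTF-8");
        InputStream in = new ByteArrayInputStream(bytes);

        // Read the stream back just to show that it works.
        int b;
        while ((b = in.read()) != -1) {
            System.out.print((char) b);
        }
        in.close();
    }
}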
Abstract: flume-ng agent -c conf -f conf/flume-http-logger.properties -Dflume.root.logger=DEBUG,console -n agent1 # Finally, now that we've defined all of our components, tell # agent1 which ones we want to activat...
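Assuming flume-http-logger.properties configures Flume's built-in HTTP source with its default JSON handler (the full config is not shown here), events can be posted to the agent from Java along these lines; the host and port are placeholders.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class FlumeHttpPost {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; use the host/port bound by the HTTP source
        // in flume-http-logger.properties.
        URL url = new URL("http://localhost:5140");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");

        // Flume's default JSONHandler expects a JSON array of events,
        // each with "headers" and "body".
        String payload = "[{\"headers\":{\"host\":\"Master\"},\"body\":\"hello flume\"}]";
        OutputStream out = conn.getOutputStream();
        out.write(payload.getBytes("UTF-8"));
        out.close();

        System.out.println("HTTP response code: " + conn.getResponseCode());
        conn.disconnect();
    }
}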
Abstract: 1: Unpack the archive. 2: Configure server.properties under config/: broker.id=1 port=9092 host.name=Master socket.send.buffer.bytes=1048576 socket.receive.buffer.bytes=1048576 socket.r...
Abstract: import kafka.consumer.ConsumerConfig; import kafka.consumer.KafkaStream; import kafka.javaapi.consumer.ConsumerConnector; import kafka.serializer.StringDecoder; import kafka.utils.VerifiableP...
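These imports come from the old (pre-0.9) high-level consumer API. Below is a minimal sketch of how they are typically wired together; the ZooKeeper address Master:2181, the group id and the topic name are placeholders, not values from the post.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.serializer.StringDecoder;
import kafka.utils.VerifiableProperties;

public class SimpleHighLevelConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // ZooKeeper address, group id and topic are placeholders.
        props.put("zookeeper.connect", "Master:2181");
        props.put("group.id", "test-group");
        props.put("auto.offset.reset", "smallest");

        ConsumerConnector consumer =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Ask for one stream for the topic, decoding keys and values as strings.
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put("test", 1);
        StringDecoder decoder = new StringDecoder(new VerifiableProperties());
        Map<String, List<KafkaStream<String, String>>> streams =
                consumer.createMessageStreams(topicCountMap, decoder, decoder);

        // Block on the stream and print each message as it arrives.
        ConsumerIterator<String, String> it = streams.get("test").get(0).iterator();
        while (it.hasNext()) {
            System.out.println(it.next().message());
        }
    }
}

Note that ConsumerIterator.hasNext() blocks until a message arrives, so the loop runs until the process is stopped.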
Abstract: import java.util.Properties; import kafka.javaapi.producer.Producer; import kafka.producer.KeyedMessage; import kafka.producer.ProducerConfig; import kafka.serializer.StringEncoder; import org.apache...
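These imports belong to the old kafka.javaapi.producer API. A minimal sketch of the usual setup follows; the broker address Master:9092 and the topic name are placeholders.

import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class SimpleOldProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker list and topic are placeholders for this sketch.
        props.put("metadata.broker.list", "Master:9092");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");

        Producer<String, String> producer =
                new Producer<String, String>(new ProducerConfig(props));

        // Send a handful of messages to the topic, keyed by their index.
        for (int i = 0; i < 10; i++) {
            producer.send(new KeyedMessage<String, String>("test",
                    Integer.toString(i), "message-" + i));
        }
        producer.close();
    }
}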
Abstract: try { ServerAddress serverAddress = new ServerAddress("localhost", 27017); List addrs = new ArrayList(); addrs.add(serverAddress); MongoCredential credential = MongoCredential.cre...
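The excerpt is cut off at the credential call. Here is a sketch of how the pieces usually fit together with the 3.x Java driver; the user name, auth database, password and target database are placeholders.

import java.util.ArrayList;
import java.util.List;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.client.MongoDatabase;

public class MongoConnect {
    public static void main(String[] args) {
        MongoClient client = null;
        try {
            ServerAddress serverAddress = new ServerAddress("localhost", 27017);
            List<ServerAddress> addrs = new ArrayList<ServerAddress>();
            addrs.add(serverAddress);

            // User, auth database and password are placeholders.
            MongoCredential credential =
                    MongoCredential.createCredential("user", "admin", "password".toCharArray());
            List<MongoCredential> credentials = new ArrayList<MongoCredential>();
            credentials.add(credential);

            // Connect with the address and credential lists, then touch a database.
            client = new MongoClient(addrs, credentials);
            MongoDatabase db = client.getDatabase("test");
            System.out.println("First collection: " + db.listCollectionNames().first());
        } finally {
            if (client != null) {
                client.close();
            }
        }
    }
}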
Abstract: import com.mongodb.hadoop.MongoOutputFormat import org.apache.hadoop.conf.Configuration import org.apache.spark.rdd.RDD import org.apache.spark.{SparkConf, SparkContext} import org.bson.BasicBSONObje...
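The post itself uses Scala with the mongo-hadoop connector. Purely as an illustration of the same write path (saving an RDD of BSON documents through MongoOutputFormat), here is a Java sketch; the output URI, database, collection and data are placeholders.

import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.bson.BSONObject;
import org.bson.BasicBSONObject;

import com.mongodb.hadoop.MongoOutputFormat;

import scala.Tuple2;

public class SparkToMongo {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkToMongo").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Tell the mongo-hadoop connector where to write; the URI is a placeholder.
        Configuration outputConfig = new Configuration();
        outputConfig.set("mongo.output.uri", "mongodb://localhost:27017/test.spark_out");

        // Build a tiny (key, BSON document) RDD to save; a null key lets
        // MongoDB assign the _id.
        JavaPairRDD<Object, BSONObject> docs = sc
                .parallelize(Arrays.asList("a", "b", "c"))
                .mapToPair(s -> {
                    BasicBSONObject doc = new BasicBSONObject();
                    doc.put("value", s);
                    return new Tuple2<Object, BSONObject>(null, doc);
                });

        // The path argument is not used by MongoOutputFormat but is required by the API.
        docs.saveAsNewAPIHadoopFile("file:///tmp/unused", Object.class, BSONObject.class,
                MongoOutputFormat.class, outputConfig);

        sc.stop();
    }
}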
Abstract: netstat -tnlp; jps -lm lists all Java processes and their PIDs; hdfs dfs -cat /test/2016041211 | wc -l counts the lines of a file in Hadoop; check which process holds a port: netstat -apn | grep 9527; check disk usage: df -h; check memory usage: free -m; check the size of a single directory: du -bs dir_na...
Abstract: import java.io.BufferedReader; import java.io.IOException; import java.io.InputStreamReader; import java.util.ArrayList; import java.util.Enumeration; import java.util.List; import org.apache.tools....