February 2019 Archives

Abstract: Start the broker: [root@node1 kafka]# ./bin/kafka-server-start.sh -daemon config/server.properties Create a topic: ./bin/kafka-topics.sh --create --zookeeper 192.168.23.101:218… Read more
posted @ 2019-02-28 22:05 VIP8cnbl
Abstract: With Kafka's delete-topic property left unconfigured, running the delete command ./bin/kafka-topics.sh --delete --zookeeper 192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183 --topic test, afterwards the current top… Read more
posted @ 2019-02-28 22:04 VIP8cnbl
Abstract: Common commands. 1. Create a topic: ./bin/kafka-topics.sh --create --zookeeper 192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183 --replication-factor 3 --partitions… Read more
posted @ 2019-02-28 22:02 VIP8cnbl
Abstract: Tools: IntelliJ IDEA; Scala version 2.10.6 <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-streaming_2.10</artifactId> <version>1.6.0</version> </depend… Read more
posted @ 2019-02-28 22:01 VIP8cnbl
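The dependency snippet in the abstract above is cut off mid-tag; completed as a standard Maven entry using only the coordinates the post itself names, it would presumably read:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
```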
Abstract: Original content; do not repost or copy without permission. def main(args: Array[String]): Unit = { val conf = new SparkConf() conf.set("spark.master", "local") conf.set("spark.app.name", "spar… Read more
posted @ 2019-02-27 21:44 VIP8cnbl
Abstract: [root@node3 ~]# cd /usr/local/kafka/ [root@node3 kafka]# ./bin/kafka-server-start.sh -daemon config/server.properties [root@node3 kafka]# jps 2944 Quorum… Read more
posted @ 2019-02-27 14:34 VIP8cnbl
Abstract: package sparkSql.方法1创建DataFrame import org.apache.spark.sql.{SQLContext, SaveMode} import org.apache.spark.{SparkConf, SparkContext} object Student { def… Read more
posted @ 2019-02-25 22:34 VIP8cnbl
Abstract: package sparkSql.方法1创建DataFrame; import org.apache.spark.SparkConf; import org.apache.spark.api.java.JavaRDD; import org.apache.spark.api.java.JavaSparkC… Read more
posted @ 2019-02-25 22:04 VIP8cnbl
Abstract: package com.day09 import org.apache.spark.rdd.RDD import org.apache.spark.{SparkConf, SparkContext} /** * Requirement: within a given time range, compute how long each user stayed at all base stations */ object JzDemo { de… Read more
posted @ 2019-02-21 21:34 VIP8cnbl
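The JzDemo post computes per-user dwell time at base stations with Spark RDDs; since the post is truncated, the core signed-timestamp trick can be sketched without Spark in plain Java. The log format here (user,timestamp,station,eventType with 1 = connect, 0 = disconnect, epoch-second timestamps) is an assumption for illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class DwellTime {
    // Sums dwell seconds per "user_station" key: a connect event (type 1)
    // contributes -timestamp and a disconnect event (type 0) contributes
    // +timestamp, so each connect/disconnect pair collapses to its duration.
    public static Map<String, Long> dwellPerUserStation(String[] lines) {
        Map<String, Long> totals = new HashMap<>();
        for (String line : lines) {
            String[] f = line.split(",");   // user,timestamp,station,eventType
            String key = f[0] + "_" + f[2];
            long ts = Long.parseLong(f[1]);
            totals.merge(key, "1".equals(f[3]) ? -ts : ts, Long::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        String[] logs = {
            "u1,100,stationA,1",   // u1 connects to stationA at t=100
            "u1,160,stationA,0",   // u1 leaves stationA at t=160 -> 60 s
            "u1,200,stationB,1",
            "u1,230,stationB,0",   // 30 s at stationB
        };
        System.out.println(dwellPerUserStation(logs));
    }
}
```

In the Spark version this same merge would be a `reduceByKey` over (key, signedTimestamp) pairs.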
Abstract: package student import java.net.URL import org.apache.spark.rdd.RDD import org.apache.spark.{SparkConf, SparkContext} object StuObject { def main(args: Ar… Read more
posted @ 2019-02-21 21:05 VIP8cnbl
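The StuObject post pairs java.net.URL with Spark RDDs; a common shape for this kind of exercise is counting access-log hits per site by parsing the host out of each URL, which can be sketched without Spark. The sample URLs and the per-host grouping are assumptions, since the post is truncated:

```java
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

public class HostCount {
    // Tallies one hit per URL, keyed by the URL's host name.
    public static Map<String, Integer> countByHost(String[] urls) {
        Map<String, Integer> counts = new HashMap<>();
        for (String u : urls) {
            counts.merge(URI.create(u).getHost(), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] urls = {
            "http://java.example.com/course/1",
            "http://java.example.com/course/2",
            "http://php.example.com/course/1",
        };
        System.out.println(countByHost(urls));
    }
}
```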
Abstract: package ipAndAccress import org.apache.spark.rdd.RDD import org.apache.spark.{SparkConf, SparkContext} import scala.io.Source case object EtlIp { def main… Read more
posted @ 2019-02-21 20:47 VIP8cnbl
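The EtlIp post joins access logs against an IP-location table. Two typical pieces of that pipeline, converting a dotted-quad IP to a comparable long and binary-searching sorted ranges, can be sketched in plain Java; the range-table layout and lookup signature are assumptions, since the post is truncated:

```java
public class IpUtil {
    // Packs a dotted-quad IPv4 address into a long so ranges can be compared.
    public static long ip2Long(String ip) {
        long result = 0L;
        for (String octet : ip.split("\\.")) {
            result = (result << 8) | Long.parseLong(octet);  // shift in each octet
        }
        return result;
    }

    // Binary-searches sorted, non-overlapping [start, end] ranges; returns the
    // index of the range containing ip, or -1 if no range matches.
    public static int findRange(long[][] ranges, long ip) {
        int lo = 0, hi = ranges.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) / 2;
            if (ip < ranges[mid][0]) hi = mid - 1;
            else if (ip > ranges[mid][1]) lo = mid + 1;
            else return mid;
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(ip2Long("192.168.0.1"));  // 3232235521
    }
}
```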
Abstract: /usr/local/spark-1.6.3/bin/spark-submit \ > --class org.apache.spark.examples.SparkPi \ > --master spark://node1:7077 \ > --executor-memory 521m \ > --tot… Read more
posted @ 2019-02-19 21:08 VIP8cnbl