September 2020 Archive

Abstract: package spark.demo object MyApp2 { // partial application def msg(from: String, to: String, text: String) = s"($from -> $to): $text" def main(args: Array[String]): Read more
posted @ 2020-09-25 16:35 初入门径 Views(614) Comments(0) Recommendations(0)
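The partial application the abstract demonstrates can be sketched in full as below; only `msg` and `MyApp2` come from the snippet, while the `fromAlice` name and the sample arguments are illustrative:

```scala
object MyApp2 {
  // A three-argument method that formats a message
  def msg(from: String, to: String, text: String): String =
    s"($from -> $to): $text"

  def main(args: Array[String]): Unit = {
    // Partially apply msg: fix `from`, leave the other two parameters open
    val fromAlice: (String, String) => String = msg("Alice", _: String, _: String)
    println(fromAlice("Bob", "hello")) // (Alice -> Bob): hello
  }
}
```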
Abstract: package spark.demo object MyApp { def main(args: Array[String]): Unit = { // function currying val multiFunc = (a: Int, b: Int) => a * b val multiFuncCurried = (a: Read more
posted @ 2020-09-25 16:33 初入门径 Views(189) Comments(0) Recommendations(0)
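A self-contained sketch of the currying shown above, assuming the truncated `multiFuncCurried` is obtained via `Function2.curried` (the object name is illustrative):

```scala
object CurryDemo {
  // An ordinary two-argument function literal
  val multiFunc: (Int, Int) => Int = (a: Int, b: Int) => a * b

  // Its curried form: takes one argument at a time
  val multiFuncCurried: Int => Int => Int = multiFunc.curried

  def main(args: Array[String]): Unit = {
    println(multiFunc(3, 4))        // 12
    println(multiFuncCurried(3)(4)) // 12
    // Fixing the first argument yields a reusable one-argument function
    val triple = multiFuncCurried(3)
    println(triple(5))              // 15
  }
}
```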
Abstract: scala> val add = (x: Int, y: Int) => x + y add: (Int, Int) => Int = <function2> scala> // add: (Int, Int) => Int = <function2> scala> scala> val addCu Read more
posted @ 2020-09-25 16:09 初入门径 Views(92) Comments(0) Recommendations(0)
Abstract: ######################### Higher-order functions ######################### scala> def sum(x: Int) = x + 1 sum: (x: Int)Int scala> println(sum(5)) 6 Lambda expressions scala> val Read more
posted @ 2020-09-24 21:56 初入门径 Views(138) Comments(0) Recommendations(0)
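The higher-order-function idea this abstract introduces can be filled out as a runnable sketch; `sum` is the abstract's function (which adds 1 despite its name), while `applyTwice` and `double` are illustrative names not in the original:

```scala
object HigherOrderDemo {
  // The abstract's function: adds 1 to its argument
  def sum(x: Int): Int = x + 1

  // A higher-order function: takes another function as a parameter
  def applyTwice(f: Int => Int, x: Int): Int = f(f(x))

  // A lambda expression bound to a value
  val double: Int => Int = x => x * 2

  def main(args: Array[String]): Unit = {
    println(sum(5))                // 6
    println(applyTwice(sum, 5))    // 7
    println(applyTwice(double, 5)) // 20
  }
}
```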
Abstract: scala> val number = Seq(20, 40, 60) number: Seq[Int] = List(20, 40, 60) scala> val numbers = (x: Int) => x * 2 numbers: Int => Int = <function1> scala Read more
posted @ 2020-09-24 21:33 初入门径 Views(115) Comments(0) Recommendations(0)
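The abstract is cut off just before the function is applied to the sequence; a plausible completion, assuming the point is to pass `numbers` to `map` (the object name is illustrative):

```scala
object MapDemo {
  val number: Seq[Int] = Seq(20, 40, 60)
  val numbers: Int => Int = (x: Int) => x * 2

  def main(args: Array[String]): Unit = {
    // Pass the function value to map to transform every element
    val doubled = number.map(numbers)
    println(doubled) // List(40, 80, 120)
  }
}
```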
Abstract: scala> val f: String => Int = arg => arg.toInt * 2 f: String => Int = <function1> scala> f res0: String => Int = <function1> scala> f("5") res1: Int = Read more
posted @ 2020-09-24 20:23 初入门径 Views(176) Comments(0) Recommendations(0)
Abstract: package spark.demo object Demo { def main(args: Array[String]) { val m: Map[Int, String] = Map(3 -> "Python", 1 -> "Java", 2 -> "Scala", 6 -> "SQL") / Read more
posted @ 2020-09-23 21:35 初入门径 Views(1673) Comments(0) Recommendations(0)
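The code above is cut off right after the Map literal; judging by the unordered keys, it likely goes on to order the map. A sketch of the two common orderings (the sorting choice is an assumption; only `Demo` and `m` come from the snippet):

```scala
import scala.collection.immutable.TreeMap

object Demo {
  val m: Map[Int, String] = Map(3 -> "Python", 1 -> "Java", 2 -> "Scala", 6 -> "SQL")

  // Sort by key: a TreeMap keeps its entries in ascending key order
  val byKey: TreeMap[Int, String] = TreeMap(m.toSeq: _*)

  // Sort by value: convert to a sequence of pairs first
  val byValue: Seq[(Int, String)] = m.toSeq.sortBy(_._2)

  def main(args: Array[String]): Unit = {
    println(byKey.keys.toList) // List(1, 2, 3, 6)
    println(byValue.map(_._2)) // values in alphabetical order
  }
}
```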
Abstract: [root@centos00 ~]$ jps 7174 Kafka 7502 Jps [root@centos00 ~]$ PIDS=$(jps -lm | grep -i 'kafka.Kafka' | awk '{print $1}') [root@centos00 ~]$ kill -9 ${P Read more
posted @ 2020-09-22 15:32 初入门径 Views(504) Comments(0) Recommendations(0)
Abstract: [root@centos00 ~]$ ./stop-kafka.sh -bash: ./stop-kafka.sh: /bin/sh^M: bad interpreter: No such file or directory Fix: convert the script's line endings from Windows (CR LF) to Unix (LF) Read more
posted @ 2020-09-21 22:13 初入门径 Views(1112) Comments(0) Recommendations(0)
Abstract: Basic structure of a higher-order function: def methodName(functionName: (function's parameter types) => function's return type) { body } Example: package com.spark.demo object HigherOrderFunction { def main(args: Array[String]): Unit = Read more
posted @ 2020-09-18 16:47 初入门径 Views(512) Comments(0) Recommendations(0)
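The structure described above, filled in as a runnable sketch; the `calc` method, its function parameter, and the sample arguments are illustrative, not from the truncated original:

```scala
object HigherOrderFunction {
  // A method whose parameter is itself a function of type (Int, Int) => Int
  def calc(op: (Int, Int) => Int, a: Int, b: Int): Int = op(a, b)

  def main(args: Array[String]): Unit = {
    // Pass different function literals as the argument
    println(calc((x, y) => x + y, 3, 4)) // 7
    println(calc(_ * _, 3, 4))           // 12
  }
}
```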
Abstract: /* Navicat MySQL Data Transfer Source Server : localhost Source Server Version : 50527 Source Host : localhost:3306 Source Database : mydb Target Serv Read more
posted @ 2020-09-18 10:15 初入门径 Views(420) Comments(0) Recommendations(0)
Abstract: scala> val df = Seq(1.to(5).mkString(",")).toDF("number") df: org.apache.spark.sql.DataFrame = [number: string] scala> df.show(false) + + |number | + Read more
posted @ 2020-09-17 21:56 初入门径 Views(101) Comments(0) Recommendations(0)
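The `toDF` call above needs a running SparkSession; the string-building at its core can be shown with plain Scala (the object name and the split step are illustrative):

```scala
object MkStringDemo {
  // Build the same comma-separated string the abstract feeds to toDF
  val number: String = 1.to(5).mkString(",") // "1,2,3,4,5"

  // Recover the numbers by splitting the string back apart
  val parsed: List[Int] = number.split(",").map(_.toInt).toList

  def main(args: Array[String]): Unit = {
    println(number) // 1,2,3,4,5
    println(parsed) // List(1, 2, 3, 4, 5)
  }
}
```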
Abstract: /* Navicat MySQL Data Transfer Source Server : localhost Source Server Version : 50527 Source Host : localhost:3306 Source Database : mydb Target Serv Read more
posted @ 2020-09-17 21:53 初入门径 Views(147) Comments(0) Recommendations(0)
Abstract: [root@centos00 ~]$ cd hadoop-2.6.0-cdh5.14.2/ [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start namenode [root@centos00 hadoop-2.6.0 Read more
posted @ 2020-09-17 19:53 初入门径 Views(161) Comments(0) Recommendations(0)
Abstract: ① Not containing: match lines that do not contain the string "abc": ^(?!.*abc).*$ ② Not containing either of two strings: match lines that contain neither "abc" nor "efg": ^(?!.*(abc|efg)).*$ ③ Not starting with: match lines that do not start with "abc": ^(?!abc).*$ ④ Not ending with: match lines that do not end with the string "abc" Read more
posted @ 2020-09-17 18:48 初入门径 Views(173) Comments(0) Recommendations(0)
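The negative-lookahead patterns above can be exercised from Scala like this (the `matches` helper and the test strings are illustrative):

```scala
object RegexDemo {
  // Negative-lookahead patterns from the abstract
  val notContaining = "^(?!.*abc).*$".r
  val notStarting   = "^(?!abc).*$".r

  // Check whether the whole string matches the pattern
  def matches(r: scala.util.matching.Regex, s: String): Boolean =
    r.pattern.matcher(s).matches()

  def main(args: Array[String]): Unit = {
    println(matches(notContaining, "hello"))   // true:  no "abc" anywhere
    println(matches(notContaining, "xxabcxx")) // false: contains "abc"
    println(matches(notStarting, "abcdef"))    // false: starts with "abc"
    println(matches(notStarting, "xxabc"))     // true:  does not start with "abc"
  }
}
```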
Abstract: [root@centos00 ~]$ cd hadoop-2.6.0-cdh5.14.2/ [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start namenode [root@centos00 hadoop-2.6.0 Read more
posted @ 2020-09-14 12:02 初入门径 Views(2662) Comments(0) Recommendations(0)
Abstract: [root@centos00 ~]$ cd hadoop-2.6.0-cdh5.14.2/ [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start namenode [root@centos00 hadoop-2.6.0 Read more
posted @ 2020-09-14 11:48 初入门径 Views(2484) Comments(0) Recommendations(0)
Abstract: [root@centos00 ~]$ cd /opt/cdh5.14.2/spark-2.2.1-cdh5.14.2/jars [root@centos00 jars]$ pwd /opt/cdh5.14.2/spark-2.2.1-cdh5.14.2/jars [root@centos00 jar Read more
posted @ 2020-09-13 15:49 初入门径 Views(1285) Comments(0) Recommendations(1)
Abstract: [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start namenode [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start datan Read more
posted @ 2020-09-08 12:09 初入门径 Views(446) Comments(0) Recommendations(0)
Abstract: package spark.demo import org.apache.spark.sql.{DataFrame, SparkSession} import org.apache.kudu.spark.kudu._ /** * <dependency> * <groupId>org.apache. Read more
posted @ 2020-09-07 21:49 初入门径 Views(334) Comments(0) Recommendations(0)
Abstract: # cd /opt/cdh5.14.2/hadoop-2.6.0-cdh5.14.2/ # sbin/hadoop-daemon.sh start namenode # sbin/hadoop-daemon.sh start datanode # sbin/yarn-daemon.sh start Read more
posted @ 2020-09-06 10:30 初入门径 Views(1380) Comments(0) Recommendations(0)
Abstract: package spark.demo import java.sql.Timestamp object Transaction { def main(args: Array[String]): Unit = { val s: String = "2020-09-02 15:00:00" val t: Read more
posted @ 2020-09-02 13:33 初入门径 Views(4395) Comments(0) Recommendations(0)
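The abstract's conversion from the string to a `Timestamp` is cut off; one standard way, assuming the JDBC `yyyy-mm-dd hh:mm:ss` format shown, is `Timestamp.valueOf` (whether the original used this exact call is an assumption):

```scala
import java.sql.Timestamp

object Transaction {
  def main(args: Array[String]): Unit = {
    val s: String = "2020-09-02 15:00:00"
    // Timestamp.valueOf parses the JDBC escape format yyyy-mm-dd hh:mm:ss
    val t: Timestamp = Timestamp.valueOf(s)
    println(t) // 2020-09-02 15:00:00.0  (toString appends fractional seconds)
  }
}
```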
Abstract: [root@centos00 ~]$ cd /opt/cdh5.14.2/hadoop-2.6.0-cdh5.14.2/ [root@centos00 hadoop-2.6.0-cdh5.14.2]$ sbin/hadoop-daemon.sh start namenode [root@centos Read more
posted @ 2020-09-01 15:01 初入门径 Views(2573) Comments(0) Recommendations(0)
Abstract: [root@centos00 ~]# rsync -auvzr -e "ssh " install.log 192.168.255.255:/root/ The authenticity of host '192.168.255.255 (192.168.255.255)' can't be est Read more
posted @ 2020-09-01 10:39 初入门径 Views(414) Comments(0) Recommendations(0)