July 2021 Archive

Abstract: Importing data from MySQL into Hive with Sqoop, with automatic table creation. sqoop import \ --connect 'jdbc:mysql://xxx.xxx.xxx.xx:3306/database?useUnicode=true&characterEncoding=utf8&zeroDateTimeBehavi Read more
posted @ 2021-07-15 09:08 nohert Views(109) Comments(0) Likes(0)
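The teaser above describes a Sqoop import from MySQL into Hive that creates the Hive table automatically. Below is a minimal sketch of such a command; the host, credentials, and table names are placeholders, not the author's actual settings, and the script only prints the command for review since executing it needs a live cluster.

```shell
# Compose a hypothetical sqoop import command (MySQL -> Hive, auto-create table).
# Every connection detail below is a placeholder.
SQOOP_CMD=$(cat <<'EOF'
sqoop import \
  --connect 'jdbc:mysql://xxx.xxx.xxx.xx:3306/database?useUnicode=true&characterEncoding=utf8' \
  --username root \
  --password '******' \
  --table source_table \
  --hive-import \
  --create-hive-table \
  --hive-table target_db.target_table \
  --num-mappers 1
EOF
)
# Print the composed command instead of running it.
echo "$SQOOP_CMD"
```

The `--create-hive-table` flag is what makes Sqoop build the Hive table from the MySQL schema instead of requiring it to exist beforehand.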
Abstract: A Sqoop export from Hive to MySQL fails with: failed with state FAILED due to: Task failed. Cause: some field values in the Hive data exceeded the column length defined in MySQL, hence the error. Fix: yarn logs -applicationId applicat Read more
posted @ 2021-07-12 15:49 nohert Views(586) Comments(0) Likes(0)
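As the post above notes, the generic "Task failed" message from a Sqoop export hides the real cause; the underlying MySQL error (here, a value too long for its column) appears in the aggregated YARN container logs. A sketch of pulling those logs follows; the application ID is a placeholder, and the command is printed rather than executed since it needs a running cluster.

```shell
# Compose the yarn logs command for a failed application.
# The application ID is a placeholder -- take the real one from the job output.
APP_ID='application_0000000000000_0001'
YARN_CMD="yarn logs -applicationId $APP_ID"
echo "$YARN_CMD"
# On a real cluster you would filter for the truncation error, e.g.:
#   yarn logs -applicationId "$APP_ID" | grep -i 'Data truncation'
# and then widen the MySQL column (ALTER TABLE ... MODIFY ...) to fit the data.
```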
Abstract: package com.lezhi.business.dxxbs.transmission.table import com.lezhi.common.{CommonTransmissonFunciton, SystemParams} import org.apache.flink.streamin Read more
posted @ 2021-07-12 11:30 nohert Views(1661) Comments(0) Likes(0)
Abstract: Today, while running an insert statement, Hive reported ERROR : Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied: user=hive Read more
posted @ 2021-07-09 15:06 nohert Views(702) Comments(0) Likes(0)
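The post above hits an HDFS AccessControlException: the `hive` user lacks write permission on a directory. The teaser does not show the author's exact fix, so the sketch below shows two common remedies with placeholder paths and users; the command is printed for review rather than executed.

```shell
# Common remedy 1: give the hive user ownership of the target warehouse path.
# The path and owner below are placeholders, not taken from the post.
WAREHOUSE_DIR='/user/hive/warehouse/mydb.db/mytable'
FIX_CMD="hdfs dfs -chown -R hive:hive $WAREHOUSE_DIR"
echo "$FIX_CMD"
# Common remedy 2: run the job as a user that already owns the directory:
#   export HADOOP_USER_NAME=hdfs
```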
Abstract: Problem 1: submitting a Flink job with the command flink run -c com.lezhi.business.dxxbs.transmission.ExecuteDML /data/jar/gkt-bigData-flink-1.0-SNAPSHOT-jar-with-dependencies.jar --i Read more
posted @ 2021-07-09 15:02 nohert Views(3906) Comments(0) Likes(0)
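The submission command in the teaser above uses `flink run -c <entry-class> <jar>`: `-c` names the main class inside a fat jar, and any program arguments follow the jar path (the post's own arguments are truncated to `--i`). A sketch, printed rather than executed since it needs a Flink cluster:

```shell
# Compose a flink run command with an explicit entry class.
# Class and jar path are taken from the teaser; program arguments would follow the jar.
FLINK_CMD='flink run -c com.lezhi.business.dxxbs.transmission.ExecuteDML /data/jar/gkt-bigData-flink-1.0-SNAPSHOT-jar-with-dependencies.jar'
echo "$FLINK_CMD"
```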
Abstract: flink actions are "run", "list", "info", "savepoint", "stop", or "cancel". Specify the version option (-v or --version) to print Flink version. Specify Read more
posted @ 2021-07-07 19:00 nohert Views(1483) Comments(0) Likes(0)