Importing data into Hive, cleaning it, exporting to MySQL, and displaying it with ECharts

1  Importing data into Hive

  First create a table in Hive; its column names must line up with the data you are about to load.

create table test333(ip string, itime string, day string, traffic bigint, type string, id string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

      Then load the local data into Hive.

load data local inpath '/opt/software/result2.csv' overwrite into table test333;
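To confirm the load, you can run a quick count through HiveServer2 from Java. A minimal sketch, assuming HiveServer2 listens on hadoop102:10000 and the hive-jdbc driver is on the classpath (the HiveLoadCheck class name is just for illustration):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveLoadCheck {
    public static void main(String[] args) throws Exception {
        // Assumes HiveServer2 is on hadoop102:10000 (default port) and hive-jdbc is available
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://hadoop102:10000/default", "root", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM test333")) {
            if (rs.next()) {
                System.out.println("rows loaded: " + rs.getLong(1));
            }
        }
    }
}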

2  Cleaning the data in Hive (you can also clean the data first and then import it)

insert overwrite table data
select ip,
       from_unixtime(to_unix_timestamp(`time`,'dd/MMM/yyyy:HH:mm:ss Z')) as `time`,
       day,
       traffic,
       type,
       id
from data;
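The from_unixtime(to_unix_timestamp(...)) pair rewrites Apache-log timestamps into the standard yyyy-MM-dd HH:mm:ss form. The same pattern can be tried out in plain Java; a standalone sketch with an illustrative sample timestamp (not data from the project):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class TimeFormatDemo {
    public static void main(String[] args) throws Exception {
        // The same pattern Hive uses above; Locale.ENGLISH is needed for month abbreviations like "Nov"
        SimpleDateFormat in = new SimpleDateFormat("dd/MMM/yyyy:HH:mm:ss Z", Locale.ENGLISH);
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        Date d = in.parse("10/Nov/2016:00:01:02 +0800");
        System.out.println(out.format(d));  // output depends on the local time zone
    }
}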

3  Exporting the Hive data to MySQL with Sqoop

bin/sqoop export \
--connect jdbc:mysql://hadoop102:3306/company \
--username root \
--password 001224 \
--table business \
--num-mappers 1 \
--export-dir /user/hive/warehouse/csv3 \
--input-fields-terminated-by "," \
--driver com.mysql.jdbc.Driver
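After the export it is worth sanity-checking the row count on the MySQL side. A minimal JDBC sketch reusing the connection settings from the Sqoop command above (the ExportCheck class name is just for illustration):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExportCheck {
    public static void main(String[] args) throws Exception {
        // Same JDBC URL, user, and password as the Sqoop export above
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://hadoop102:3306/company", "root", "001224");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM business")) {
            if (rs.next()) {
                System.out.println("rows exported: " + rs.getLong(1));
            }
        }
    }
}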

4  Querying MySQL over JDBC and returning the results to ECharts

import java.io.IOException;
import java.sql.SQLException;
import java.util.ArrayList;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.alibaba.fastjson.JSON;  // assumes Alibaba fastjson is on the classpath

    // Queries MySQL through the DAO and writes the result back as JSON for ECharts
    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        response.setContentType("application/json;charset=utf-8");
        request.setCharacterEncoding("utf-8");
        ArrayList<city> book = new ArrayList<city>();
        select dao = new select();
        try {
            // Fill the list with the most popular IPs from MySQL
            dao.mostPopularIP(book);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        // Serialize to JSON; the ECharts front end consumes this response
        String json = JSON.toJSONString(book);
        response.getWriter().write(json);
    }
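The select DAO and the city bean used above are not shown in the post. A plausible sketch of what they might look like, assuming the MySQL table business keeps the ip and traffic columns from step 1 (the SQL and field names are assumptions, not the author's actual code):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;

public class select {
    // Fills the list with the IPs that generated the most traffic; SQL and columns are assumptions
    public void mostPopularIP(ArrayList<city> book) throws ClassNotFoundException, SQLException {
        Class.forName("com.mysql.jdbc.Driver");
        String sql = "SELECT ip, SUM(traffic) AS total FROM business GROUP BY ip ORDER BY total DESC LIMIT 10";
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://hadoop102:3306/company", "root", "001224");
             PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                book.add(new city(rs.getString("ip"), rs.getLong("total")));
            }
        }
    }
}

// Simple bean consumed by JSON.toJSONString; getter names become the JSON keys ECharts reads
public class city {
    private String ip;
    private long total;
    public city(String ip, long total) { this.ip = ip; this.total = total; }
    public String getIp() { return ip; }
    public long getTotal() { return total; }
}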

 
