Importing into HBase 02 - Spark

Related: Importing into HBase 01 (导入hbase01)

This post covers inserting and bulk-inserting data into HBase from Spark.

Bulk insert: use the BulkLoad method; 导入hbase01 (Importing into HBase 01) walks through it in detail, and a minimal sketch also follows the insert example below.

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapred.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.SparkContext
import org.slf4j.LoggerFactory

/**
  * Save data to HBase (single-put path).
  **/
object TestHbaseDemo {
  val logger = LoggerFactory.getLogger(this.getClass)

  def insert(sc: SparkContext): Unit = {
    val tableName = "test"

    // HBase / ZooKeeper connection settings
    val hbaseConf = HBaseConfiguration.create()
    val jobConf = new JobConf(hbaseConf)
    jobConf.set("hbase.zookeeper.property.clientPort", "2181")
    jobConf.set("hbase.zookeeper.quorum", "bigdata1")
    jobConf.set("zookeeper.znode.parent", "/hbase")
    jobConf.setOutputFormat(classOf[TableOutputFormat])
    jobConf.set(TableOutputFormat.OUTPUT_TABLE, tableName)

    // Example records (rowkey -> value). The original post truncates here;
    // the sample rows and the "cf"/"col" names below are assumed for illustration.
    val pairs = sc.parallelize(Seq(("row1", "value1"), ("row2", "value2")))
      .map { case (rowKey, value) =>
        val put = new Put(Bytes.toBytes(rowKey))
        put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
        (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), put)
      }

    // Write the RDD to HBase through the old-API TableOutputFormat
    pairs.saveAsHadoopDataset(jobConf)
  }
}
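
For the bulk-insert path mentioned above, 导入hbase01 has the full walkthrough; the following is only a minimal sketch of the BulkLoad approach, assuming HBase 1.x APIs, a table named "test" with column family "cf", sample rows, and a staging HDFS directory "/tmp/hfiles" (all of these names are illustrative, not from the original post). It writes HFiles with HFileOutputFormat2 and then hands them to the region servers with LoadIncrementalHFiles.

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, KeyValue, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.{HFileOutputFormat2, LoadIncrementalHFiles}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.SparkContext

object TestHbaseBulkLoad {
  def bulkLoad(sc: SparkContext): Unit = {
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set("hbase.zookeeper.quorum", "bigdata1")

    val connection = ConnectionFactory.createConnection(conf)
    val table = connection.getTable(TableName.valueOf("test"))
    val regionLocator = connection.getRegionLocator(TableName.valueOf("test"))

    // Rows must be sorted by row key before HFiles are written
    // (sample rows and the "cf"/"col" names are assumed for illustration)
    val rdd = sc.parallelize(Seq(("row1", "value1"), ("row2", "value2")))
      .sortByKey()
      .map { case (rowKey, value) =>
        val kv = new KeyValue(Bytes.toBytes(rowKey), Bytes.toBytes("cf"),
          Bytes.toBytes("col"), Bytes.toBytes(value))
        (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), kv)
      }

    // Let HBase configure partitioning/compression to match the target table
    val job = Job.getInstance(conf)
    HFileOutputFormat2.configureIncrementalLoad(job, table, regionLocator)

    // Write HFiles to a staging directory, then bulk-load them into the table
    val hfilePath = "/tmp/hfiles"
    rdd.saveAsNewAPIHadoopFile(hfilePath, classOf[ImmutableBytesWritable],
      classOf[KeyValue], classOf[HFileOutputFormat2], job.getConfiguration)
    new LoadIncrementalHFiles(conf)
      .doBulkLoad(new Path(hfilePath), connection.getAdmin, table, regionLocator)

    table.close()
    connection.close()
  }
}

Compared with saveAsHadoopDataset, BulkLoad bypasses the normal write path (WAL and MemStore), so it is usually the better choice for large one-off loads.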
