java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString

This error shows up when a Hadoop MapReduce or Spark job accesses HBase.
It is a known HBase issue, tracked in JIRA: https://issues.apache.org/jira/browse/HBASE-10304. HBaseZeroCopyByteString (shipped in the hbase-protocol jar) lives in the com.google.protobuf package so that it can extend the package-private class LiteralByteString from protobuf-java; when the two classes end up being loaded by different classloaders they are no longer in the same runtime package, so the package-private access check fails with this IllegalAccessError.
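
The error typically surfaces on the first client call that builds a protobuf request, such as a scan or an HBaseAdmin metadata check. A minimal sketch of code that can trigger it when the classpath problem is present (Scala, HBase 0.98 client API assumed; "my_table" is a placeholder name, not taken from the original job):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.{HBaseAdmin, HTable, Scan}

    object HBaseScanSketch {
      def main(args: Array[String]): Unit = {
        val conf = HBaseConfiguration.create()      // reads hbase-site.xml from the classpath
        val admin = new HBaseAdmin(conf)
        // isTableAvailable() scans the META table, which builds protobuf requests via
        // RequestConverter -- the point where the IllegalAccessError is thrown
        // when hbase-protocol is not on the cluster-side classpath.
        if (admin.isTableAvailable("my_table")) {
          val table = new HTable(conf, "my_table")
          val scanner = table.getScanner(new Scan())
          scanner.close()
          table.close()
        }
        admin.close()
      }
    }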

Error message:

15/08/17 19:28:33 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString 
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString 
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:210) 
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:121) 
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90) 
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:264) 
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:169) 
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:164) 
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:107) 
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:736) 
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:178) 
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:82) 
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.isTableAvailable(HConnectionManager.java:962) 
        at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1081) 
        at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1089) 
        at com.umeng.dp.yuliang.play.HBaseToES$.main(HBaseToES.scala:28) 
        at com.umeng.dp.yuliang.play.HBaseToES.main(HBaseToES.scala) 
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
        at java.lang.reflect.Method.invoke(Method.java:606) 
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483) 
Caused by: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString 
        at java.lang.ClassLoader.defineClass1(Native Method) 
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800) 
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) 
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449) 
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71) 
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361) 
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
        at java.security.AccessController.doPrivileged(Native Method) 
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:930) 
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildScanRequest(RequestConverter.java:434) 
        at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:297) 
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:157) 
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:57) 
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114) 
        ... 18 more 

Hadoop YARN solutions:

  1. Export HADOOP_CLASSPATH when submitting the job:

    $ export HADOOP_CLASSPATH="/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar"
    $ ./hadoop-2.2.0/bin/hadoop --config /home/stack/conf_hadoop/ jar ./hbase/hbase-assembly/target/hbase-0.99.0-SNAPSHOT-job.jar  org.apache.hadoop.hbase.mapreduce.RowCounter usertable
  2. Add HADOOP_CLASSPATH to the Linux environment
    Add the following line to .bashrc, .bash_profile, or profile so that it is always set in the environment:
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar

Spark solutions:

  1. Set the classpath when submitting the job
    Pass spark.driver.extraClassPath and spark.executor.extraClassPath via --conf:

    spark-submit --class com.umeng.dp.yuliang.play.HBaseToES --master yarn-cluster --conf "spark.driver.extraClassPath=/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar" --conf "spark.executor.extraClassPath=/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar"   --jars /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar ScalaMR-0.0.1-jar-with-dependencies.jar 
  2. Add the following settings to $SPARK_HOME/conf/spark-defaults.conf:

    spark.driver.extraClassPath /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar
    spark.executor.extraClassPath /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar
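
After applying either fix, it can help to verify which jar each of the two conflicting classes is actually loaded from, on the node (or inside the job) where the error occurred. The following diagnostic sketch is not from the original post; it only uses standard reflection with the class names taken from the stack trace above:

    // Print the code source of the two conflicting protobuf classes. If loading
    // HBaseZeroCopyByteString itself fails (the IllegalAccessError), the failure
    // is printed instead of a jar location.
    object ProtobufClasspathCheck {
      def main(args: Array[String]): Unit = {
        Seq("com.google.protobuf.LiteralByteString",
            "com.google.protobuf.HBaseZeroCopyByteString").foreach { name =>
          val location =
            try Option(Class.forName(name).getProtectionDomain.getCodeSource)
                  .map(cs => String.valueOf(cs.getLocation))
                  .getOrElse("(no code source)")
            catch { case t: Throwable => s"failed to load: $t" }
          println(s"$name -> $location")
        }
      }
    }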
    

Spark mailing list thread:
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-IllegalAccessError-class-com-google-protobuf-HBaseZeroCopyByteString-cannot-access-its-supg-tc24303.html

