A few bugs hit while installing Spark on CDH
Starting Spark after installation:
[zdwy@master spark]$ sbin/start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
failed to launch org.apache.spark.deploy.master.Master:
  	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  	... 7 more
full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
slave1: failed to launch org.apache.spark.deploy.worker.Worker:
slave1:   	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
slave1:   	... 6 more
slave1: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: failed to launch org.apache.spark.deploy.worker.Worker:
slave2:   	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
slave2:   	... 6 more
slave2: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
Cause: the jar packages Spark needs to communicate with Hadoop are missing.
Solution: download three jars (jackson-core-xxx.jar, jackson-annotations-xxx.jar, jackson-databind-xxx.jar) from http://mvnrepository.com/artifact/com.fasterxml.jackson.core/.
After downloading, put the jars into the hadoop/share/hadoop/common/ directory, then restart Spark.
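The steps above can be sketched as a small shell script. Note this is a sketch under assumptions: the post leaves the Jackson version as "xxx", so JACKSON_VERSION below is a placeholder you must replace with the version your CDH/Hadoop build expects, and HADOOP_COMMON must point at your own installation. The download URLs follow the standard Maven Central directory layout.

```shell
# Assumptions: JACKSON_VERSION and HADOOP_COMMON are placeholders --
# set them to match your cluster before running.
JACKSON_VERSION=2.2.3
HADOOP_COMMON=/home/zdwy/cdh5.9/hadoop/share/hadoop/common
BASE_URL=https://repo1.maven.org/maven2/com/fasterxml/jackson/core

for artifact in jackson-core jackson-annotations jackson-databind; do
  jar="${artifact}-${JACKSON_VERSION}.jar"
  url="${BASE_URL}/${artifact}/${JACKSON_VERSION}/${jar}"
  echo "would fetch ${url} into ${HADOOP_COMMON}"
  # Uncomment on a machine with network access to actually download:
  # wget -q "${url}" -P "${HADOOP_COMMON}"
done
```

After the jars are in place, restart the cluster with sbin/stop-all.sh followed by sbin/start-all.sh so the Master and Workers pick up the new classpath.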