Troubleshooting common Spark problems

1. Eclipse reports java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries
Download hadoop-common-2.2.0-bin-master.zip from https://github.com/srccodes/hadoop-common-2.2.0-bin, extract it, and set the HADOOP_HOME environment variable to the extracted directory, as sketched below.
Reference: http://my.oschina.net/cloudcoder/blog/286234
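A minimal sketch of the workaround on Windows; D:\hadoop-common-2.2.0-bin-master is only a placeholder for wherever the archive was extracted:

set HADOOP_HOME=D:\hadoop-common-2.2.0-bin-master

When launching directly from Eclipse, the same variable can instead be added under Run Configurations > Environment so the JVM started by Eclipse sees it.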
2. Eclipse reports ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.

Check spark-hadoop-org.apache.spark.deploy.master.Master-1-h1.out in Spark's logs directory to see the address the master actually registered with, then submit against that address:
./bin/spark-submit --class org.apache.spark.examples.SparkPi  --master spark://h1:7077  --executor-memory 3000m  --total-executor-cores 100  ./lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000
Use the hostname (here h1) rather than the raw IP address in the --master URL; the URL must match the address the master logged at startup. If the client machine cannot resolve the hostname, add a hosts entry as sketched below.
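A sketch of the hosts entry, assuming the master h1 runs at 192.168.1.100 (a hypothetical address used only for illustration); on Linux edit /etc/hosts, on a Windows machine running Eclipse edit C:\Windows\System32\drivers\etc\hosts:

192.168.1.100   h1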
3. Spark throws java.lang.StackOverflowError
a. Check for infinite loops.
b. Check for overly deep recursion.
c. Increase the JVM thread stack size, e.g. -Xss10m; see the submit command sketched after this list for where to pass it.
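A sketch of passing the larger stack size to both the driver and executor JVMs; --driver-java-options and spark.executor.extraJavaOptions are standard spark-submit/Spark options, and -Xss10m is just the value from item c:

./bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://h1:7077 \
  --driver-java-options "-Xss10m" \
  --conf "spark.executor.extraJavaOptions=-Xss10m" \
  ./lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000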
4. WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

This usually means the job requests more memory or cores than the registered workers can offer. Check the master web UI to confirm the workers are registered, and size the request to fit; for example, ask for 3000m of executor memory only if each worker advertises at least that much. A corrected submit command is sketched below.
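A sketch under the assumption that each worker offers 4 GB and 4 cores (check the master web UI, by default http://h1:8080, for the real numbers); the point is that --executor-memory and --total-executor-cores must stay within what the workers advertise:

./bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://h1:7077 \
  --executor-memory 3000m \
  --total-executor-cores 4 \
  ./lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000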
5. Cannot change version of project facet Dynamic Web Module to 2.5
Reference: http://blog.csdn.net/steveguoshao/article/details/38414145

6. Exception in thread "main" java.lang.NoSuchFieldError: DEF_CONTENT_CHARSET

Two versions of httpcore are on the classpath; remove one of them from the pom. The sketch below shows one way to find which dependencies pull them in.
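A way to locate the duplicate, assuming a Maven project; dependency:tree is a standard Maven goal and the -Dincludes filter limits the output to the httpcore artifact:

mvn dependency:tree -Dincludes=org.apache.httpcomponents:httpcore

Once the dependency that drags in the extra copy is identified, add an exclusion for org.apache.httpcomponents:httpcore under it in pom.xml (or pin a single httpcore version in dependencyManagement) so only one version remains on the classpath.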
