Problems encountered while installing Spark

1. Starting Spark SQL fails with the error:

   Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException:

The specified datastore driver ("com.mysql.jdbc.Driver ") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
Solution

Add the following line to $SPARK_HOME/conf/spark-env.sh:

export SPARK_CLASSPATH=$HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar

(Adjust the jar path to match the MySQL connector version actually present under $HIVE_HOME/lib.)
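Note that SPARK_CLASSPATH has been deprecated since Spark 1.0; on newer versions an alternative (sketched here, with the jar path being an example to adapt) is to pass the connector jar explicitly when launching Spark SQL:

```shell
# Alternative for newer Spark versions: supply the MySQL connector jar
# at launch time instead of via SPARK_CLASSPATH.
# The jar path is an example; match it to your installed connector version.
$SPARK_HOME/bin/spark-sql \
  --jars $HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar \
  --driver-class-path $HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar
```

Equivalently, spark.driver.extraClassPath and spark.executor.extraClassPath can be set in spark-defaults.conf so the jar is picked up for every session.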
posted @ 2019-12-16 23:49 阿豪吖