Summary: 1. Set SCALA_VERSION. 2. Source conf/spark-env.sh. 3. Initialize CLASSPATH to empty. 4. If assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar exists, append [core|repl|mllib|bagel|graphx|streaming]/target/scala-$SCALA_VERSION/classes and assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar to CLASSPATH; if it does not exist, check for the RELEASE directory and, if pres…
posted @ 2014-03-26 08:13 飞天虎
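The steps above apparently walk through Spark's classpath-setup script of that era (likely bin/compute-classpath.sh). Below is a minimal bash sketch of that branching, not the verbatim script: SCALA_VERSION, the module list, and the deps-jar glob come from the summary; the RELEASE fallback (an assembly jar under jars/) is an assumption, since the summary is truncated at that branch.

```bash
#!/usr/bin/env bash
# Sketch of the classpath logic summarized above; assumes SPARK_HOME
# points at a Spark checkout or unpacked release.

SCALA_VERSION=2.10
SPARK_HOME="${SPARK_HOME:-$(cd "$(dirname "$0")/.."; pwd)}"

# Step 2: load user overrides if present.
[ -f "$SPARK_HOME/conf/spark-env.sh" ] && . "$SPARK_HOME/conf/spark-env.sh"

# Step 3: start from an empty classpath.
CLASSPATH=""

# Step 4: a *-deps.jar under assembly/target signals a development build.
DEPS_JAR=$(ls "$SPARK_HOME"/assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar 2>/dev/null | head -n 1)

if [ -n "$DEPS_JAR" ]; then
  # Add each module's compiled classes plus the dependencies-only jar.
  for module in core repl mllib bagel graphx streaming; do
    CLASSPATH="$CLASSPATH:$SPARK_HOME/$module/target/scala-$SCALA_VERSION/classes"
  done
  CLASSPATH="$CLASSPATH:$DEPS_JAR"
elif [ -e "$SPARK_HOME/RELEASE" ]; then
  # The summary is truncated here; a packaged release shipping its
  # assembly jar under jars/ is an assumption about the missing branch.
  CLASSPATH="$CLASSPATH:$(ls "$SPARK_HOME"/jars/spark-assembly*.jar 2>/dev/null | head -n 1)"
fi

echo "${CLASSPATH#:}"   # strip the leading ':' before printing
```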
Summary: 1. Check whether the environment is Cygwin. 2. Set SCALA_VERSION. 3. Set SPARK_HOME. 4. Source conf/spark-env.sh. 5. If the class being launched is org.apache.spark.deploy.master.Master or org.apache.spark.deploy.worker.Worker, set SPARK_MEM=${SPARK_DAEMON_MEMORY:-512m}, SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS -Dspark.akka.logLifecycleEvents=true", and OUR_JAVA_O…
posted @ 2014-03-26 00:01 飞天虎
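These steps match the bin/spark-class launcher of the same Spark era. Below is a hedged bash sketch of the daemon branch in step 5; the completion of the truncated OUR_JAVA_O… as OUR_JAVA_OPTS, and the non-daemon fallback to SPARK_JAVA_OPTS, are assumptions rather than quotes from the post.

```bash
#!/usr/bin/env bash
# Sketch of the daemon-memory branch summarized above; not the verbatim
# bin/spark-class. $1 is assumed to be the fully qualified class to run.

# Step 1: detect Cygwin (the real script converts paths for the JVM).
cygwin=false
case "$(uname)" in
  CYGWIN*) cygwin=true ;;
esac

# Steps 2-4.
SCALA_VERSION=2.10
SPARK_HOME="${SPARK_HOME:-$(cd "$(dirname "$0")/.."; pwd)}"
[ -f "$SPARK_HOME/conf/spark-env.sh" ] && . "$SPARK_HOME/conf/spark-env.sh"

# Step 5: standalone master/worker daemons get their own memory and opts.
case "$1" in
  org.apache.spark.deploy.master.Master | org.apache.spark.deploy.worker.Worker)
    SPARK_MEM=${SPARK_DAEMON_MEMORY:-512m}
    SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS -Dspark.akka.logLifecycleEvents=true"
    OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS"   # assumed completion of OUR_JAVA_O…
    ;;
  *)
    OUR_JAVA_OPTS="$SPARK_JAVA_OPTS"          # assumed non-daemon fallback
    ;;
esac

echo "class=$1 mem=${SPARK_MEM:-unset} opts=$OUR_JAVA_OPTS"
```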