Keep a Spark Streaming job running: a cron watchdog that restarts the process if it goes down

Cron job: run the script every minute to check whether the process is still running. Edit the crontab with crontab -e:

*/1 * * * * bash /data/spark/test.sh
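One caveat with a one-minute interval: if a restart takes longer than a minute, the next cron run can fire while the previous one is still working and start the job twice. Wrapping the script in flock -n makes the second invocation bail out instead of overlapping, e.g. `*/1 * * * * flock -n /tmp/spark_watchdog.lock bash /data/spark/test.sh` (the lock path here is illustrative, not from the original post). A minimal sketch of the behavior:

```shell
#!/bin/sh
# Sketch: flock -n fails immediately if another process already holds
# the lock, so a slow watchdog run cannot overlap the next cron run.
# /tmp/watchdog_demo.lock is an illustrative path.
LOCK=/tmp/watchdog_demo.lock

exec 9>"$LOCK"            # open fd 9 on the lock file
if flock -n 9; then       # -n: non-blocking, fail instead of waiting
    status="lock acquired"
else
    status="another instance is running"
fi
echo "$status"
```

Since nothing else holds the lock here, this prints "lock acquired"; a second copy started while the first still holds fd 9 would take the else branch.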

The script checks the process and, if it has died, restarts the Spark job. First give the script execute permission: chmod 777 xx.sh (chmod +x is sufficient and safer):

#!/bin/sh
# Count running LbsStreamingEngineTJ processes; grep -v grep excludes
# the grep command itself from the match.
is_Engine_exist=$(ps aux | grep LbsStreamingEngineTJ | grep -v grep | wc -l)

if [ "$is_Engine_exist" -eq 0 ]; then
        echo 'Process Engine is down'
        echo 'Bring Engine up'
        strDate=$(date +%Y%m%d%H%M%S)
        strStart="start Engine ${strDate}"
        echo "${strStart}" >> /data1/log.txt
        # Relaunch the streaming job in the background; nohup keeps it
        # alive after the cron shell exits.
        nohup /data1/spark-1.6.0/bin/spark-submit --master spark://localhost:7077 --name LbsStreamingEngineTJ --class com.datafactory.streaming.LbsStreamingEngineTJ --executor-memory 512m --total-executor-cores 2 /data1/work/datafactory-0.1.0-SNAPSHOT1023.jar &
        echo 'Bring Engine finished'
else
        strDate=$(date +%Y%m%d%H%M%S)
        strRun="running ${strDate}"
        echo "${strRun}" >> /data1/log.txt
fi
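As an aside, the ps | grep | grep -v grep pipeline can be replaced with pgrep -f, which matches against each process's full command line and never matches itself, so the grep -v step disappears. A minimal sketch using the same process name as above:

```shell
#!/bin/sh
# Sketch: pgrep -f matches the full command line; -c prints the match
# count (it prints 0 and exits non-zero when nothing matches).
count=$(pgrep -cf "LbsStreamingEngineTJ")

if [ "$count" -eq 0 ]; then
    echo 'Process Engine is down'
    # ...restart logic would go here, as in the script above...
else
    echo "Engine running: ${count} instance(s)"
fi
```

Behavior is the same as the wc -l version; the restart branch would carry over unchanged.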

 

posted @ 2018-11-14 12:13  Bread_Wang