Spark Context

In a typical distributed application there is a driver program that controls the execution, plus one or more worker nodes. The driver program splits the work into tasks and allocates those tasks to the appropriate workers.
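As a rough illustration of this driver/worker split (a minimal stdlib sketch, not Spark itself — all names here are invented for the example; in a real cluster the workers are separate machines rather than threads):

```python
from concurrent.futures import ThreadPoolExecutor

def task(chunk):
    # Work executed on a "worker": a partial sum over one chunk of the data.
    return sum(chunk)

def driver(data, num_workers=2):
    # The "driver" partitions the data and allocates one task per partition.
    chunks = [data[i::num_workers] for i in range(num_workers)]
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        partials = pool.map(task, chunks)
    # The driver then collects the partial results into a final answer.
    return sum(partials)

print(driver(list(range(10))))  # 0 + 1 + ... + 9 = 45
```

In Spark, this coordination role is what the SparkContext in the driver program performs: it connects to the cluster manager, which in turn provides the worker resources.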

[Figure: Spark]

[Figure: Spark run with YARN]

[Figure: Spark run with Mesos]
posted @ 2017-04-09 11:04 by ordi