Spark supports several resource/cluster managers, such as Hadoop YARN, Amazon EC2, and its own Standalone mode. For the Standalone mode (i.e., in place of YARN), which operating systems are supported to run Spark? Please describe the workflow as well.
Spark can run in several deployment modes; please see the answer in the thread linked below:
5. On which all platform can Apache Spark run?
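To the Standalone question specifically: Spark's Standalone cluster manager runs on any operating system with a supported JVM, so Linux, macOS, and Windows all work (Linux is the most common choice for production clusters). A minimal sketch of the Standalone workflow, assuming Spark is unpacked at `$SPARK_HOME` and the master runs on a hypothetical host named `master-host`:

```shell
# 1. Start the standalone master on the master node.
#    It logs a URL of the form spark://master-host:7077.
$SPARK_HOME/sbin/start-master.sh

# 2. On each worker node, start a worker and point it at the master URL.
$SPARK_HOME/sbin/start-worker.sh spark://master-host:7077

# 3. Submit an application to the standalone cluster.
#    The example jar path below is illustrative; the exact filename
#    depends on your Spark and Scala versions.
$SPARK_HOME/bin/spark-submit \
  --master spark://master-host:7077 \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100
```

The master and workers can also be listed in `$SPARK_HOME/conf/workers` and launched together with `sbin/start-all.sh`; the running cluster is visible in the master's web UI on port 8080 by default.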