PySpark - Running on Yarn mode in Jupyter


I am trying PySpark scripts in a Jupyter notebook. How can I run a PySpark script on top of YARN?
Basically, what should I use in place of "local" in the statement below?

conf = SparkConf().setMaster("local").setAppName("TemperatureFilter")

Hi @senthiljdpm,

To run Spark on YARN, replace "local" with "yarn" in the statement above.
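
Applied to your statement, the change looks like this (a minimal sketch; it assumes pyspark is installed and that HADOOP_CONF_DIR or YARN_CONF_DIR points at your cluster's configuration files, since Spark reads those to find the YARN ResourceManager):

```python
from pyspark import SparkConf, SparkContext

# "yarn" submits the application to the YARN ResourceManager instead of
# running everything locally. Spark locates the cluster through the
# Hadoop configuration referenced by HADOOP_CONF_DIR / YARN_CONF_DIR.
conf = SparkConf().setMaster("yarn").setAppName("TemperatureFilter")
sc = SparkContext(conf=conf)
```

By default this runs in client deploy mode, which is what you want from a Jupyter notebook so the driver (and your notebook session) stays on the machine where the kernel runs.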

Alternatively, use the command below to launch PySpark on top of YARN from the command line:

pyspark --master yarn

Hope this helps.