Getting an error while launching a Spark 1.6 spark-submit job on YARN

I am trying to execute a simple PySpark script (below) on YARN, but every attempt so far has failed.

from pyspark import SparkContext, SparkConf
import sys

def main(spctx, inputfile, outloc):
    # split each CSV line, count records per vendor id (first column), sort by key, and save
    tripdata = spctx.textFile(inputfile).map(lambda line: line.split(','))
    vendor = tripdata.map(lambda x: (x[0], 1)).reduceByKey(lambda x, y: x + y).sortByKey(ascending=True)
    vendor.saveAsTextFile(outloc)

if __name__ == "__main__":
    for i in range(len(sys.argv)):
        print('{} : {}'.format(i, sys.argv[i]))
    if len(sys.argv) < 3:
        print('ERROR: missing input and output paths!!')
        sys.exit(1)

    indata = sys.argv[1]
    outdata = sys.argv[2]
    print('input data: {} and out data: {}'.format(indata, outdata))
    spcnf = SparkConf().setAppName('YellowTrip_Vendor_Filter').setMaster('yarn-client')
    spctx = SparkContext(conf=spcnf)
    spctx.setLogLevel("ERROR")
    main(spctx, indata, outdata)
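As far as I understand, a master set programmatically on SparkConf takes precedence over the --master flag passed to spark-submit, so if it is relevant, the configuration could also be built without hard-coding the master (a rough sketch of that variant, not what I currently run):

# sketch: leave the master out of SparkConf so that spark-submit's --master flag decides it
spcnf = SparkConf().setAppName('YellowTrip_Vendor_Filter')
spctx = SparkContext(conf=spcnf)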

This is how I am submitting the script:

/usr/spark1.6/bin/spark-submit --master yarn-client \
    --num-executors 2 --executor-memory 1G --executor-cores 2 \
    --driver-memory 1G \
    --conf spark.hadoop.yarn.timeline-service.enabled=false \
    /home/xxxxxxxx_13516/bigformal.py hdfs:///user/xxxxxxx_13516/saavn_sample_data.txt sparkout
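I am not sure whether the Spark assembly jar is reaching the YARN containers; from what I have read, on Spark 1.6 its location can be set explicitly via spark.yarn.jar, roughly like this (the HDFS path below is only a placeholder, not a path that exists on my cluster):

/usr/spark1.6/bin/spark-submit --master yarn-client \
    --conf spark.yarn.jar=hdfs:///user/spark/share/lib/spark-assembly.jar \
    ... (same remaining options and arguments as above)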

Every attempt ends in an error. When I checked the YARN logs for the applicationId, they show:

Container: container_e137_1527045214830_5262_01_000001 on ip-172-31-20-58.ec2.internal_45454

LogType:stderr
Log Upload Time:Fri Jun 22 09:14:20 +0000 2018
LogLength:87
Log Contents:
Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher
End of LogType:stderr

Does anyone know what is causing this? I would really appreciate it if someone could shed some light on this problem.

Tarun