Cannot run multiple SparkContexts at once; existing SparkContext

I launched Jupyter with the following commands:
export SPARK_HOME="/usr/spark2.0.2/"

export PYTHONPATH=$SPARK_HOME/python/:$SPARK_HOME/python/lib/py4j-0.10.3-src.zip:$SPARK_HOME/python/lib/pyspark.zip:$PYTHONPATH

export PATH=/usr/local/anaconda/bin:$PATH

jupyter notebook --no-browser --ip 0.0.0.0 --port 8888

When I run the code below,
from pyspark import SparkContext, SparkConf
conf = SparkConf().setAppName("appName")
sc = SparkContext(conf=conf)
rdd = sc.textFile("/data/mr/wordcount/input/")
print(rdd.take(10))
sc.version

I keep getting the following error. I have already restarted the server a couple of times.


ValueError                                Traceback (most recent call last)
in <module>()
      1 from pyspark import SparkContext, SparkConf
      2 conf = SparkConf().setAppName("appName")
----> 3 sc = SparkContext(conf=conf)
      4 #rdd = sc.textFile("/data/mr/wordcount/input/")
      5 #print rdd.take(10)

/usr/spark2.0.2/python/pyspark/context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    110         """
    111         self._callsite = first_spark_call() or CallSite(None, None, None)
--> 112         SparkContext._ensure_initialized(self, gateway=gateway)
    113         try:
    114             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

/usr/spark2.0.2/python/pyspark/context.pyc in _ensure_initialized(cls, instance, gateway)
    257                         " created by %s at %s:%s "
    258                         % (currentAppName, currentMaster,
--> 259                            callsite.function, callsite.file, callsite.linenum))
    260         else:
    261             SparkContext._active_spark_context = instance

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=appName, master=local[*]) created by __init__ at :3

In case you have not found the solution yet, use the following:

sc = SparkContext.getOrCreate()

instead of 'sc = SparkContext(conf=conf)'.
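
For example, a minimal sketch using the same conf object from the question: getOrCreate() returns the SparkContext that is already running in the notebook kernel, or creates a new one from the given conf if none exists yet, so re-running the cell no longer raises the error.

from pyspark import SparkContext, SparkConf

conf = SparkConf().setAppName("appName")
# Reuse the active SparkContext if one already exists in this kernel,
# otherwise create a new one from conf.
sc = SparkContext.getOrCreate(conf=conf)
rdd = sc.textFile("/data/mr/wordcount/input/")
print(rdd.take(10))

# If you genuinely need a fresh context with new settings,
# stop the existing one first and then create it again:
# sc.stop()
# sc = SparkContext(conf=conf)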