PySpark Error in Console

[sarithadsr217850@cxln4 ~]$ pyspark
SPARK_MAJOR_VERSION is set to 1, using Spark
File "/bin/hdp-select", line 232
print "ERROR: Invalid package - " + name
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("ERROR: Invalid package - " + name)?
Fatal Python error: Py_Initialize: can't initialize sys standard streams
Traceback (most recent call last):
File "/usr/local/anaconda/lib/python3.6/io.py", line 52, in
File "/home/sarithadsr217850/abc.py", line 2, in
File "/usr/hdp/current/spark-client/python/pyspark/__init__.py", line 40, in
File "/usr/hdp/current/spark-client/python/pyspark/conf.py", line 60, in
File "/usr/local/anaconda/lib/python3.6/re.py", line 122, in
File "/usr/local/anaconda/lib/python3.6/enum.py", line 2, in
File "/usr/local/anaconda/lib/python3.6/types.py", line 171, in
File "/usr/local/anaconda/lib/python3.6/functools.py", line 20, in
ImportError: cannot import name 'get_cache_token'
ls: cannot access /usr/hdp//hadoop/lib: No such file or directory
Traceback (most recent call last):
File "/usr/lib64/python2.7/site.py", line 62, in
import os
File "/usr/lib64/python2.7/os.py", line 398, in
import UserDict
File "/usr/lib64/python2.7/UserDict.py", line 83, in
import _abcoll
File "/usr/lib64/python2.7/_abcoll.py", line 11, in
from abc import ABCMeta, abstractmethod
File "abc.py", line 2, in
from pyspark import SparkContext
File "/usr/hdp/current/spark-client/python/pyspark/__init__.py", line 41, in
from pyspark.context import SparkContext
File "/usr/hdp/current/spark-client/python/pyspark/context.py", line 21, in
import shutil
File "/usr/lib64/python2.7/shutil.py", line 12, in
import collections
File "/usr/lib64/python2.7/collections.py", line 6, in
__all__ += _abcoll.__all__
AttributeError: 'module' object has no attribute '__all__'

Did you set the Python path such that it launches Python 3 by default?

Yes, I set the path, but I am still getting the following error. Please help me. Thank you.

[sarithadsr217850@cxln5 ~]$ export PATH=/usr/local/anaconda/bin:$PATH
[sarithadsr217850@cxln5 ~]$ pyspark
SPARK_MAJOR_VERSION is set to 2, using Spark2
File "/bin/hdp-select", line 232
print "ERROR: Invalid package - " + name
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("ERROR: Invalid package - " + name)?
Fatal Python error: Py_Initialize: can't initialize sys standard streams
Traceback (most recent call last):
File "/usr/local/anaconda/lib/python3.6/io.py", line 52, in
File "/home/sarithadsr217850/abc.py", line 2, in
File "/usr/hdp/current/spark2-client/python/pyspark/__init__.py", line 40, in
File "/usr/local/anaconda/lib/python3.6/functools.py", line 20, in
ImportError: cannot import name 'get_cache_token'
ls: cannot access /usr/hdp//hadoop/lib: No such file or directory
Fatal Python error: Py_Initialize: can't initialize sys standard streams
Traceback (most recent call last):
File "/usr/local/anaconda/lib/python3.6/io.py", line 52, in
File "/home/sarithadsr217850/abc.py", line 2, in
File "/usr/hdp/current/spark2-client/python/pyspark/__init__.py", line 40, in
File "/usr/local/anaconda/lib/python3.6/functools.py", line 20, in
ImportError: cannot import name 'get_cache_token'
Aborted

The usual pyspark from the console works with Python 2; the SyntaxError above comes from /bin/hdp-select, which is a Python 2 script being picked up by the Python 3 interpreter. So do not add Python 3 to the PATH. If you do not execute 'export PATH=/usr/local/anaconda/bin:$PATH', it should work.
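If the Anaconda directory has already been prepended in the current shell, here is a minimal sketch of reverting it (assuming the exact prefix /usr/local/anaconda/bin shown in the session above):

```shell
# Strip the Anaconda bin directory from PATH so the system Python 2 is found first.
PATH=$(echo "$PATH" | sed -e 's#/usr/local/anaconda/bin:##')
export PATH
# Confirm which interpreter will be picked up before retrying pyspark:
which python
```

Alternatively, simply open a fresh shell session without the export, since the change only affects the current shell (unless it was added to ~/.bashrc).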

If you want to use Spark with Python 3, the best way is through a Jupyter Notebook.
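For reference, a common way to point only Spark (not the whole shell) at Python 3 is to set PySpark's own interpreter variables instead of changing PATH, so that system scripts such as hdp-select keep running under Python 2. This is a sketch, not verified on this cluster; the Anaconda path is an assumption taken from the session above:

```shell
# PYSPARK_PYTHON selects the Python used by Spark itself;
# PYSPARK_DRIVER_PYTHON selects the driver-side frontend.
export PYSPARK_PYTHON=/usr/local/anaconda/bin/python   # assumed Anaconda install path
export PYSPARK_DRIVER_PYTHON=jupyter                   # run the driver inside Jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
pyspark
```

Because PATH is left untouched, /bin/hdp-select still executes under the system Python 2 and the print SyntaxError shown above does not occur.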