Spark shell showing error

[sarithadsr217850@cxln4 ~]$ spark_shell
-bash: spark_shell: command not found
[sarithadsr217850@cxln4 ~]$ spark-shell
SPARK_MAJOR_VERSION is set to 2, using Spark2
File "/bin/hdp-select", line 232
print "ERROR: Invalid package - " + name
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("ERROR: Invalid package - " + name)?
ls: cannot access /usr/hdp//hadoop/lib: No such file or directory
Exception in thread "main" java.lang.IllegalStateException: hdp.version is not set while running Spark under HDP, please set through HDP_VERSION in spark-env.sh or add a java-opts file in conf with -Dhdp.version=xxx
at org.apache.spark.launcher.Main.main(Main.java:118)
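
The SyntaxError above comes from /bin/hdp-select, which is a Python 2 script: the "Missing parentheses in call to 'print'" message only appears when it is run with a Python 3 interpreter (for example Anaconda's python taking precedence on the PATH). Because hdp-select fails, the HDP version cannot be detected, and Spark then aborts with "hdp.version is not set". A rough sketch of what could be checked, plus the workaround the exception itself suggests; the version string 2.6.2.0-205 is only assumed from the working session further down and should be verified on your node:

which python                              # if this points at Anaconda's Python 3, hdp-select will break
/usr/bin/python /bin/hdp-select versions  # run hdp-select explicitly with the system Python
export HDP_VERSION=2.6.2.0-205            # assumed version; may be enough if spark-env.sh honours a pre-set HDP_VERSION
spark-shell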

[sarithadsr217850@cxln4 ~]$ pyspark
SPARK_MAJOR_VERSION is set to 2, using Spark2
File "/bin/hdp-select", line 232
print "ERROR: Invalid package - " + name
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("ERROR: Invalid package - " + name)?
Fatal Python error: Py_Initialize: can't initialize sys standard streams
Traceback (most recent call last):
File "/usr/local/anaconda/lib/python3.6/io.py", line 52, in
File "/home/sarithadsr217850/abc.py", line 2, in
File "/usr/hdp/current/spark2-client/python/pyspark/__init__.py", line 40, in
File "/usr/local/anaconda/lib/python3.6/functools.py", line 20, in
ImportError: cannot import name 'get_cache_token'
ls: cannot access /usr/hdp//hadoop/lib: No such file or directory
Fatal Python error: Py_Initialize: can't initialize sys standard streams
Traceback (most recent call last):
File "/usr/local/anaconda/lib/python3.6/io.py", line 52, in
File "/home/sarithadsr217850/abc.py", line 2, in
File "/usr/hdp/current/spark2-client/python/pyspark/__init__.py", line 40, in
File "/usr/local/anaconda/lib/python3.6/functools.py", line 20, in
ImportError: cannot import name 'get_cache_token'
Aborted
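
The pyspark traceback gives a second, independent clue: it passes through /home/sarithadsr217850/abc.py. Python 3.6's functools does "from abc import get_cache_token", so a file named abc.py that is visible on the import path shadows the standard library abc module and produces exactly this "cannot import name 'get_cache_token'" failure during Py_Initialize. A minimal sketch of what to try, assuming that file is yours; the new name is arbitrary:

ls -l ~/abc.py
mv ~/abc.py ~/my_abc.py     # any name that does not clash with a standard library module
rm -rf ~/__pycache__        # drop any cached bytecode compiled from the old abc.py
pyspark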

For me it is working fine:

sandeep$ ssh sandeepXXXXXXX@e.cloudxlab.com
Last login: Sun Oct 11 18:17:32 2020 from 49.207.200.247
-bash: warning: setlocale: LC_CTYPE: cannot change locale (UTF-8): No such file or directory
[sandeepgiri9034@cxln4 ~]$ spark-shell
SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://10.142.1.4:4040
Spark context available as 'sc' (master = local[*], app id = local-1602814505152).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1.2.6.2.0-205
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

Please note that I am not setting PATH or loading any other library before starting Spark.
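
Since the same spark-shell works from a clean login, it is worth comparing your shell environment with a fresh one. A hedged sketch of what to inspect on your account; the variables below are the usual suspects, not a confirmed diagnosis:

which python spark-shell
echo "$PYSPARK_PYTHON" "$PYSPARK_DRIVER_PYTHON"
grep -n -i "anaconda\|pyspark\|spark" ~/.bashrc ~/.bash_profile 2>/dev/null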