Error when using latest spark-shell command

Hi,
I am facing this problem. I had faced it earlier as well, but I am not sure why that question was never answered.

When using the latest spark-shell, version 2.2.1, the system throws these errors:

val myRange = spark.range(1000).toDF("number")

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
[... lots of error output ...]
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
[... lots of error output ...]
Caused by: java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
[... lots of error output ...]
Caused by: java.lang.ClassNotFoundException: org.apache.tez.dag.api.SessionNotRunning
[... lots of error output ...]
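The last "Caused by" line shows that the Tez class org.apache.tez.dag.api.SessionNotRunning is missing from the classpath, which is why the Hive-backed session cannot be built. A possible workaround (not verified here), assuming the Hive catalog is not actually needed for these exercises, is to launch the shell with Spark's in-memory catalog instead; the 2.2.1 install path below is an assumption modeled on the 2.2.0 path mentioned later in this thread:

/usr/spark2.2.1/bin/spark-shell --conf spark.sql.catalogImplementation=in-memory

// With the in-memory catalog, the original example should then run:
val myRange = spark.range(1000).toDF("number")  // DataFrame with a single column named "number"
myRange.count()  // expected result: 1000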

Please help in answering this question… @abhinav @sandeepgiri

The error is also present when using pyspark.

>>> myRange = spark.range(1000).toDF("number")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'spark' is not defined

But I am not sure which version of pyspark was used here. (The NameError itself just means the SparkSession failed to initialize at startup, so the spark variable was never defined.)

I am executing the examples from the book Spark: The Definitive Guide.
Thanks

Hi @utkarsh_rathor,

Are you running it using spark-submit or spark-shell?

It is working fine for me, so I am just curious. I ran this on e.cloudxlab.com:

/usr/spark2.2.0/bin/spark-shell

[screenshot of the working spark-shell session]

@abhinavsingh @sandeepgiri
I am facing the same problem again. I am not sure how this is possible.

Do we need to set some path or some setting before running this command?
Maybe you are using a different user to access the spark-shell, one that has the required jars on the path?
Or maybe there is some other system-administration reason that I don't know about.

In my humble opinion, you won't get the error when you use your own account to interact with the shell, but you will get it when you use someone else's login.

I am posting a video of this on YouTube; please have a look.

I am also facing the same issue with Spark 2.2.1.

@Srinivasa_Reddy
Thanks for contributing to this. I thought I was the only one facing this.

Hi @utkarsh_rathor and @Srinivasa_Reddy

It seems like Spark 2.2.1 is not tightly integrated with Hadoop.

If possible, can you please use Spark 2.0.2 for your practice sessions?
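
Assuming Spark 2.0.2 is installed under the same layout as the 2.2.0 build shown above (the exact path below is an assumption), it can be launched the same way, after which the book's example should work:

/usr/spark2.0.2/bin/spark-shell

val myRange = spark.range(1000).toDF("number")  // should succeed on the 2.0.2 shell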

Thanks

Thanks Abhinav.

For now I will use Spark 2.0.2.
May I know when this issue will be fixed?

Thanks,
Srinivas