Running Spark 2.0 Jobs on YARN

Hi Abhinav/Giri

I am able to run Spark 2.0 jobs in local mode, but when I try to run the same script on YARN it fails with the following error:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .master("yarn") \
    .config("spark.sql.warehouse.dir", "hdfs:///user/kranthidr5051/spark_warehouse") \
    .appName("test") \
    .getOrCreate()

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:236)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
… 20 more

It seems to be an issue with the YARN timeline settings. Could you please help me with this?
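
For reference, one workaround I have seen suggested for this Jersey/TimelineClient error is to disable the YARN timeline service from the Spark configuration. I have not been able to verify this on our cluster, so please treat it as a guess:

from pyspark.sql import SparkSession

# Possible workaround (unverified here): pass yarn.timeline-service.enabled=false
# through Spark's spark.hadoop.* prefix so YarnClientImpl does not create the
# TimelineClient that needs the missing Jersey classes.
spark = SparkSession \
    .builder \
    .master("yarn") \
    .config("spark.hadoop.yarn.timeline-service.enabled", "false") \
    .config("spark.sql.warehouse.dir", "hdfs:///user/kranthidr5051/spark_warehouse") \
    .appName("test") \
    .getOrCreate()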

Hi,

Spark 2.x does not currently work in YARN mode on our cluster. It is a known issue that will be resolved during the cluster upgrade.

We will let you know as soon as we upgrade the cluster. We are extremely sorry for the inconvenience.

Thanks

@abhinavsingh,

Please enable Spark 2.x on YARN as early as possible.
Thanks for understanding.

Hi Abhinav,

Any update on this? Is it possible now to run Spark 2.0 jobs on YARN?


Hi Senthil,

It should be available within a week. This is exactly what we are working on right now.

Hi sgiri,

I am still not able to launch Spark 2.x in YARN mode. Do you have any update on this?

Thanks
Saurabh