How to run a Spark2 action in Oozie

I specified a Spark action in an Oozie workflow, and the launcher fails with the following error:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(), java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2308)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:229)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2214)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2306)
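
The class org.apache.oozie.action.hadoop.SparkMain is provided by the oozie-sharelib-spark jar, so this stack trace means the launcher did not find the Spark sharelib on its classpath. One way to check what the Oozie server actually has in its sharelib (using the same Oozie URL as in the submit command below):

# List all sharelib directories the Oozie server knows about.
oozie admin -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -shareliblist
# Narrow it down to the spark2 directory, if one exists.
oozie admin -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -shareliblist spark2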

My workflow.xml looks like this:

<workflow-app name="OnlySpark-2" xmlns="uri:oozie:workflow:0.5">
	<start to="spark-2e98"/>
	<kill name="Kill">
		<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
	</kill>
	<action name="spark-2e98">
		<spark xmlns="uri:oozie:spark-action:0.1">
			<job-tracker>${jobTracker}</job-tracker>
			<name-node>${nameNode}</name-node>
			<configuration>
				<property>
					<name>oozie.action.sharelib.for.spark</name>
					<value>spark2/</value>
				</property>
			</configuration>
			<master>local[*]</master>
			<mode>client</mode>
			<name>MySpark-2</name>
			<class>CategoryPrediction.HelloWorld</class>
			<jar>News_Category-0.0.1-SNAPSHOT.jar</jar>
		</spark>
		<ok to="End"/>
		<error to="Kill"/>
	</action>
	<end name="End"/>
</workflow-app>
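
Since <jar> is given as a bare file name, it is expected to be the application jar bundled with the workflow, typically placed in the application's lib/ directory in HDFS. A quick way to confirm the layout (the application path is taken from job.properties below; the lib/ location is an assumption about where the jar was uploaded):

# Expected layout under the workflow application path (assumed):
hdfs dfs -ls /user/tarunkumar_13516/oozieExample
#   /user/tarunkumar_13516/oozieExample/workflow.xml
#   /user/tarunkumar_13516/oozieExample/lib/News_Category-0.0.1-SNAPSHOT.jar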

job.properties

oozie.use.system.libpath=True
send_email=False
dryrun=False
nameNode=hdfs://ip-172-31-35-141.ec2.internal:8020
jobTracker=ip-172-31-35-141.ec2.internal:8050
security_enabled=False
oozie.wf.application.path=hdfs://ip-172-31-35-141.ec2.internal:8020/user/tarunkumar_13516/oozieExample
oozie.action.sharelib.for.spark=spark2
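
With oozie.use.system.libpath=True and oozie.action.sharelib.for.spark=spark2, Oozie will look for a spark2 directory under the sharelib root in HDFS. A way to confirm it is populated, assuming the default sharelib location (adjust the path if your cluster uses a different one):

# Default sharelib root (assumed); each sharelib update creates a lib_<timestamp> directory.
hdfs dfs -ls /user/oozie/share/lib
hdfs dfs -ls /user/oozie/share/lib/lib_*/spark2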

oozie command

oozie job -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -config job.properties -run
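
After submitting, the launcher error above can be cross-checked against the Oozie job status and log (replace <job-id> with the id printed by the -run command):

oozie job -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -info <job-id>
oozie job -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -log <job-id>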

What I have already tried:
I submitted the same Spark job through spark-submit and it ran without error.
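
For comparison, that spark-submit invocation was roughly of this shape, reconstructed from the values in workflow.xml (the local jar path is an assumption):

spark-submit \
  --master local[*] \
  --deploy-mode client \
  --class CategoryPrediction.HelloWorld \
  News_Category-0.0.1-SNAPSHOT.jar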

I am also facing the same issue.

Please let us know whether Spark 2 can be executed via Oozie.

Same issue here… can you please help?

This guide helped me resolve the same issue:

https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.5/bk_spark-component-guide/content/ch_oozie-spark-action.html
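
The gist of that guide is to create a spark2 directory in the Oozie sharelib next to the existing spark one, populate it, and have the server reload it. A rough sketch, assuming a standard HDP layout with Spark 2 client jars under /usr/hdp/current/spark2-client and lib_<timestamp> standing in for your cluster's active sharelib directory:

# Create the spark2 sharelib directory (lib_<timestamp> is a placeholder for your cluster's directory).
hdfs dfs -mkdir /user/oozie/share/lib/lib_<timestamp>/spark2
# Copy the Spark 2 jars plus the oozie-sharelib-spark jar (which provides SparkMain).
hdfs dfs -put /usr/hdp/current/spark2-client/jars/* /user/oozie/share/lib/lib_<timestamp>/spark2/
hdfs dfs -cp /user/oozie/share/lib/lib_<timestamp>/spark/oozie-sharelib-spark*.jar /user/oozie/share/lib/lib_<timestamp>/spark2/
# Tell the Oozie server to reload the sharelib.
oozie admin -oozie http://ip-172-31-20-247.ec2.internal:11000/oozie -sharelibupdate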