org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 53.0 failed 4 times,

Whenever I execute actions on a Spark DataFrame, e.g. prodDF.show or prodDF.count, I hit the error below:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 53.0 failed 4 times, most recent failure: Lost task 1.3 in stage 53.0 (TID 133,
ip-172-31-20-58.ec2.internal): java.io.FileNotFoundException: /hadoop/yarn/local/usercache/sekharcherukupalli8827/appcache/application_1517296050843_2342/blockmgr-
04cfd0b1-1d8d-49a7-9818-129fdc4f7712/3a/temp_shuffle_4e1716a9-f6eb-4fbf-82e6-505098c7986e (No such file or directory)

This issue is resolved now. It looks like it was a resource issue (executors on the cluster being lost while the shuffle was in progress).
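For anyone hitting the same FileNotFoundException on a temp_shuffle file: this often means an executor was killed mid-shuffle (for example, YARN terminating a container that exceeded its memory limit), so its local shuffle files disappeared. A minimal sketch of settings to try, assuming a YARN deployment like the one in the trace above; the values are illustrative, not tuned:

```
# spark-defaults.conf (or pass each as --conf to spark-submit)

# Give executors more heap so they are not OOM-killed during the shuffle
spark.executor.memory               4g

# Extra off-heap headroom; YARN kills containers that exceed heap + overhead
spark.yarn.executor.memoryOverhead  1024

# Retry fetching shuffle blocks instead of failing the stage on the first miss
spark.shuffle.io.maxRetries         10
spark.shuffle.io.retryWait          30s
```

Checking the YARN NodeManager logs for "Container killed" messages around the failure time is a quick way to confirm whether memory limits were the cause.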
Thanks