Take/Collect RDD error

Hi

I am getting the below issue whenever I try to collect or take from any RDD. Sometimes it works if I start a new instance, but the error keeps coming back.

scala> val ordersMap = orders.map(order => {
| (order.split(",")(0).toInt, order)
| })
ordersMap: org.apache.spark.rdd.RDD[(Int, String)] = MapPartitionsRDD[4] at map at :26
scala>
scala> val orderItemsMap = orderItems.map(orderItem => {
| val oi = orderItem.split(",")
| (oi(1).toInt, oi)
| })
orderItemsMap: org.apache.spark.rdd.RDD[(Int, Array[String])] = MapPartitionsRDD[5] at map at :26
scala> ordersMap.take(10).foreach(println)
20/03/28 15:42:52 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, cxln3.c.thelab-240901.internal, executor 2): java.io.IOException: Failed to create local
dir in /hadoop/yarn/local/usercache/mearupmukherjee17025/appcache/application_1582215230149_9796/blockmgr-2f2b88e2-08ef-478c-92c4-d13fbc9370a8/13.
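For what it's worth, the keying logic itself looks fine: the same transformations run cleanly on plain Scala collections, which suggests the failure is environmental (the executor cannot create a block-manager directory on local disk) rather than a bug in the code. A minimal sketch, using invented sample rows rather than the real dataset:

```scala
// Same keying logic as the RDD version, on plain Scala collections.
// The rows below are made-up examples, not the actual orders data.
val orders = Seq(
  "1,2013-07-25,11599,CLOSED",
  "2,2013-07-25,256,PENDING_PAYMENT"
)
val orderItems = Seq(
  "1,1,957,1,299.98,299.98",
  "2,2,1073,1,199.99,199.99"
)

// Key each order by its order id (first CSV field).
val ordersMap = orders.map(order => (order.split(",")(0).toInt, order))

// Key each order item by its order id (second CSV field).
val orderItemsMap = orderItems.map { orderItem =>
  val oi = orderItem.split(",")
  (oi(1).toInt, oi)
}

ordersMap.take(2).foreach(println)
```

Since this runs without Spark, the `java.io.IOException: Failed to create local dir` points at the node's local storage (full disk, permissions, or a bad `yarn.nodemanager.local-dirs` entry) rather than the transformations.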

Where are you loading orders from?