Disk space error while running Spark load and save data section

I followed both of the steps below, but unfortunately I am still getting the disk space quota exceeded error:

  1. hadoop fs -setrep -w 1 /user/gladiator2012795723
  2. Explicitly hard-deleted a lot of folders/files; after doing that, the size occupied by my folder has come down from 421 MB to 65 MB (please find the screenshot below). Command executed to get the size: "hadoop fs -du -s -h /user/gladiator2012795723"

(Screenshot: hadoop disk space cmd.png)
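In case it helps with diagnosis: as I understand it, the HDFS space quota is charged at replicated size (file size x replication factor), and files deleted without -skipTrash keep counting while they sit in .Trash. The commands below are a rough sketch of how I am trying to check which quota is actually being hit (same path as above; I have not yet confirmed the cause):

    # Show both quotas on my directory: the name quota (file/dir count) and the space quota (bytes).
    # Output columns: QUOTA  REM_QUOTA  SPACE_QUOTA  REM_SPACE_QUOTA  DIR_COUNT  FILE_COUNT  CONTENT_SIZE  PATHNAME
    hadoop fs -count -q /user/gladiator2012795723

    # Size actually occupied by the directory, for comparison with the space quota above.
    hadoop fs -du -s -h /user/gladiator2012795723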

Kindly help me get this resolved ASAP.

P.S. Since I was unable to get a timely response, I have had to re-post the same thread.

Hi, Shah.

May I know which exercise or question you are referring to, so that I can look into it? Can you post a link to it?

All the best!

I am facing this issue while running the examples in the 'Loading and Saving Data - Handling Sequence and Object Files' section, i.e. https://cloudxlab.com/assessment/displayslide/593/loading-and-saving-data-handling-sequence-and-object-files?course_id=68&playlist_id=350

Let me know if you need any additional info.

Hi @shah_d_p

Can you please take a look at this discussion: Error: Quota exceeded while running spark job but haven’t used much of disk space
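In the meantime, a couple of things that commonly cause this: output directories left over from earlier runs of the exercise, and files removed without -skipTrash that are still sitting in .Trash, both of which count against your space quota. Below is a rough cleanup sketch; the output directory name is only an example, so please substitute your own paths:

    # Deleted files moved to trash still count against your quota until the trash is cleared.
    hadoop fs -du -s -h /user/gladiator2012795723/.Trash

    # Remove leftover output directories from earlier runs, bypassing the trash.
    # "seq_output" is just an example name; replace it with your actual output directory.
    hadoop fs -rm -r -skipTrash /user/gladiator2012795723/seq_output

    # Expunge the trash (removes trash checkpoints older than the configured retention interval).
    hadoop fs -expunge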