Disk space error

I followed both of the steps below, but unfortunately I am still getting the disk space quota exceeded error:

  1. hadoop fs -setrep -w 1 /user/gladiator2012795723
  2. Explicitly hard-deleted a lot of folders/files; after doing that, the size occupied by my folder has gone down from 421 MB to 65 MB (please find a screenshot below). Command executed to get the size: hadoop fs -du -s -h /user/gladiator2012795723 (I have also added a quota check just after this list).
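For completeness, here is a minimal sketch of how I can check the quota itself, assuming I am reading the hadoop fs -count -q output correctly (the SPACE_QUOTA and REM_SPACE_QUOTA columns should show the configured byte limit and what remains of it):

    # Show configured quotas and current consumption for my home directory
    hadoop fs -count -q -h /user/gladiator2012795723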

Kindly help me get this resolved as soon as possible.

Hi @shah_d_p

Can you also clear your .Trash folder in HDFS?
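Something along these lines should do it; this is only a sketch, and you would replace the path with your own HDFS home directory. The -skipTrash flag deletes immediately instead of moving the files back into the trash:

    # Permanently remove everything under the trash directory
    hadoop fs -rm -r -skipTrash /user/gladiator2012795723/.Trash

    # Alternatively, checkpoint the trash and delete expired checkpoints
    hdfs dfs -expunge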

Hi Abhinav,

I am quite unsure whether you are able to view my folders.

However, please find below a snapshot of my .Trash folder, which is already empty as far as I can see.
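For reference, this is roughly the check I ran (I am assuming a recursive listing is the right way to confirm nothing is left in a checkpoint subdirectory):

    # Recursively list the trash folder; no output means it is empty
    hadoop fs -ls -R /user/gladiator2012795723/.Trash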

What puzzles me is that it has taken more than 5 days to understand and address the issue I have been facing.

I would appreciate it if you or your team could provide responses and resolutions related to the course I have undertaken in a timely fashion.

Thanks in advance!

When are you getting the disk space quota exceeded error?

Copy-pasting my answer to the same question from the other thread, which has also been open for the last 4 days (Disk space error while running spark load and save data section):

I am facing this issue whilst running the examples in the 'Loading and Saving Data - Handling Sequence and Object Files' section, i.e. https://cloudxlab.com/assessment/displayslide/593/loading-and-saving-data-handling-sequence-and-object-files?course_id=68&playlist_id=350

Let me know if you need any additional info.

Hi @sgiri

Any update on my query below?

Hi @shah_d_p,

Can you please take a look at this discussion: Error: Quota exceeded while running spark job but haven't used much of disk space?
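One more thing worth checking: hadoop fs -setrep only changes the replication of files that already exist, while new output written by a Spark job is created with the cluster's default replication factor (often 3), and the HDFS space quota is charged for every replica. If that is the cause, asking Spark to write with replication 1 may help. A minimal sketch, assuming the section is run from spark-shell (spark.hadoop.* settings are forwarded to the underlying Hadoop configuration, so this is not specific to our setup):

    # Start the shell so files written by the job are created with replication 1
    spark-shell --conf spark.hadoop.dfs.replication=1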