Quota exceeded error while running Spark job

org.apache.hadoop.ipc.RemoteException: The DiskSpace quota of /user/bdm05848695 is exceeded: quota = 4294967296 B = 4 GB but diskspace consumed = 4429189317 B = 4.13 GB

I tried running:

hadoop fs -expunge

but I am getting "Superuser privilege is required".

Kindly help.

Also, is it important to maintain files in the Hadoop user directory? Will that be considered for the certification?

org.apache.hadoop.ipc.RemoteException: The DiskSpace quota of /user/bdm05848695 is exceeded: quota = 4294967296 B = 4 GB but diskspace consumed = 4429189317 B = 4.13 GB
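You can confirm the quota and how much space is actually being consumed (including anything sitting in the trash) with the standard HDFS commands, for example:

hadoop fs -count -q /user/bdm05848695
hadoop fs -du -s -h /user/bdm05848695

-count -q prints the configured name and space quotas alongside what has been consumed, and -du -s -h prints the total size of the directory in a human-readable form.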

Please either reduce the replication factor of the files or delete some of them.
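For example, a minimal sketch (the replication factor 2 and the path old_output below are just placeholders; substitute whatever fits your cluster and data):

hadoop fs -setrep -w 2 /user/bdm05848695
hadoop fs -rm -r -skipTrash /user/bdm05848695/old_output

Note that -skipTrash matters here: a plain -rm only moves files into /user/bdm05848695/.Trash, which still counts against the same quota, so the delete alone would not free any space.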

I am getting " Superuser privilege is required"

It is very strange that it requires superuser privilege; I have never noticed that before. Have you tried cleaning the .Trash folder using the hadoop fs command or using Hue?
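If -expunge keeps failing, removing the trash directory directly should also work, since it sits under your own home directory and your user owns it:

hadoop fs -rm -r -skipTrash /user/bdm05848695/.Trash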

Also, is it important to maintain files in the Hadoop user directory? Will that be considered for the certification?

No.

Hi,
Thanks for the reply. When I run hadoop fs -expunge at the command line, I still get the superuser privilege error. I have attached a screenshot.