Diskspace quota exceeded while writing into HDFS

I am getting the error below when trying to write a DataFrame into HDFS. Please help.
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.DSQuotaExceededException): The DiskSpace quota of /user/manavchak1039 is exceeded: quota = 4294967296 B = 4 GB
but diskspace consumed = 4460095521 B = 4.15 GB

But the disk usage of my directory is only about 10 MB:
[manavchak1039@cxln4 ~]$ hdfs dfs -du -s -h /user/manavchak1039/
9.8 M /user/manavchak1039

Also, my DataFrame is only about 36 MB in size:
scala> import org.apache.spark.util.SizeEstimator;

scala> dfComp.count
res9: Long = 6

scala> println(SizeEstimator.estimate(dfComp)/1024/1024 + " MB");
36 MB

Why is the quota exceeded when my actual usage is so small? Help with the above issue please.

Could you check this post: "Error: Quota exceeded while running spark job but haven't used much of disk space [Solved]"?

Please let me know if the issue you are facing is similar.
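For context, a common explanation for this symptom is that the HDFS space quota counts raw bytes across all replicas, while `hdfs dfs -du -s` reports the pre-replication size; in addition, blocks still under construction can be charged at the full block size per replica while the write is in flight. A rough sketch of that arithmetic (the 128 MB block size, replication factor 3, and 12 open part-files are assumptions for illustration; check your cluster's dfs.blocksize and dfs.replication):

```scala
// Assumed cluster defaults -- verify dfs.blocksize and dfs.replication.
val blockSize   = 128L * 1024 * 1024   // bytes per HDFS block (assumed 128 MB)
val replication = 3L                   // replicas per block (assumed)

// The space quota is charged per replica, while `-du` shows one copy,
// so 9.8 MB of data already consumes ~29 MB of quota.
val duBytes      = (9.8 * 1024 * 1024).toLong
val quotaCharged = duBytes * replication

// While files are open for write, each block under construction can be
// charged at the full block size per replica. A Spark write opens one
// part-file per partition, so e.g. 12 open part-files could transiently
// reserve about 4.5 GB -- enough to blow through a 4 GB quota.
val openFiles = 12L                    // hypothetical partition count
val reserved  = openFiles * blockSize * replication

println(f"charged for existing data: ${quotaCharged / 1024.0 / 1024.0}%.1f MB")
println(f"transient write reservation: ${reserved / 1024.0 / 1024.0 / 1024.0}%.1f GB")
```

If this is what is happening, `hdfs dfs -count -q /user/manavchak1039/` should show the space quota and remaining space quota directly, and coalescing the DataFrame to fewer partitions before writing should reduce the transient reservation.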

Thank you for the suggestion. I will check it out. :slight_smile: