The DiskSpace quota of user is exceeded

My data is 2 GB, but I am not able to load it into HDFS in my space; I am getting a "disk space exceeded" error. Could you please help me resolve the issue quickly?

load data local inpath '/home/veerayyakumarg1811/parkingviolations.csv' into table nyc_parking_violations;
Loading data to table veerugandhad.nyc_parking_violations
Failed with exception org.apache.hadoop.hdfs.protocol.DSQuotaExceededException: The DiskSpace quota of /user/veerayyakumarg1811 is exceeded: quota = 4294967296 B = 4 GB but diskspace consumed = 4575774612 B = 4.26 GB
at org.apache.hadoop.hdfs.server.namenode.DirectoryWithQuotaFeature.verifyStoragespaceQuota(
at org.apache.hadoop.hdfs.server.namenode.DirectoryWithQuotaFeature.verifyQuota(
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.verifyQuota(
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.updateCount(
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.updateCount(
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.addBlock(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.saveAllocatedBlock(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.storeAllocatedBlock(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$
at org.apache.hadoop.ipc.RPC$
at org.apache.hadoop.ipc.Server$Handler$
at org.apache.hadoop.ipc.Server$Handler$
at Method)
at org.apache.hadoop.ipc.Server$
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

Hi Verru,

I understand that your file is 2 GB, but there may be other files in your allocated space that are consuming your quota.

The error is self-explanatory: "The DiskSpace quota of /user/veerayyakumarg1811 is exceeded: quota = 4294967296 B = 4 GB but disk space consumed = 4575774612 B = 4.26 GB".
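To see where those numbers come from, here is a quick check in plain Python (HDFS reports sizes with 1 GB = 1024**3 bytes; the byte counts are the ones quoted in the error):

```python
# Convert the byte counts from the quota error message to GB.
quota_bytes = 4294967296          # quota reported in the error
used_bytes = 4575774612           # disk space consumed reported in the error

quota_gb = quota_bytes / 1024**3  # 4.0
used_gb = used_bytes / 1024**3    # about 4.26

print(quota_gb, round(used_gb, 2))
```

So the directory is already about 0.26 GB over the 4 GB quota before the new file is even written.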

Kindly follow the commands below to find out the remaining space and delete the unwanted files.

  1. Run the following command to get the disk-space usage in your home directory:

–> hdfs dfs -count -q -h /user/your_user_name

  2. If the usage reported by the command above is more than 3 GB (3072 MB), do the steps below.

a. To delete a directory (and everything under it) in your home space, use:

–> hdfs dfs -rm -r /path/to/directory

  3. Check the remaining space again with the same command:

–> hdfs dfs -count -q -h /user/your_user_name
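The `-count -q` output is a single row of unlabeled columns, which is easy to misread. Below is a minimal sketch (plain Python; the sample line and its values are made up for illustration) that labels the fields, assuming the standard 8-column order QUOTA, REM_QUOTA, SPACE_QUOTA, REM_SPACE_QUOTA, DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME:

```python
def parse_count_q(line):
    """Label one line of `hdfs dfs -count -q -h` output.

    Assumes the standard 8-column order; with -h, sizes like "7 G"
    span two tokens, so number+unit pairs are rejoined first.
    """
    units = {"B", "K", "M", "G", "T", "P"}
    tokens = line.split()
    fields = []
    i = 0
    while i < len(tokens):
        # Rejoin a number token with a following single-letter unit token.
        if i + 1 < len(tokens) and tokens[i + 1] in units:
            fields.append(tokens[i] + " " + tokens[i + 1])
            i += 2
        else:
            fields.append(tokens[i])
            i += 1
    names = ["quota", "rem_quota", "space_quota", "rem_space_quota",
             "dir_count", "file_count", "content_size", "path"]
    return dict(zip(names, fields))

# Hypothetical sample line, for illustration only.
sample = "none inf 4 G 1.2 G 5 12 2.8 G /user/your_user_name"
row = parse_count_q(sample)
print(row["rem_space_quota"])  # 1.2 G
```

The "rem_space_quota" field is the one to watch: it is how much of the space quota is still free.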

The allocated HDFS space is 4 GB and the replication factor is 3. There are 16 processors running, which create 16 partitions, and the block size is 128 MB.

With 16 partitions from the 16 processors: 128 MB × 16 × 3 = 6144 MB ≈ 6 GB, which is more than the 4 GB quota.
Hence the space quota is getting full.
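The arithmetic above can be checked directly (plain Python; the numbers are the ones quoted in this thread):

```python
# 16 partitions, one 128 MB block each, replicated 3 times on HDFS.
partitions = 16
block_mb = 128
replication = 3

total_mb = partitions * block_mb * replication  # 6144 MB
total_gb = total_mb / 1024                      # 6.0 GB

quota_gb = 4
print(total_gb, total_gb > quota_gb)  # 6.0 True
```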

Kindly also refer to the Fair Usage policy for learners; the cluster is intended for educational and PoC purposes, and the limits there are computed with the replication factor of 3.

All the best!

Hi Satyaji Das,
I tried to follow the steps mentioned above and removed the unwanted files. Now I only have empty folders, but the disk quota is still shown as full.

Can you please assist?

May I know which task you are doing and what the size of the file you are trying to load is?

Here, somehow 16 processors are running while the command executes, each creating its own files, and the replication factor is 3, so the usage goes beyond the allocated space.

We can try to limit the number of processors the Hadoop job uses in the main configuration file; it might help.
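For example, if the load is running through Hive, session-level settings like the ones below can cap the number of reducers (and hence the number of output files per job). These are the standard Hadoop/Hive property names, but the right values depend on your cluster, so treat this as a sketch only:

```sql
-- Sketch: limit reducer parallelism for this Hive session so fewer
-- output files (and fewer replicated blocks) are created per job.
SET hive.exec.reducers.max=4;
SET mapreduce.job.reduces=4;
```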

The article below can help you to understand the error.

And the next one will help you understand what your commands are doing.

Try to create a new file and run the command again, and send me the command you are using; I will also try it from my end.

All the best!

none  inf  7 G  256 M  6  3  2.3 G  /user/rishav78226336

My file size is only 30 MB, and even then things are not working.

Hi Rishabh,

Can you share a screenshot of the issue?