Out of memory space

Do I have to remove some of the files to free up space?

If yes, then all my work will be lost, so how will I claim my certification?
I am currently on the MACHINE LEARNING WITH SPARK course, doing the hands-on exercise provided in the video, but I am getting the above error…
The error shows up even though I still have 300 GB of disk space left, as shown in the attached image.

Please increase the disk space, as we also need room to practice…

Hi, Anubhav.

Yes, you are right; your heap memory is full.
All users are given 2 GB of RAM to run their programs, and that is sufficient. Kindly read the discussion below for more details:
https://cloudxlab.com/faq/6/what-are-the-limits-on-the-usage-of-lab-or-what-is-the-fair-usage-policy-fup
and the thread "Error: Quota exceeded while running Spark job but haven't used much of disk space".

You can try deleting some unused files from your HDFS or local storage, or use a smaller file.
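For instance, here is a quick sketch of commands you could use (the path /user/$USER/old_data is just a placeholder; substitute a file or directory you actually no longer need):

    # See how much space each item under your HDFS home directory takes
    hdfs dfs -du -h /user/$USER

    # Delete an unused file or directory; -skipTrash frees the space immediately
    hdfs dfs -rm -r -skipTrash /user/$USER/old_data

    # Check your local home directory usage as well
    du -sh ~/*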
hdfs dfsadmin gives information about the entire cluster and its users, but only superusers/admins have access to it.
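That said, you can check your own quota and usage without admin rights; a minimal example, assuming your HDFS home directory is /user/$USER:

    # Show the name/space quotas and current usage for your HDFS home directory
    hdfs dfs -count -q -h /user/$USER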

All the best!

But if I delete some of the files that I created, then at the time of certification I will not have those files to claim the certificate.

How do I deal with this issue?