Memory allocation error in CloudxLab cluster

Hi,

We are getting the error below on most of our IDs (we have subscribed to 7 IDs for the past month).
[prasuna7718@ip-172-31-38-146 ~]$ hdfs dfs -ls
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000c0000000, 351272960, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 351272960 bytes for committing reserved memory.
An error report file with more information is saved as:
/home/prasuna7718/hs_err_pid19604.log

Because of this, we are unable to work on HDFS.

Please rectify this ASAP.

Thanks in advance.

Hi @Prasuna_A,

We are optimizing the cluster's memory usage and putting memory quotas in place so that a single user cannot exhaust the entire available memory.

Please allow us some time for this. In the meantime, we are cleaning up stray processes.
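If an ID still hits errno=12 before the quotas are in place, one possible stopgap (a sketch, not an official fix) is to check how much memory is actually free on the node and start the HDFS client with a smaller JVM heap via `HADOOP_CLIENT_OPTS`; the 256 MB value below is an assumption, not a tested recommendation:

```shell
# errno=12 (ENOMEM) means the OS refused the JVM's mmap request;
# check how much memory is actually available on the gateway node.
grep MemAvailable /proc/meminfo

# The hdfs CLI picks up extra JVM flags from HADOOP_CLIENT_OPTS, so a
# smaller -Xmx may let the JVM start even on a memory-starved host.
# 256 MB here is an assumed value, not a CloudxLab recommendation.
export HADOOP_CLIENT_OPTS="-Xmx256m"
hdfs dfs -ls
```

If the command still fails with a smaller heap, the node itself is out of memory and only the cleanup on our side will help.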

Hope this helps.

Thanks