Spark prompt (session) is getting closed unexpectedly

Hi,

While running the command predictions.take(10) as mentioned in this tutorial, I get the following error message, and the Spark session closes after that:

20/03/12 10:42:12 ERROR Executor: Exception in task 15.0 in stage 53.0 (TID 195)

Could you please help me with this?
Regards,
Alok

Hi Alok,

I suspect you are hitting a “java.lang.OutOfMemoryError: Java heap space” error.
This usually happens when the file/data you are processing is larger than the RAM allocated to Spark, or when the Java heap size specified in the configuration is too small for the application or task to run.
The code itself is correct to the best of my knowledge, so I recommend running it locally with more memory allocated; a sketch of how to do that is below.
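Here is a minimal sketch of one way to raise the heap sizes, assuming you are using PySpark and creating the session yourself (the app name and the 4g values are illustrative; tune them to the RAM available on your machine):

```python
from pyspark.sql import SparkSession

# Illustrative memory settings; adjust to your machine's available RAM.
spark = (
    SparkSession.builder
    .appName("oom-repro")                   # hypothetical app name
    .config("spark.driver.memory", "4g")    # heap for the driver JVM
    .config("spark.executor.memory", "4g")  # heap for each executor JVM
    .getOrCreate()
)
```

Note that spark.driver.memory only takes effect if it is set before the driver JVM starts, so if you launch your job with spark-submit, pass --driver-memory 4g and --executor-memory 4g on the command line instead of setting them in code.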

All the best!