PySpark session is not starting

I am getting the following error while starting a PySpark session:

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create file/spark2-history/local-1712830744496.inprogress. Name node is in safe mode.
The reported blocks 176118 needs additional 111 blocks to reach the threshold 1.0000 of total blocks 176228.
The number of live datanodes 3 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.

Hello,

Our lab was under maintenance, which was likely the cause of this issue: the HDFS NameNode was still in safe mode while the DataNodes were re-reporting their blocks. It now seems to work fine. Can you please confirm?
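
For future reference: the NameNode normally leaves safe mode on its own once the reported-block threshold is reached, as the log message says. If it stays stuck after a restart, the status can be checked and, with HDFS superuser privileges, safe mode can be forced off. A sketch of the standard commands (run on a cluster node with the Hadoop client configured):

```shell
# Check whether the NameNode is currently in safe mode
hdfs dfsadmin -safemode get

# Wait (block) until the NameNode leaves safe mode on its own
hdfs dfsadmin -safemode wait

# Force the NameNode out of safe mode (superuser only; use with care,
# since safe mode usually indicates blocks are still being reported)
hdfs dfsadmin -safemode leave
```

If safe mode never clears because blocks are genuinely missing, `hdfs fsck /` can help identify under-replicated or corrupt files before forcing safe mode off.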