Error in PySpark session

Hi,

When I run the command ss1.catalog.listDatabases() I get the error "org.apache.spark.SparkException: Unable to create database default as failed to create its directory /app/hive/output_data/derby_nonpersistent".
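In case it is relevant, here is my guess at what is happening (please correct me if I am wrong): the error suggests the Derby-backed metastore cannot create the warehouse directory for the default database, which usually points to a permissions or configuration problem on that path. The location the session is actually using can be checked with:

ss1.conf.get("spark.sql.warehouse.dir")   # directory under which Spark creates database folders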

I have also tried logging out of the session and running the commands below:

ss1.stop()               # stops the session and its underlying SparkContext
ss1.sparkContext.stop()  # redundant after ss1.stop(), but harmless
ss1.stop()               # likewise a no-op at this point

This time I am creating the Spark session with the command below:

from pyspark.sql import SparkSession
ss1 = SparkSession.builder.getOrCreate()
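One thing I am not sure about (this is my assumption, not something stated in the error): getOrCreate() returns the already-active session if one exists, so static settings like the warehouse directory would be ignored until the old session is fully stopped. A minimal sketch of forcing a fresh session with a writable warehouse path, where /tmp/spark-warehouse is only a placeholder and not the actual path in our environment:

from pyspark.sql import SparkSession

ss1.stop()  # make sure no active session is left over
ss1 = (SparkSession.builder
       .config("spark.sql.warehouse.dir", "/tmp/spark-warehouse")  # placeholder writable path
       .getOrCreate())
ss1.catalog.listDatabases()  # should initialize the default database under the new path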

I am still getting the above error.
However, ss1.catalog.currentDatabase() runs without any error.

Please help me ASAP.

Thanks,
Amit

Hi Amit,

It was working fine in your notebook when I checked. Are you still getting this error?

Thanks, Shubh!!!

Yes, it is working now.
What was the issue, and how was it resolved?