Regarding Spark SQL metastore


#1

Hi Sandeep,
In Hive, the default metastore database is Derby, but we can configure any RDBMS (MySQL, Oracle, MS SQL) using hive-site.xml.
In a similar way, how do we configure an RDBMS (MySQL, Oracle, MS SQL) as the metastore for Spark SQL?

Thanks in advance.


#2

You can place the hive-site.xml in the conf folder of Spark.
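For example, a hive-site.xml along these lines in $SPARK_HOME/conf points Spark SQL at a MySQL-backed metastore (the host, database name, and credentials below are placeholders):

```xml
<configuration>
  <!-- JDBC URL of the metastore database; "dbhost" and "metastore" are placeholders -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://dbhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <!-- JDBC driver class for the chosen RDBMS -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- Placeholder credentials for the metastore database -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

The MySQL JDBC driver jar also needs to be on Spark's classpath (e.g. dropped into Spark's jars folder or passed with --jars).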


#3

Hi Sandeep,
Thanks for your response.
As you said, if I place the hive-site.xml file in Spark's conf folder, both Hive and Spark would share the same schema/DB.
But my intention is to create an RDBMS metastore (using MySQL or Oracle) that is specific, or dedicated solely, to Spark.
Can you please explain, or provide a link on, how to configure an RDBMS as a metastore specific to Spark only?
Thanks in advance.
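To illustrate what I mean, something along these lines is what I am hoping for, where the connection properties are passed only to Spark and point at a database Hive itself does not use (the database name and credentials below are placeholders):

```shell
# Point Spark (and only Spark) at its own metastore database.
# "spark_metastore", the credentials, and the jar path are placeholder values.
spark-shell \
  --conf spark.hadoop.javax.jdo.option.ConnectionURL="jdbc:mysql://dbhost:3306/spark_metastore?createDatabaseIfNotExist=true" \
  --conf spark.hadoop.javax.jdo.option.ConnectionDriverName=com.mysql.jdbc.Driver \
  --conf spark.hadoop.javax.jdo.option.ConnectionUserName=sparkuser \
  --conf spark.hadoop.javax.jdo.option.ConnectionPassword=sparkpassword \
  --jars /path/to/mysql-connector-java.jar
```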