Spark code failing from sbt shell

When I run the Scala code from the sbt shell, it gives the following error:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
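
From the error, my guess is that the Hive scratch dir simply is not world-writable. Since I am running in local[*] mode, I assume /tmp/hive is on the local filesystem; would opening up its permissions be the right fix? Something like:

chmod -R 777 /tmp/hive

(or hadoop fs -chmod -R 777 /tmp/hive if the directory really lives on HDFS). Is that the right approach here?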

build.sbt
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.1"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.1"
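
(Side note: as far as I know, sbt's %% operator resolves the Scala-version suffix automatically, so the lines above should be equivalent to, e.g.:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"

I have kept the explicit _2.11 artifacts since that is what I am actually using.)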

Scala code
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SparkSession, SQLContext}

inside main:
val conf = new SparkConf().setAppName("HiveLoad").setMaster("local[*]")
val sc = new SparkContext(conf)
val warehouseLocation = "/apps/hive/warehouse"
val spark = SparkSession.builder().appName("test").config("spark.sql.warehouse.dir", warehouseLocation).enableHiveSupport().getOrCreate()
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
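
For what it's worth, my understanding is that in Spark 2.x a single SparkSession subsumes both the SparkContext and the SQLContext, so the setup above could be collapsed into one builder call. A minimal sketch of what I think that looks like (the object/main wrapper is just scaffolding for this post):

import org.apache.spark.sql.SparkSession

object HiveLoad {
  def main(args: Array[String]): Unit = {
    // One SparkSession replaces the separate SparkConf/SparkContext/SQLContext
    val spark = SparkSession.builder()
      .appName("HiveLoad")
      .master("local[*]")
      .config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // The underlying SparkContext is still reachable if needed
    val sc = spark.sparkContext
  }
}

Even with this version, though, I hit the same /tmp/hive error, so I don't think the duplicated contexts are the root cause.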

My underlying issue is that I am not able to access Hive from Spark.

I can provide the entire log if required; I don't want to make this post too long by pasting it here.

Any help with the above issue would be appreciated.