Spark process not able to create/write file on HDFS, getting permission denied

Hi,

In my HDFS I have created the path /home/madhavink145361/testing/spark-warehouse and updated permissions on both directories, but the process is still not able to access HDFS. I am getting the errors below:

Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=madhavink145361, access=WRITE, inode="/home/madhavink145361/testing/spark-warehouse":hdfs:hdfs:drwxr-xr-x

18/05/04 14:17:08 INFO internal.SharedState: Warehouse path is '/home/madhavink145361/testing/spark-warehouse'.
18/05/04 14:17:10 WARN datasources.DataSource: Error while looking for metadata directory.
Exception in thread "main" org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://ip-172-31-53-48.ec2.internal:8020/home/madhavink145361/testing/spark-warehouse
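For context, below is a minimal sketch of the kind of job that hits this error. My actual job is larger; the app name and the table name here are just illustrative, and the warehouse URI is the one from the log above.

```scala
import org.apache.spark.sql.SparkSession

object WarehouseWriteTest {
  def main(args: Array[String]): Unit = {
    // Point the SQL warehouse at the HDFS directory the job must be able to write to.
    val spark = SparkSession.builder()
      .appName("warehouse-write-test")
      .config("spark.sql.warehouse.dir",
        "hdfs://ip-172-31-53-48.ec2.internal:8020/home/madhavink145361/testing/spark-warehouse")
      .getOrCreate()

    import spark.implicits._

    // Writing a managed table makes Spark create the "default" database directory
    // under the warehouse path, which is where the AccessControlException is thrown.
    Seq((1, "a"), (2, "b")).toDF("id", "value")
      .write.mode("overwrite").saveAsTable("default.permission_test")

    spark.stop()
  }
}
```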

Could you please suggest how to fix this?