Hi Team,
I am seeing a permission error while trying to create a Hive external table on CloudxLab:
create external table project2.web_log_handler_tar (
  id int,
  username string,
  sub_port string,
  host string,
  year string,
  month string,
  day string,
  hit_count_val_1 string,
  hit_count_val_2 string,
  hit_count_val_3 string,
  timezone string,
  method string,
  product string,
  value string,
  sub_product string,
  web_info string,
  status_code int
)
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,host_v:username,host_v:sub_port,host_v:host,host_v:year,host_v:month,host_v:day,host_v:hit_count_val_1,host_v:hit_count_val_2,host_v:hit_count_val_3,host_v:timezone,host_v:method,host_v:product,host_v:value,host_v:sub_product,host_v:web_info,host_v:status_code"
)
TBLPROPERTIES ("hbase.table.name" = "web_daily_tar");
Here is the error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied:
user=abhishekarjs1096, access=WRITE, inode="/apps/hive/warehouse/sparkhive.db":NULL:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8750)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
I am not specifying a LOCATION in the DDL. Please help me with this.
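For context, the directory named in the error (/apps/hive/warehouse/sparkhive.db, drwxr-xr-x owned by hadoop) does not appear to be writable by my user. A workaround I am considering, but have not verified on CloudxLab, is to create the database with an explicit LOCATION under my own HDFS home directory and then create the HBase-backed table inside that database; the home-directory path below is just my assumption:

-- Unverified sketch: create the project2 database at a location my user can
-- write to (assumed to be /user/abhishekarjs1096), then re-run the
-- create external table statement above against it.
create database if not exists project2
location '/user/abhishekarjs1096/project2.db';

Is this the right approach, or should the warehouse directory permissions be fixed on the cluster side instead?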