Unable to create Hive external table: permission denied

Hi Team,

I am seeing permission issues while trying to create a Hive external table on CloudxLab:

CREATE EXTERNAL TABLE project2.web_log_handler_tar (
  id INT, username STRING, sub_port STRING, host STRING, year STRING,
  month STRING, day STRING, hit_count_val_1 STRING, hit_count_val_2 STRING,
  hit_count_val_3 STRING, timezone STRING, method STRING, product STRING,
  value STRING, sub_product STRING, web_info STRING, status_code INT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,host_v:username,host_v:sub_port,host_v:host,host_v:year,host_v:month,host_v:day,host_v:hit_count_val_1,host_v:hit_count_val_2,host_v:hit_count_val_3,host_v:timezone,host_v:method,host_v:product,host_v:value,host_v:sub_product,host_v:web_info,host_v:status_code")
TBLPROPERTIES ("hbase.table.name" = "web_daily_tar");

Here is the error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied:
user=abhishekarjs1096, access=WRITE, inode="/apps/hive/warehouse/sparkhive.db":NULL:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8750)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)

I am not specifying a LOCATION here. Please help me with this.

Hi, Abhishek.

May I know where you are trying to create the Hive table?
Kindly use the Hive terminal in the web console for all your Hive tasks.
You should be able to do it there. If you are still getting the error, kindly send screenshots.

All the best!

Yes, this error persists, and I have now reported the same error to the tech team.

CREATE EXTERNAL TABLE IF NOT EXISTS State_Testing(
seq INT,
ddate STRING,
state STRING,
total_samples INT,
negative INT,
positive INT
)
COMMENT 'Table to Store Statewise Testing Details'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/apps/user/rahultinku38883011/rahul/data1/sqoop_imported/State_Testing';

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=rahultinku38883011, access=WRITE, inode="/apps":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8750)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1466)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
)
hive>

Please help, I have to create a table for my assignment.
Thank You…!!!

Hi @Rahul_Tinku

You have write access only to your home directory in HDFS. The LOCATION in your statement points under /apps, where you cannot write. Please use a path under your home directory instead: /user/rahultinku38883011/
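Applying that to the CREATE statement above, the only change needed is to point LOCATION at a path under the user's HDFS home directory. A sketch (the subdirectory layout below simply mirrors the original path and is illustrative, not prescribed):

```sql
-- Same table definition as before, but LOCATION now sits under
-- /user/rahultinku38883011/, where the user has write access.
CREATE EXTERNAL TABLE IF NOT EXISTS State_Testing (
  seq INT,
  ddate STRING,
  state STRING,
  total_samples INT,
  negative INT,
  positive INT
)
COMMENT 'Table to Store Statewise Testing Details'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/rahultinku38883011/rahul/data1/sqoop_imported/State_Testing';
```

If in doubt about where you can write, running `hdfs dfs -ls /user/rahultinku38883011` from the web console terminal will confirm the directory exists and is owned by your user.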

1 Like