Getting an error while logging in

I am facing a problem.

Some days I can access a.cloudxlab.com, but on other days I get this error:

Permission denied (publickey,gssapi-keyex,gssapi-with-mic).

I need this access because while loading a table using Sqoop I am getting the error

FileAlreadyExistsException: Output directory hdfs://ip-172-31-53-48.ec2.internal:8020/user/uutkarshsingh7351/ztutorials_tbl already exists

and I need to delete the directory shown above, but permission is denied.

Hi Utkarsh,

Some days I can access a.cloudxlab.com, but on other days I get this error:
Permission denied (publickey,gssapi-keyex,gssapi-with-mic).

a.cloudxlab.com runs the NameNode, so SSH access to this machine is not allowed. You can instead log in via the web console, the Jupyter terminal, or the usual SSH on e, f, or g.

FileAlreadyExistsException: Output directory hdfs://ip-172-31-53-48.ec2.internal:8020/user/uutkarshsingh7351/ztutorials_tbl already exists

Please take a look at your HDFS home directory, either with the File Browser in Hue or with hadoop fs -ls.

When you use Sqoop to import data from a database into a folder in HDFS, the target folder in HDFS must not already exist.
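As a sketch, a Sqoop import can also be told to remove the target directory itself before writing, via the --delete-target-dir flag. The connection string, credentials, and table name below are placeholders, not the actual values for your lab:

```shell
# Hypothetical Sqoop import sketch -- substitute your real JDBC URL,
# username, and table name.
sqoop import \
  --connect jdbc:mysql://db.example.com/retail_db \
  --username dbuser -P \
  --table ztutorials_tbl \
  --target-dir ztutorials_tbl \
  --delete-target-dir
# --delete-target-dir drops the HDFS target directory before the import,
# which avoids the FileAlreadyExistsException on re-runs.
```

This is convenient when you re-run the same import repeatedly while experimenting.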

You might need to delete the ztutorials_tbl folder in HDFS, either with the File Browser in Hue or with hadoop fs -rm -r ztutorials_tbl.
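A safe sequence is to list the directory first and then remove it. The absolute path below assumes your HDFS home directory matches your login name; adjust it if yours differs:

```shell
# Confirm the directory exists before deleting it.
hadoop fs -ls /user/uutkarshsingh7351

# Recursively remove the Sqoop output directory so the import can be re-run.
hadoop fs -rm -r /user/uutkarshsingh7351/ztutorials_tbl
```

Since the path is relative to your home directory by default, plain `hadoop fs -rm -r ztutorials_tbl` works as well.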

I hope that helps.

Cheers!!

Thanks, I got it now @sgiri.
I am also not allowed to log in to a, b, c, or d. What is the significance of the other nodes (b, c, d)?

Hi Utkarsh,

b primarily runs the YARN ResourceManager. c runs Hive and other services, while d mostly runs Hue.

In addition, a, b, c, and d run other master daemons (such as HBase) and backups of services.