Sqoop import is failing due to permissions issues

sqoop import \
--connect jdbc:mysql://cxln2.c.thelab-240901.internal/retail_db \
--username sqoopuser \
--password NHkkP876rp \
--table orders \
--delete-target-dir \
--target-dir /user/yerranagumadhu9623/test/orders/ \
--as-avrodatafile \
--compression-codec org.apache.hadoop.io.compress.SnappyCodec
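
Side note: as the warning in the log below points out, passing the password on the command line is insecure. A safer variant is to read it from a protected file on HDFS; here is a sketch (the path /user/yerranagumadhu9623/mysql.pwd is just an example):

# Store the password in an HDFS file readable only by the owner
# (echo -n avoids a trailing newline in the file)
echo -n "NHkkP876rp" | hdfs dfs -put - /user/yerranagumadhu9623/mysql.pwd
hdfs dfs -chmod 400 /user/yerranagumadhu9623/mysql.pwd

# Then replace --password NHkkP876rp with:
#   --password-file /user/yerranagumadhu9623/mysql.pwd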

ERROR
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/07/24 23:49:27 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.2.0-205
19/07/24 23:49:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/07/24 23:49:27 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/07/24 23:49:27 INFO tool.CodeGenTool: Beginning code generation
19/07/24 23:49:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
19/07/24 23:49:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
19/07/24 23:49:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.2.0-205/hadoop-mapreduce
Note: /tmp/sqoop-yerranagumadhu9623/compile/fa517d5cc9a6eb6931d9083b112e2e6d/orders.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/07/24 23:49:29 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-yerranagumadhu9623/compile/fa517d5cc9a6eb6931d9083b112e2e6d/orders.jar
19/07/24 23:49:30 INFO tool.ImportTool: Destination directory sqoop/orders is not present, hence not deleting.
19/07/24 23:49:30 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/07/24 23:49:30 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/07/24 23:49:30 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/07/24 23:49:30 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/07/24 23:49:30 INFO mapreduce.ImportJobBase: Beginning import of orders
19/07/24 23:49:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
19/07/24 23:49:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
19/07/24 23:49:30 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-yerranagumadhu9623/compile/fa517d5cc9a6eb6931d9083b112e2e6d/orders.avsc
19/07/24 23:49:30 INFO client.RMProxy: Connecting to ResourceManager at cxln2.c.thelab-240901.internal/10.142.1.2:8050
19/07/24 23:49:30 INFO client.AHSProxy: Connecting to Application History server at cxln2.c.thelab-240901.internal/10.142.1.2:10200
19/07/24 23:49:30 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: The ownership on the staging directory /user/yerranagumadhu9623/.staging is not as expected. It is owned by hdfs. The directory must be owned by the submitter yerranagumadhu9623 or yerranagumadhu9623
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:150)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:111)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)

Hi @abhinavsingh, @sgiri, can you please resolve this issue? I have a test coming up soon, and I am unable to load the data from MySQL using Sqoop. I think this is a permissions issue caused by the cluster migration: the error above shows that my .staging directory is now owned by hdfs instead of my user.
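
For reference, the ownership can be confirmed with a quick check (a sketch; adjust the path for your own user):

# Show the owner of the .staging directory itself (-d lists the directory, not its contents)
hdfs dfs -ls -d /user/yerranagumadhu9623/.staging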

Thank you,

Hi @yerranagu_madhu,

I’ve changed the owner of your staging directory. Apologies for the late reply.
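
For anyone hitting the same error: the fix is to hand the staging directory back to the job submitter, along these lines (a sketch, run as the HDFS superuser; the group name here is an assumption and may differ on your cluster):

# Reassign the staging directory to the submitting user
sudo -u hdfs hdfs dfs -chown -R yerranagumadhu9623:yerranagumadhu9623 /user/yerranagumadhu9623/.staging

After this, re-running the Sqoop import should submit normally.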