Save as avro data file command fails in sqoop

Hi,

I tried the command below:
sqoop import --connect "jdbc:mysql://ip-172-31-20-247:3306/retail_db" --table products --as-avrodatafile --username sqoopuser -P

The following error is thrown:
18/09/18 08:18:00 INFO db.IntegerSplitter: Split size: 336; Num splits: 4 from: 1 to: 1345
18/09/18 08:18:01 INFO mapreduce.JobSubmitter: number of splits:4
18/09/18 08:18:01 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1534772926501_7374
18/09/18 08:18:01 INFO impl.YarnClientImpl: Submitted application application_1534772926501_7374
18/09/18 08:18:01 INFO mapreduce.Job: The url to track the job: http://ip-172-31-35-141.ec2.internal:8088/proxy/application_1534772926501_7374/
18/09/18 08:18:01 INFO mapreduce.Job: Running job: job_1534772926501_7374
18/09/18 08:18:07 INFO mapreduce.Job: Job job_1534772926501_7374 running in uber mode : false
18/09/18 08:18:07 INFO mapreduce.Job: map 0% reduce 0%
18/09/18 08:18:11 INFO mapreduce.Job: Task Id : attempt_1534772926501_7374_m_000002_0, Status : FAILED

Please help

I am facing the same challenge. Need help.

sqoop import --connect jdbc:mysql://172.31.20.247/retail_db --username sqoopuser -P --table order_items --as-avrodatafile

18/09/22 02:02:08 INFO mapreduce.Job: Running job: job_1534772926501_8160
18/09/22 02:02:14 INFO mapreduce.Job: Job job_1534772926501_8160 running in uber mode : false
18/09/22 02:02:14 INFO mapreduce.Job: map 0% reduce 0%
18/09/22 02:02:18 INFO mapreduce.Job: Task Id : attempt_1534772926501_8160_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
18/09/22 02:02:19 INFO mapreduce.Job: Task Id : attempt_1534772926501_8160_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

18/09/22 02:02:19 INFO mapreduce.Job: Task Id : attempt_1534772926501_8160_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.


Hi @Prasuna_A @SHAMIK_S,

Did you check the error on Hortonworks support?

This command should work:

sqoop import -Dmapreduce.job.user.classpath.first=true --connect "jdbc:mysql://ip-172-31-20-247:3306/retail_db" --table products --as-avrodatafile --username sqoopuser -P


Thanks Abhinav. It worked.

Is it possible to add the mapreduce.job.user.classpath.first property in mapred-site.xml? I think it would then not need to be set every time a sqoop command is executed.
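In case it helps, the property could be set cluster-wide with an entry like the one below. This is only a sketch: the exact file path (commonly /etc/hadoop/conf/mapred-site.xml) and whether you are allowed to edit it depend on your distribution and cluster setup, and on a managed cluster it may get overwritten.

```xml
<!-- In mapred-site.xml: make jars on the user classpath (e.g. the newer
     Avro jar that Sqoop ships) take precedence over the Hadoop-bundled ones
     for all MapReduce jobs. -->
<property>
  <name>mapreduce.job.user.classpath.first</name>
  <value>true</value>
</property>
```

After adding it you would restart the affected services so MapReduce jobs pick up the new default, and the -D flag on the sqoop command line should no longer be needed.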

Thanks Abhinav. This command worked.

@abhinavsingh why is that parameter required when using --as-avrodatafile in sqoop import? It would be helpful to understand the concept.

Worked like a charm.
Thanks for sharing the solution.