Sqoop export failing again

This is the command I am using for sqoop export. Please look into it and give me some suggestions to solve this:

sqoop export --connect jdbc:mysql://cxln2.c.thelab-240901.internal/sqoopex --username sqoopuser --password NHkkP876rp --table sales_akshay -m 1 --export-dir /apps/hive/warehouse/akshay.db/sales --input-fields-terminated-by ','
The logs are as follows:

20/09/12 17:58:52 INFO mapreduce.Job: Running job: job_1594743233823_11321
20/09/12 17:58:59 INFO mapreduce.Job: Job job_1594743233823_11321 running in uber mode : false
20/09/12 17:58:59 INFO mapreduce.Job: map 0% reduce 0%
20/09/12 17:59:05 INFO mapreduce.Job: map 100% reduce 0%
20/09/12 17:59:05 INFO mapreduce.Job: Job job_1594743233823_11321 failed with state FAILED due to: Task failed task_1594743233823_11321_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

20/09/12 17:59:06 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=1
Launched map tasks=1
Rack-local map tasks=1
Total time spent by all maps in occupied slots (ms)=38664
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=3222
Total vcore-milliseconds taken by all map tasks=3222
Total megabyte-milliseconds taken by all map tasks=4948992
20/09/12 17:59:06 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
20/09/12 17:59:06 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 15.9928 seconds (0 bytes/sec)
20/09/12 17:59:06 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
20/09/12 17:59:06 INFO mapreduce.ExportJobBase: Exported 0 records.
20/09/12 17:59:06 ERROR mapreduce.ExportJobBase: Export job failed!
20/09/12 17:59:06 ERROR tool.ExportTool: Error during export: Export job failed!
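The log above only says the map task failed; the actual exception is in the YARN task log, not here. One common cause with this exact symptom (an assumption, since the task-level error is not shown): unless the Hive table was created with ROW FORMAT DELIMITED FIELDS TERMINATED BY ',', Hive writes its default field delimiter \001 (Ctrl-A), so telling Sqoop `--input-fields-terminated-by ','` makes every row parse as a single field and the export fails. A minimal local sketch of that mismatch, using a hypothetical sample row:

```shell
# Simulate one Hive-written row using Hive's default \001 delimiter
# (hypothetical sample data; the real rows live under
# /apps/hive/warehouse/akshay.db/sales).
printf '7499\001ALLEN\0011600\n' > /tmp/sample_row.txt

# Parsing with a comma, as --input-fields-terminated-by ',' tells Sqoop
# to do, yields a single field -> column-count mismatch in the map task.
awk -F',' '{print NF}' /tmp/sample_row.txt    # prints 1

# Parsing with \001 yields the expected three fields.
awk -F'\001' '{print NF}' /tmp/sample_row.txt # prints 3
```

If the files really are \001-delimited, either re-create the Hive table with FIELDS TERMINATED BY ',' and reload it, or pass `--input-fields-terminated-by '\001'` to sqoop export.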

Kindly use the below syntax to export the data from the Hive table to MySQL.

sqoop export --connect jdbc:mysql://cxln2.c.thelab-240901.internal/sqoopex -m 1 --table sales_sgiri --export-dir /apps/hive/warehouse/sg.db/sales_test --input-fields-terminated-by ',' --username sqoopuser --password NHkkP876rp

Thank you very much, sir, for your quick reply. I tried with your provided command but the error is still there. The error for your mentioned command is as follows:
sqoop export --connect jdbc:mysql://cxln2.c.thelab-240901.internal/sqoopex -m 1 --table sales_sgiri --export-dir /apps/hive/warehouse/sg.db/sales_test --input-fields-terminated-by ',' --username sqoopuser --password NHkkP876rp

One thing I forgot to mention: I manually created a file resembling sales.log and tried to export it to the same table in MySQL, and to my surprise it worked very well. Suggestions are always welcome here …

Yes, you need to create the table in Hive under your db and then export it to MySQL!
It should work!

In my first query:
sqoop export --connect jdbc:mysql://cxln2.c.thelab-240901.internal/sqoopex --username sqoopuser --password NHkkP876rp --table sales_akshay -m 1 --export-dir /apps/hive/warehouse/akshay.db/sales --input-fields-terminated-by ','

sales is the Hive table under the database akshay; I populated this table with the sales.log file.
sales_akshay is the MySQL table in the sqoopex database.
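Since the manually created file (presumably comma-delimited) exported fine while the Hive-populated table did not, it is worth checking which delimiter the files under the Hive table's directory actually contain. A hedged sketch (the HDFS path is taken from the command above; the sample bytes below are hypothetical):

```shell
# On the cluster, dump the first row's raw bytes:
#   hdfs dfs -cat /apps/hive/warehouse/akshay.db/sales/* | head -n 1 | od -c
# A comma-delimited row shows ',' between fields; a Hive-default row
# shows 001 (Ctrl-A) instead. Simulated locally with a \001-delimited row:
printf '7499\001ALLEN\0011600\n' | od -c | head -n 1
```

Whatever delimiter `od -c` reveals is the one `--input-fields-terminated-by` must match.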

Kindly use the command below in your case.

sqoop export --connect jdbc:mysql://cxln2.c.thelab-240901.internal/sqoopex -m 1 --table sales_akshay --export-dir /apps/hive/warehouse/akshay.db/sales --input-fields-terminated-by ',' --username sqoopuser --password NHkkP876rp