Thank you @raviteja for the answer.
In addition to @raviteja's reply, I'd like to add that I just retried the same commands and they worked fine. I guess the Hive table mini27a.dep_sql had already been created. See below:
[sandeepgiri9034@ip-172-31-60-179 ~]$ sqoop import --connect jdbc:mysql://ip-172-31-13-154/retail_db --username sqoopuser --password NHkkP876rp --table departments --hive-import --hive-table mini27a.dep_sql --hive-overwrite
18/04/02 02:08:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
18/04/02 02:08:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/04/02 02:08:01 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/04/02 02:08:01 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/04/02 02:08:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/04/02 02:08:01 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/02 02:08:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
18/04/02 02:08:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
18/04/02 02:08:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.4.0-3485/hadoop-mapreduce
Note: /tmp/sqoop-sandeepgiri9034/compile/0118f9105fa8e1eb5fbbdc9fc73a5567/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/04/02 02:08:03 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-sandeepgiri9034/compile/0118f9105fa8e1eb5fbbdc9fc73a5567/departments.jar
18/04/02 02:08:03 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/04/02 02:08:03 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/04/02 02:08:03 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/04/02 02:08:03 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/04/02 02:08:03 INFO mapreduce.ImportJobBase: Beginning import of departments
18/04/02 02:08:05 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
18/04/02 02:08:05 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
18/04/02 02:08:07 INFO db.DBInputFormat: Using read commited transaction isolation
18/04/02 02:08:07 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`department_id`), MAX(`department_id`) FROM `departments`
18/04/02 02:08:07 INFO mapreduce.JobSubmitter: number of splits:4
18/04/02 02:08:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1517296050843_10996
18/04/02 02:08:07 INFO impl.YarnClientImpl: Submitted application application_1517296050843_10996
18/04/02 02:08:07 INFO mapreduce.Job: The url to track the job: http://a.cloudxlab.com:8088/proxy/application_1517296050843_10996/
18/04/02 02:08:07 INFO mapreduce.Job: Running job: job_1517296050843_10996
18/04/02 02:08:13 INFO mapreduce.Job: Job job_1517296050843_10996 running in uber mode : false
18/04/02 02:08:13 INFO mapreduce.Job: map 0% reduce 0%
18/04/02 02:08:19 INFO mapreduce.Job: map 100% reduce 0%
18/04/02 02:08:19 INFO mapreduce.Job: Job job_1517296050843_10996 completed successfully
18/04/02 02:08:20 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=598376
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=481
        HDFS: Number of bytes written=69
        HDFS: Number of read operations=16
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=8
    Job Counters
        Launched map tasks=4
        Other local map tasks=4
        Total time spent by all maps in occupied slots (ms)=40503
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=13501
        Total vcore-seconds taken by all map tasks=13501
        Total megabyte-seconds taken by all map tasks=20737536
    Map-Reduce Framework
        Map input records=7
        Map output records=7
        Input split bytes=481
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=430
        CPU time spent (ms)=6120
        Physical memory (bytes) snapshot=882032640
        Virtual memory (bytes) snapshot=12978053120
        Total committed heap usage (bytes)=715128832
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=69
18/04/02 02:08:20 INFO mapreduce.ImportJobBase: Transferred 69 bytes in 15.2499 seconds (4.5246 bytes/sec)
18/04/02 02:08:20 INFO mapreduce.ImportJobBase: Retrieved 7 records.
18/04/02 02:08:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
18/04/02 02:08:20 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/hdp/2.3.4.0-3485/hive/lib/hive-common-1.2.1.2.3.4.0-3485.jar!/hive-log4j.properties
OK
Time taken: 1.331 seconds
Loading data to table mini27a.dep_sql
Moved: 'hdfs://ip-172-31-53-48.ec2.internal:8020/apps/hive/warehouse/mini27a.db/dep_sql/part-m-00000' to trash at: hdfs://ip-172-31-53-48.ec2.internal:8020/user/sandeepgiri9034/.Trash/Current
Moved: 'hdfs://ip-172-31-53-48.ec2.internal:8020/apps/hive/warehouse/mini27a.db/dep_sql/part-m-00001' to trash at: hdfs://ip-172-31-53-48.ec2.internal:8020/user/sandeepgiri9034/.Trash/Current
Moved: 'hdfs://ip-172-31-53-48.ec2.internal:8020/apps/hive/warehouse/mini27a.db/dep_sql/part-m-00002' to trash at: hdfs://ip-172-31-53-48.ec2.internal:8020/user/sandeepgiri9034/.Trash/Current
Moved: 'hdfs://ip-172-31-53-48.ec2.internal:8020/apps/hive/warehouse/mini27a.db/dep_sql/part-m-00003' to trash at: hdfs://ip-172-31-53-48.ec2.internal:8020/user/sandeepgiri9034/.Trash/Current
Table mini27a.dep_sql stats: [numFiles=4, numRows=0, totalSize=69, rawDataSize=0]
OK
Time taken: 0.69 seconds
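
For anyone who wants to confirm that the --hive-overwrite really replaced the old contents, a quick check is to query the table from the shell with the Hive CLI (this is just a sketch assuming hive is on the path and you are using the same database/table names as above; use beeline instead if that is what your cluster provides):

# query the imported table; it should return the 7 rows Sqoop reported as "Retrieved 7 records"
hive -e "SELECT * FROM mini27a.dep_sql;"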
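
Also, as the WARN line near the top of the log points out, putting the password on the command line is insecure. The same import can be run so that Sqoop prompts for the password interactively with -P (or reads it from a protected file with --password-file); a hedged variant of the command above would look like this:

# same import, but Sqoop prompts for the MySQL password instead of taking it on the command line
sqoop import --connect jdbc:mysql://ip-172-31-13-154/retail_db --username sqoopuser -P \
  --table departments --hive-import --hive-table mini27a.dep_sql --hive-overwrite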