No suitable driver found for jdbc:mysql

I tried to create a DataFrame using JDBC to MySQL, and the error below occurred.
Could you please help me out?

orderitems = spark.read. \
    format('jdbc'). \
    option('url', 'jdbc:mysql://cxln2.c.thelab-240901.internal:3306'). \
    option('dbtable', 'reatil_db.order_items'). \
    option('user', 'sqoopuser'). \
    option('password', 'NHkkP876rp'). \
    load()
Traceback (most recent call last):
File "", line 6, in
File "/usr/hdp/current/spark2-client/python/pyspark/sql/", line 155, in load
return self._df(self._jreader.load())
File "/usr/hdp/current/spark2-client/python/lib/", line 1133, in __call__
File "/usr/hdp/current/spark2-client/python/pyspark/sql/", line 63, in deco
return f(*a, **kw)
File "/usr/hdp/current/spark2-client/python/lib/", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o47.load.
: java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)

Sir, did you find a solution? I am facing the same issue.


Can you please let us know which Spark version you are using on CloudxLab so that we can assist you better?

I feel we may have to pass the MySQL JDBC driver JAR as well while launching Spark.
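
If that is the cause, one way to make the driver visible is to put the MySQL Connector/J JAR on both the driver and executor classpaths when launching the shell. This is a sketch: the JAR path below is an assumption and depends on where the connector is installed on your machine.

```shell
# Launch the PySpark shell with the MySQL JDBC driver on the classpath.
# /usr/share/java/mysql-connector-java.jar is an assumed location --
# substitute the actual path to the Connector/J JAR on your system.
pyspark --jars /usr/share/java/mysql-connector-java.jar \
        --driver-class-path /usr/share/java/mysql-connector-java.jar
```

If "No suitable driver" persists after that, adding `.option('driver', 'com.mysql.jdbc.Driver')` to the read call can also help, since `java.sql.DriverManager` sometimes fails to auto-discover the driver under Spark's classloader.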