Message: No suitable driver (MySQL via Jupyter notebook / spark-shell)

I am trying to connect to MySQL through Spark and perform some operations. I have tried the following ways but am getting different errors.
A. In Jupyter notebook:
Steps:

  1. %AddJar /usr/share/java/mysql-connector-java.jar -f (also tried mysql-connector-java-8.0.13.jar)
  2. Added properties:
    val url = "jdbc:mysql://ip-172-31-20-247:3306/retail_db"
    val table = "retail_db.City"
    val prop = new java.util.Properties()
    prop.setProperty("username","someuser")
    prop.setProperty("password","somepassword")
  3. Class.forName("com.mysql.cj.jdbc.Driver")
  4. Created the Spark session and
    val readTable = spark.read.jdbc(url,table,prop)

Error : Name: java.sql.SQLException
Message: No suitable driver
StackTrace: at java.sql.DriverManager.getDriver(DriverManager.java:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:85)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:85)
at scala.Option.getOrElse(Option.scala:121)
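
For reference, "No suitable driver" means DriverManager could not match a registered driver to the jdbc:mysql URL on the Spark driver's classpath. A minimal sketch that sidesteps the discovery step by naming the driver class through Spark's "driver" option (host, credentials, and table below are placeholders, not the lab's real values):

import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("mysql-read").getOrCreate()

val url = "jdbc:mysql://ip-172-31-20-247:3306/retail_db"
val prop = new Properties()
prop.setProperty("user", "someuser")            // Connector/J reads the key "user"
prop.setProperty("password", "somepassword")
prop.setProperty("driver", "com.mysql.cj.jdbc.Driver") // Spark loads this class itself

val readTable = spark.read.jdbc(url, "City", prop)
readTable.show()

With the "driver" option set, the Class.forName call is unnecessary; the jar still has to be visible to the JVM running the notebook kernel, which is what %AddJar is for.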

Another way: I tried the same code through spark-shell --driver-class-path /usr/share/java/mysql-connector-java.jar, but got an "Access denied" error.

java.sql.SQLException: Access denied for user ''@'ip-172-31-38-146.ec2.internal' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1078)

Please tell me what is wrong with this?
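
As an aside, --driver-class-path only puts the connector on the driver JVM, which is enough for spark.read.jdbc to resolve the table schema; when the read itself runs on executors they need the jar too, which --jars distributes. A launch covering both, using the jar path from the post:

spark-shell --driver-class-path /usr/share/java/mysql-connector-java.jar --jars /usr/share/java/mysql-connector-java.jar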

Hi @shekhar3008,

Your MySQL host is wrong. Please find the correct host under "My Lab".
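
The URL is expected to follow this shape; <host> is a placeholder for the value shown under "My Lab", and 3306 is MySQL's default port:

val url = "jdbc:mysql://<host>:3306/retail_db"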

Hi Abhinav,

I have used the right host earlier as well (copied it from My Lab). It still throws the error for my username. I will write down the steps below for better understanding. Please help.

  1. Launched spark-shell --driver-class-path /usr/share/java/mysql-connector-java.jar
  2. The code I tried is below.

val url = "jdbc:mysql://ip-172-31-20-247"
val table = "retail_db.City"
val prop = new java.util.Properties()
prop.setProperty("username","myuserid")
prop.setProperty("password","mypassword")
Class.forName("com.mysql.cj.jdbc.Driver")
val readTable = spark.read.jdbc(url,table,prop)

It throws the error below.

java.sql.SQLException: Access denied for user ''@'ip-172-31-38-146.ec2.internal' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1078)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4187)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4119)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:927)
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1709)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1252)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2488)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2521)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2306)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:839)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:49)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:421)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:350)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:115)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:52)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:254)
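
A likely cause, for anyone landing on the same trace: the message reports an empty user (''), which suggests the credentials never reached the connector. JDBC drivers, including MySQL Connector/J, read the property key "user", not "username", and the URL here is also missing the port and database. A corrected sketch with placeholder credentials:

val url = "jdbc:mysql://ip-172-31-20-247:3306/retail_db" // include port and database
val prop = new java.util.Properties()
prop.setProperty("user", "myuserid")       // key must be "user", not "username"
prop.setProperty("password", "mypassword")
val readTable = spark.read.jdbc(url, "City", prop)
readTable.show()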