pyspark - Adding jars

Please advise how I can configure my SparkSession in my Python 3 Jupyter notebook to use the MySQL jar "mysql_mysql-connector-java-8.0.11.jar" and the Avro package "com.databricks:spark-avro_2.11:4.0.0".
I am unable to read/write Avro files or connect to MySQL using my SparkSession.

Thanks
Vikram

Hi Vikram,

When launching pyspark (or spark-submit), you can pass local jar files with the --jars argument, or pull dependencies from Maven with --packages.
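For example, a launch command could look something like this (the jar path below is just a placeholder; point it at wherever the connector jar actually lives on your machine):

```
pyspark --jars /path/to/mysql-connector-java-8.0.11.jar \
        --packages com.databricks:spark-avro_2.11:4.0.0
```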

To learn how to use Spark from the notebook, please check this: https://cloudxlab.com/blog/running-pyspark-jupyter-notebook/ . In that blog post you will also see how to add extra libraries to the path.
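If you would rather set everything up from inside the notebook, here is a rough sketch of one way to do it. The jar path, JDBC URL, database, table, and credentials are all placeholders you would replace with your own; also note that spark.jars.packages generally has to be set before the SparkSession (and its JVM) is created for the first time.

```python
from pyspark.sql import SparkSession

# Build a session that ships the MySQL connector jar and pulls the
# spark-avro package from Maven. Paths/coordinates below are assumptions.
spark = (SparkSession.builder
         .appName("mysql-avro-example")
         .config("spark.jars", "/path/to/mysql-connector-java-8.0.11.jar")
         .config("spark.jars.packages", "com.databricks:spark-avro_2.11:4.0.0")
         .getOrCreate())

# Read an Avro file through the Databricks spark-avro package
avro_df = (spark.read
           .format("com.databricks.spark.avro")
           .load("/path/to/data.avro"))

# Read a MySQL table over JDBC using the connector jar
# (URL, table, user, and password are placeholders)
jdbc_df = (spark.read.format("jdbc")
           .option("url", "jdbc:mysql://localhost:3306/mydb")
           .option("dbtable", "mytable")
           .option("user", "myuser")
           .option("password", "mypassword")
           .option("driver", "com.mysql.cj.jdbc.Driver")
           .load())
```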