How can I use the latest Spark 2.3.0 Version in CloudxLab

If you are using a Python Jupyter notebook, you can select the Spark installation you want with findspark:


import findspark

# List all available Spark installations with: !ls /usr/
# For Spark 2.3.0, point findspark at its install directory:
# findspark.init('/usr/spark2.3')
# Here the newer Spark 2.4.3 installation is used:
findspark.init('/usr/spark2.4.3')

import pyspark # only run this after findspark.init()
from pyspark.sql import SparkSession, SQLContext
from pyspark.context import SparkContext
from pyspark.sql.functions import * 
from pyspark.sql.types import * 

# Create (or reuse) a SparkSession backed by the Spark installation selected above
spark = SparkSession.builder.appName("PysparkExample").getOrCreate()
sc = spark.sparkContext
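
To double-check that the notebook actually picked up the intended Spark build, you can print the version reported by the session. This is a minimal sketch; the exact version string depends on which install directory you passed to findspark.init:

# Confirm which Spark version the session is using
print(spark.version)   # e.g. '2.3.0' if you initialised /usr/spark2.3
print(sc.version)      # the underlying SparkContext reports the same version

# Optionally stop the session when finished to release cluster resources
# spark.stop()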