Spark context vs Spark session

Please explain Spark context vs Spark session.
It is a little bit confusing for me.

Thank you.

SparkContext was the main entry point in Spark 1.x, but in Spark 2.x, initialization is done by way of SparkSession, which unifies the older SparkContext, SQLContext, and HiveContext entry points.
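
Here is a minimal sketch contrasting the two initialization styles in a standalone application (the app name "MyApp" and the local[*] master are placeholders, not from your setup):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    // Spark 1.x style: build a SparkConf, then a SparkContext directly
    val conf = new SparkConf().setAppName("MyApp").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Spark 2.x style: SparkSession is built once and wraps the SparkContext
    val spark = SparkSession.builder()
      .appName("MyApp")
      .master("local[*]")
      .getOrCreate()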

An instance of SparkContext, sc, was the main entry point in the Spark 1.x shell. In the Spark 2.x shell, the main entry point is spark, an instance of SparkSession. Please note that spark.sparkContext is still available in the newer versions, for backward compatibility and for direct operations on RDDs.
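
For example, reusing the spark session from the sketch above (in the Spark 2.x shell, spark is already defined; the sample numbers are made up for illustration):

    // Reach the wrapped SparkContext for RDD-level work
    val sc = spark.sparkContext

    // Classic RDD operations still work through it
    val rdd = sc.parallelize(Seq(1, 2, 3, 4, 5))
    println(rdd.map(_ * 2).sum())  // prints 30.0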