Spark UI unavailable

When I start the Spark shell using the command

spark-shell

I see the following:

SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://10.142.1.4:4040
Spark context available as 'sc' (master = local[*], app id = local-1570732770075).
Spark session available as 'spark'.

But the Spark UI URL is not accessible. I have tried 4-5 times and got a different port number each time, but still could not connect.
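One way to narrow this down is to check whether the UI is actually listening on the host where spark-shell runs, before blaming the browser side. A minimal sketch, assuming curl is available on the gateway host and using the address/port from the startup log above:

```shell
# Run this on the same host where spark-shell is running.
# It checks whether the Spark UI is listening on the address/port
# that spark-shell printed at startup (10.142.1.4:4040 here).
curl -s -o /dev/null -w "%{http_code}\n" http://10.142.1.4:4040
# An HTTP status such as 200 (or a 302 redirect) means the UI itself
# is up, and the problem is network access from your browser instead.
```

10.142.1.4 is a private address, so even when this check succeeds, the UI will not be reachable from outside the lab network without some form of forwarding.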

Can you please explain this?

Launch the Spark shell and note the port number and IP address. In my case the port number is 4041 and the IP address is 10.142.1.4.

From "My Lab", figure out the public IP address.

Now, using the public IP address and port number, open the URL in the browser.
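If the UI port is blocked on the public IP but SSH access to the lab works, an SSH tunnel is a common workaround. This is a sketch, not a guaranteed recipe for this lab: USER and GATEWAY_PUBLIC_IP are placeholders for your lab login and the public IP found under "My Lab", and 4041 is the port your spark-shell printed.

```shell
# Forward local port 4041 to the Spark UI on the gateway node.
# USER and GATEWAY_PUBLIC_IP are placeholders -- substitute your
# lab username and the public IP address from "My Lab".
ssh -N -L 4041:10.142.1.4:4041 USER@GATEWAY_PUBLIC_IP
# While this runs, browse to http://localhost:4041 on your own machine.
```

The -N flag keeps the connection open without starting a remote shell; stop the tunnel with Ctrl-C when done.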

I used the same process, but I got

Is it blocked? Please help me solve this.

You can check your job status in the Hue Job Browser.

Thanks for the quick reply!
But I don't find all the information that I used to get from the Spark UI.
For example: job- and task-level information, and also the SQL / Streaming tabs, which would really help us debug and tune our code.
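If the live UI ports stay blocked, the Spark History Server offers the same Jobs/Stages/SQL views after a run, provided event logging is enabled. A minimal sketch, assuming event logging is not already on; the hdfs:///spark2-history/ path is an assumption, so check the spark.eventLog.dir value in your cluster's spark-defaults.conf:

```shell
# Enable event logging so the finished run can be inspected in the
# Spark History Server, which shows the same Jobs and SQL tabs.
# The hdfs:///spark2-history/ directory is an assumed example --
# use the spark.eventLog.dir configured for your cluster.
spark-shell \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs:///spark2-history/
```

On many HDP/Spark2 clusters these settings are already in spark-defaults.conf, in which case the History Server UI (typically on port 18081 for Spark2, but verify with your admin) may already have the information you need.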