Spark SQL from PySpark

Hi,

After registering a DataFrame as a temp view with createOrReplaceTempView, I am trying to use Spark SQL. I am using Jupyter.

df.createTempView('people')
result = spark.sql("select * from people")

After this I am getting the error: AttributeError: 'DataFrame' object has no attribute 'sql'

Can anyone suggest a fix?