Spark SQL command error

Hello,

I am trying to run the below command using sqlContext, but it is throwing an error.

Command : sqlContext.sql("SELECT o.order_date, p.product_name, round(sum(oi.order_item_subtotal),2) order_revenue FROM ORDERS o join ORDER_ITEMS oi" +
"on o.order_id=oi.order_item_order_id" +
"join products p on p.product_id=oi.order_item_product_id" +
"WHERE o.status IN ('COMPLETE','CLOSED')" +
"group by o.order_date,p.product_name" +
"order by o.order_date, order_revenue desc")
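For reference, concatenating two of the fragments in plain Scala (no Spark needed) shows the strings joining directly with no space between them, which lines up with where the parser complains:

```scala
object ConcatDemo {
  def main(args: Array[String]): Unit = {
    // Same concatenation style as in the command above:
    // "+" joins the strings as-is, inserting no space.
    val sql = "FROM ORDERS o join ORDER_ITEMS oi" +
      "on o.order_id=oi.order_item_order_id"

    // The trailing "oi" and leading "on" fuse into the single token "oion".
    println(sql)
    println(sql.contains("oion"))
  }
}
```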

Error : org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'o' expecting (line 1, pos 109)

The same statement works fine when executed directly in Hive.

Any help is much appreciated!

Thanks