Accessing Hive Tables with Spark

Hi,
I have a set of tables created in Hive that I want to access with Spark (from the Spark shell).
Is there an example that answers my query?

Hi @subodhsimhaa8816,

This video should provide you with more details.

Thanks


Here is what you can do to query Hive tables using PySpark:

from pyspark.sql import HiveContext

# Create a HiveContext from the existing SparkContext (sc)
hc = HiveContext(sc)

df = hc.sql("select * from <your_schema_name>.<table_name>")
df.show()

# To change the current database:
hc.sql("use <your_database_name>")

Within hc.sql you can run any SQL statement you would normally run in Hive.
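If you are on Spark 2.x or later, HiveContext is deprecated and the same queries go through a SparkSession built with Hive support enabled. A minimal sketch (the schema/table names and the qualified_name helper are illustrative, not part of any Spark API; running it requires a Spark installation configured with a Hive metastore):

```python
def qualified_name(schema, table):
    """Build a fully qualified Hive table name like 'schema.table'."""
    return f"{schema}.{table}"


def query_hive_table(schema, table, limit=20):
    """Sketch: query a Hive table via SparkSession (Spark 2.x+)."""
    # Imported here so the pure helper above works even without Spark installed.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-example")
        .enableHiveSupport()  # picks up hive-site.xml from the classpath
        .getOrCreate()
    )
    df = spark.sql(f"SELECT * FROM {qualified_name(schema, table)}")
    df.show(limit)
    return df
```

In the Spark shell (pyspark) the session is already created for you as `spark`, so `spark.sql("select * from my_schema.my_table").show()` works directly.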

Hope this helps!
