 

How to check if a Hive table exists using PySpark

I want to check whether a table schemaname.tablename exists in Hive using PySpark SQL.

Scala has spark.catalog.tableExists("schemaname.tablename"), but the same functionality is not available through PySpark.

I am looking for a quick and clean way to check whether a Hive table exists using PySpark.
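One common interim approach (a sketch, not from the original post) is to compare the table name against spark.catalog.listTables, which has been part of the public PySpark API since Spark 2.0. The function and parameter names below are illustrative; it assumes a live SparkSession named spark.

```python
def table_in_database(spark, database, table):
    # listTables(database) returns Table objects with a `name` attribute;
    # we simply scan for a match. Note this lists every table in the
    # database, so it can be slower than a direct existence check.
    return any(t.name == table for t in spark.catalog.listTables(database))
```

Usage: table_in_database(spark, "schemaname", "tablename").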

asked Sep 23 '19 17:09 by user2717470

1 Answer

The pyspark.sql.catalog module is included in Spark >= 2.3.0.

If you are using Spark < 2.3.0, you can use one of the following:

>>> spark._jsparkSession.catalog().tableExists("schema.table")
True

>>> spark._jsparkSession.catalog().tableExists("schema.table_false")
False

or

>>> spark.catalog._jcatalog.tableExists("schema.table")
True
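For completeness (this is beyond the original answer): PySpark 3.3.0 added a public spark.catalog.tableExists, so on recent versions the JVM workaround is unnecessary. A small version-tolerant sketch, assuming a live SparkSession named spark:

```python
def table_exists(spark, qualified_name):
    """Return True if `qualified_name` (e.g. "schema.table") exists.

    Uses the public API when available (PySpark >= 3.3.0) and falls
    back to the underlying JVM catalog on older versions.
    """
    catalog = spark.catalog
    if hasattr(catalog, "tableExists"):  # public API, PySpark >= 3.3.0
        return catalog.tableExists(qualified_name)
    # Older versions: go through the wrapped JVM catalog, as above.
    return catalog._jcatalog.tableExists(qualified_name)
```

Usage: table_exists(spark, "schema.table").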
answered Sep 19 '22 15:09 by SMaZ