I want to check whether a table `schemaname.tablename` exists in Hive using PySpark SQL.
In Scala there is `spark.catalog.tableExists("schemaname.tablename")`. However, the same functionality is not available through PySpark.

I'm looking for a quick and clean way to check whether a Hive table exists using PySpark.
The `pyspark.sql.catalog` module is included from Spark >= 2.3.0.
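On those versions you can stay on the public API by scanning the catalog's table listing. A minimal sketch, assuming a helper named `table_exists` (my own name) and placeholder database/table names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

def table_exists(db: str, table: str) -> bool:
    # listTables(db) returns Table objects for the given database;
    # compare names case-insensitively, since Hive lowercases identifiers
    return table.lower() in (t.name.lower() for t in spark.catalog.listTables(db))

table_exists("schema", "table")   # True if schema.table exists
```

From PySpark 3.3.0 onward, `spark.catalog.tableExists("schema.table")` is also available directly.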
If you are using Spark < 2.3.0, you can reach the underlying JVM catalog instead:

```python
>>> spark._jsparkSession.catalog().tableExists("schema.table")
True
>>> spark._jsparkSession.catalog().tableExists("schema.table_false")
False
```
or
```python
>>> spark.catalog._jcatalog.tableExists("schema.table")
True
```
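If you want to hide the private-attribute access behind a single helper, a sketch along these lines (the function name is my own) works on Spark 2.x:

```python
def hive_table_exists(spark, qualified_name: str) -> bool:
    # Delegate to the JVM Catalog.tableExists via the py4j gateway;
    # this relies on private attributes and may break across Spark versions
    return spark._jsparkSession.catalog().tableExists(qualified_name)

hive_table_exists(spark, "schema.table")  # True if the table exists
```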