Is there a way to check if a dataframe exists in pySpark?
I know how to check this in R (note: the snippet below is R, not Python):

exists(df_name) && is.data.frame(get(df_name))
How can this be done in PySpark? There is no built-in exists function, so calling it throws an error.
You can do the same kind of type check in PySpark: import the DataFrame type and use isinstance.
from pyspark.sql import DataFrame
# assumes an active SparkContext `sc`, e.g. from pyspark shell
df = sc.parallelize([
    (1, 2, 3), (4, 5, 7)
]).toDF(["a", "b", "c"])

if df is not None and isinstance(df, DataFrame):
    # <some operation>
    print("dataframe exists")
Alternatively, try this: df_name is not None and isinstance(df_name, DataFrame)
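One caveat with the isinstance checks above: if the variable was never assigned at all, the name lookup itself raises NameError before isinstance ever runs. A minimal sketch of a lookup-based check that avoids this, using a namespace dict such as globals(); the helper name `df_exists` and the stand-in class are illustrative, not part of PySpark:

```python
def df_exists(name, namespace, df_type):
    """Return True if `name` is bound in `namespace` to an instance of `df_type`."""
    obj = namespace.get(name)  # returns None instead of raising NameError
    return obj is not None and isinstance(obj, df_type)


# Stand-in for pyspark.sql.DataFrame so the sketch runs without Spark;
# in a real session you would pass DataFrame itself as `df_type`.
class FakeDataFrame:
    pass


ns = {"df": FakeDataFrame()}
print(df_exists("df", ns, FakeDataFrame))       # True
print(df_exists("missing", ns, FakeDataFrame))  # False
```

In a live Spark session you would call it as df_exists("df", globals(), DataFrame), which stays safe even when "df" was never created.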