So I have a table with one column of map type (the key and value are both strings).
I'd like to write Spark SQL like this to check whether a given key exists in the map:
select count(*) from my_table where map_contains_key(map_column, "testKey")
How can I do this?
In the DataFrame API you can use a construction like this:
df.where($"map_column"("testKey").isNotNull)
For pure SQL (using the same map_column name as in the question):
spark.sql("select * from my_table where map_column['testKey'] is not null")