
How to check if key exists in spark sql map type

So I have a table with one column of map type (the key and value are both strings).

I'd like to write Spark SQL like this to check if given key exists in the map.

select count(*) from my_table where map_contains_key(map_column, "testKey")

How can I do this?

asked Dec 13 '22 by seiya


1 Answer

In the Scala DataFrame API, you can use this construction:

df.where($"map_column"("testKey").isNotNull)

For pure SQL:

spark.sql("select * from my_table where map_column['testKey'] is not null")
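
One caveat worth noting with the `IS NOT NULL` approach: if the key exists but is mapped to a NULL value, the subscript expression also returns NULL, so that row is filtered out even though the key is present. A key-existence test that sidesteps this combines the long-standing Spark SQL built-ins `map_keys` and `array_contains`; the table and key names below are taken from the question:

select count(*) from my_table where array_contains(map_keys(map_column), 'testKey')

On recent Spark versions (3.4 and later) the `map_contains_key` function from the question exists as a built-in, so the original query should work as written there.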
answered Jan 15 '23 by pasha701