How can I filter only non-empty arrays?
import org.apache.spark.sql.types.ArrayType
val arrayFields = secondDF.schema.filter(st => st.dataType.isInstanceOf[ArrayType])
val names = arrayFields.map(_.name)
Or, with this code:
val DF1 = DF.select(col("key"), explode(col("objectiveAttachment")).as("collection")).select(col("collection.*"), col("key"))
where the schema of objectiveAttachment is:
 |-- objectiveAttachment: array (nullable = true)
 |    |-- element: string (containsNull = true)
I get this error:
org.apache.spark.sql.AnalysisException: Can only star expand struct data types. Attribute: ArrayBuffer(collection);
Any help is appreciated.
In Spark, the filter() or where() functions of DataFrame can filter rows with NULL values by checking IS NULL or isNull. This removes all rows with null values in the given column and returns a new DataFrame.
Solution: In order to find non-null values of PySpark DataFrame columns, we need to negate the isNull() function, for example ~df.name.isNull(); similarly, for non-NaN values use ~isnan(df.name).
The Spark where() function filters rows from the DataFrame or the Dataset based on one or more conditions or a SQL expression. The where() operator can be used instead of filter() when the user has a SQL background. Both where() and filter() operate exactly the same way.
The isNotNull method returns true if the column does not contain a null value, and false otherwise. The isin method returns true if the column's value is contained in a list of arguments and false otherwise. You will use the isNull, isNotNull, and isin methods constantly when writing Spark code.
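As a minimal sketch of the filter()/where(), isNull/isNotNull, and isin methods described above (the SparkSession setup, the key/state column names, and the sample rows are assumptions for illustration only):
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().master("local[*]").appName("null-filter-sketch").getOrCreate()
import spark.implicits._

// Hypothetical data with a nullable "state" column
val df = Seq(("a", Some("CA")), ("b", None), ("c", Some("NY"))).toDF("key", "state")

df.filter(col("state").isNull).show()            // rows where state is null
df.where(col("state").isNotNull).show()          // same idea with where(): rows that have a value
df.filter(col("state").isin("CA", "NY")).show()  // rows whose state is in the given list
Note that where() is simply an alias for filter(), so the choice between them is purely stylistic.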
Use the size() function:
import org.apache.spark.sql.functions._
import spark.implicits._   // provides the $"column" syntax
secondDF.filter(size($"objectiveAttachment") > 0)
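For example, a self-contained sketch (the sample rows and the key column are made up to mirror the question's schema) showing that the filter keeps only rows whose objectiveAttachment array has at least one element:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, size}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical data matching the question's schema: key plus an array<string> column
val secondDF = Seq(
  ("k1", Seq("a.pdf", "b.pdf")),
  ("k2", Seq.empty[String]),
  ("k3", Seq("c.pdf"))
).toDF("key", "objectiveAttachment")

// size(...) > 0 also drops null arrays: depending on the Spark version, size(null)
// evaluates to -1 or null, and neither satisfies the predicate
secondDF.filter(size(col("objectiveAttachment")) > 0).show(false)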
Try the size() function from org.apache.spark.sql.functions._, applying the filter before the explode/select. After the second select, objectiveAttachment is no longer available to filter on, and because the array elements are plain strings rather than structs, collection.* cannot be star-expanded, which is exactly what the AnalysisException complains about:
import org.apache.spark.sql.functions._
import spark.implicits._   // provides the $"column" syntax
val df1 = df
  .filter(size($"objectiveAttachment") > 0)
  .select(col("key"), explode(col("objectiveAttachment")).as("collection"))
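As a quick usage check against the same made-up secondDF from the sketch above (so the key column and file names are still assumptions), the corrected order of operations explodes only the non-empty arrays and selects the exploded string column directly:
import org.apache.spark.sql.functions.{col, explode, size}

// Reusing the hypothetical secondDF defined earlier
val exploded = secondDF
  .filter(size(col("objectiveAttachment")) > 0)                              // 1. drop empty (and null) arrays first
  .select(col("key"), explode(col("objectiveAttachment")).as("collection"))  // 2. then explode: one row per element

exploded.printSchema()  // collection is a plain string column, so there is nothing to star-expand
exploded.show(false)    // expect rows (k1, a.pdf), (k1, b.pdf), (k3, c.pdf); k2's empty array is gone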