
How to replace nulls in a Vector column?

I have a column of type Vector that contains null values I can't get rid of. Here's an example:

import org.apache.spark.mllib.linalg.{Vector, Vectors}

val sv1: Vector = Vectors.sparse(58, Array(8, 45), Array(1.0, 1.0))
val df_1 = sc.parallelize(List(("id_1", sv1))).toDF("id", "feature_vector")
val df_2 = sc.parallelize(List(("id_1", 10.0), ("id_2", 10.0))).toDF("id", "numeric_feature")

val df_joined = df_1.join(df_2, Seq("id"), "right")

df_joined.show()

+----+--------------------+---------------+
|  id|      feature_vector|numeric_feature|
+----+--------------------+---------------+
|id_1|(58,[8,45],[1.0,1...|           10.0|
|id_2|                null|           10.0|
+----+--------------------+---------------+

What I'd like to do:

val map = Map("feature_vector" -> sv1)
val result = df_joined.na.fill(map)

But that throws an error:

Message: Unsupported value type org.apache.spark.mllib.linalg.SparseVector ((58,[8,45],[1.0,1.0])).

Other things I've tried:

df_joined.withColumn("feature_vector", when(col("feature_vector").isNull, sv1).otherwise(sv1)).show

(based on the question "how to filter out a null value from spark dataframe")

I'm struggling to find a solution that would work on Spark 1.6.

Asked by Alexvonrass

1 Answer

Coalesce and a broadcast join should do the trick:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.sql.functions.{coalesce, broadcast}

// Single-row DataFrame holding the default vector to fill nulls with
val fill = Seq(
  Tuple1(Vectors.sparse(58, Array(8, 45), Array(1.0, 1.0)))
).toDF("fill")

// Attach the broadcast single-row frame to every row, then keep the
// existing vector and fall back to the default only where it is null
df_joined
  .join(broadcast(fill))
  .withColumn("feature_vector", coalesce($"feature_vector", $"fill"))
  .drop("fill")
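Because fill has a single row, the join simply attaches the default vector to every row of df_joined before coalesce picks whichever value is non-null. As a quick sanity check (a sketch, assuming the df_joined from the question), appending .show() to the chain above should print the id_2 row with the default sparse vector filled in:

// Sketch: same chain as above, with .show() to inspect the result
df_joined
  .join(broadcast(fill))
  .withColumn("feature_vector", coalesce($"feature_vector", $"fill"))
  .drop("fill")
  .show()

+----+--------------------+---------------+
|  id|      feature_vector|numeric_feature|
+----+--------------------+---------------+
|id_1|(58,[8,45],[1.0,1...|           10.0|
|id_2|(58,[8,45],[1.0,1...|           10.0|
+----+--------------------+---------------+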
Answered by Alper t. Turker