 

Convert comma-separated string to array in PySpark DataFrame

I have a DataFrame as below, where ev is of type string.

>>> df2.show()
+---+--------------+
| id|            ev|
+---+--------------+
|  1| 200, 201, 202|
|  1|23, 24, 34, 45|
|  1|          null|
|  2|            32|
|  2|          null|
+---+--------------+

Is there a way to cast ev to ArrayType without using a UDF, or is a UDF the only option here?

asked Jul 04 '16 by Swadeep

1 Answer

You can use the built-in split function:

from pyspark.sql.functions import col, split

df = sc.parallelize([
    (1, "200, 201, 202"), (1, "23, 24, 34, 45"), (1, None),
    (2, "32"), (2, None)]).toDF(["id", "ev"])

# Split on a comma followed by optional whitespace; a raw string
# keeps the \s in the regex from being mangled by Python.
df.select(col("id"), split(col("ev"), r",\s*").alias("ev"))
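For the sample data above this yields an array of strings, and rows where ev is null stay null. A quick check, with the output shown roughly as Spark would print it:

df.select(col("id"), split(col("ev"), r",\s*").alias("ev")).show()
# +---+----------------+
# | id|              ev|
# +---+----------------+
# |  1| [200, 201, 202]|
# |  1|[23, 24, 34, 45]|
# |  1|            null|
# |  2|            [32]|
# |  2|            null|
# +---+----------------+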

If you want to convert the data to numeric types, you can cast as follows:

df.withColumn(
    "ev",
    # withColumn already names the result, so no alias is needed
    split(col("ev"), r",\s*").cast("array<int>")
)

or

from pyspark.sql.types import ArrayType, IntegerType

df.withColumn(
    "ev",
    split(col("ev"), r",\s*").cast(ArrayType(IntegerType()))
)
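One thing to be aware of: the cast is forgiving, so any element that does not parse as an integer becomes null rather than raising an error. A minimal sketch of the resulting schema (the df_int name here is just for illustration):

df_int = df.withColumn("ev", split(col("ev"), r",\s*").cast("array<int>"))
df_int.printSchema()
# root
#  |-- id: long (nullable = true)
#  |-- ev: array (nullable = true)
#  |    |-- element: integer (containsNull = true)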
answered Oct 22 '22 by zero323