 

Apache Spark startsWith in SQL expression

In the Apache Spark API I can use the startsWith function to test the value of a column:

myDataFrame.filter(col("columnName").startsWith("PREFIX"))

Is it possible to do the same in a Spark SQL expression, and if so, could you please show an example?

asked Jan 03 '19 by alexanoid

People also ask

How do you add leading zeros in Spark SQL?

Method 1 - Using format_string: format_string formats the given values in printf style and returns the result as a string. You can import the Spark SQL function format_string and use it to add leading zeros (zero padding) to a column in a Spark DataFrame.
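For example, a minimal sketch (the DataFrame df, the column num, and the width of 5 digits are assumptions):

import org.apache.spark.sql.functions.{col, format_string}

// Zero-pad the assumed integer column "num" to 5 digits, e.g. 42 -> "00042"
val padded = df.withColumn("num_padded", format_string("%05d", col("num")))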

How do I get the length of a string in Spark SQL?

Spark SQL provides a length() function that takes a DataFrame column as its parameter and returns the number of characters (including trailing spaces) in a string. It can be used with filter() to filter DataFrame rows by the length of a column. If the input column is binary, it returns the number of bytes.
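For example, a small sketch (the DataFrame df and the column name are assumptions):

import org.apache.spark.sql.functions.{col, length}

// Keep only rows whose assumed string column "name" has more than 3 characters
df.filter(length(col("name")) > 3).show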


3 Answers

With regexp_replace, you can get the same result without UDFs. Check this out:

scala> val df=Seq(("bPREFIX",1),("PREFIXb",2)).toDF("a","b")
df: org.apache.spark.sql.DataFrame = [a: string, b: int]

scala> df.show
+-------+---+
|      a|  b|
+-------+---+
|bPREFIX|  1|
|PREFIXb|  2|
+-------+---+


scala> df.filter(regexp_replace('a,"""^PREFIX.*""","")==="").show
+-------+---+
|      a|  b|
+-------+---+
|PREFIXb|  2|
+-------+---+


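Since the question asks about Spark SQL expressions, the same filter can also be written as a SQL string (a sketch against the same df; not from the original answer):

// SQL-expression form of the regexp_replace filter above
df.filter("regexp_replace(a, '^PREFIX.*', '') = ''").show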

Or using regexp_extract():

scala> df.filter(regexp_extract('a,"""(^PREFIX)""",1)==="PREFIX").show
+-------+---+
|      a|  b|
+-------+---+
|PREFIXb|  2|
+-------+---+


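The regexp_extract variant has a SQL-string form as well (again a sketch, not from the original answer):

// SQL-expression form of the regexp_extract filter above
df.filter("regexp_extract(a, '(^PREFIX)', 1) = 'PREFIX'").show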

Or using the instr() function:

scala> df.filter("instr(a,'PREFIX')=1").show
+-------+---+
|      a|  b|
+-------+---+
|PREFIXb|  2|
+-------+---+


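To run it as a plain SQL query, you can first register the DataFrame as a temporary view (the view name t is an assumption):

// Expose df to SQL, then filter with instr() in a full query
df.createOrReplaceTempView("t")
spark.sql("select * from t where instr(a, 'PREFIX') = 1").show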
answered Nov 06 '22 by stack0114106


I found the following solution, which works in a Spark SQL query out of the box, without a custom UDF. For example:

CASE WHEN instr(PRICE_SOURCE_INDICATOR,'MAC') = 1
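The snippet above is truncated; a complete form of the expression might look like the following, where the THEN/ELSE values are placeholders, not from the original answer:

CASE WHEN instr(PRICE_SOURCE_INDICATOR, 'MAC') = 1
     THEN 'starts_with_mac'  -- placeholder result
     ELSE 'other'            -- placeholder result
END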
answered Nov 06 '22 by alexanoid


You can create a UDF to do that:

import org.apache.spark.sql.functions.udf

val startsWith = udf((columnValue: String) => columnValue.startsWith("PREFIX"))

The UDF receives the column value and checks it against the prefix; then you can use it as follows:

myDataFrame.filter(startsWith($"columnName"))

If you want to pass the prefix as a parameter, you can do so with lit:

import org.apache.spark.sql.functions.lit

val startsWith = udf((columnValue: String, prefix: String) => columnValue.startsWith(prefix))
myDataFrame.filter(startsWith($"columnName", lit("PREFIX")))
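If you need the UDF inside a SQL expression rather than the DataFrame API, you can register it first (the name starts_with_prefix is an assumption):

// Register the UDF so it can be called from SQL strings
spark.udf.register("starts_with_prefix", (columnValue: String, prefix: String) => columnValue.startsWith(prefix))
myDataFrame.filter("starts_with_prefix(columnName, 'PREFIX')")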

INPUT EXAMPLE

+-----------+------+
|letter     |number|
+-----------+------+
|    PREFIXb|     1|
|    bPREFIX|     2|
+-----------+------+

OUTPUT EXAMPLE (using the one-argument UDF)

myDataFrame.filter(startsWith($"letter")).show
+-----------+------+
|letter     |number|
+-----------+------+
|    PREFIXb|     1|
+-----------+------+
answered Nov 06 '22 by SCouto