I'm new to sparklyr (but familiar with Spark and PySpark), and I've got a really basic question. I'm trying to filter a column based on a partial match. In dplyr, I'd write my operation like so:
businesses %>%
filter(grepl('test', biz_name)) %>%
head
Running that code on a Spark DataFrame, however, gives me:
Error: org.apache.spark.sql.AnalysisException: Undefined function: 'GREPL'. This function is neither a registered temporary function nor a permanent function registered in the database 'project_eftpos_failure'.; line 5 pos 7
As in standard Spark, you can use either rlike (Java regular expressions):
df <- copy_to(sc, iris)
df %>% filter(rlike(Species, "osa"))
# or anchored
df %>% filter(rlike(Species, "^.*osa.*$"))
or like (simple SQL patterns):
df %>% filter(like(Species, "%osa%"))
Both methods can also be used with infix notation, as
df %>% filter(Species %rlike% "^.*osa.*$")
and
df %>% filter(Species %like% "%osa%")
respectively. For details, see vignette("sql-translation").
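Applied to the original question, a minimal sketch might look like the following (assuming a local Spark connection; the `businesses` table and `biz_name` column come from the question, and the sample data here is made up for illustration):

```r
library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")

# Hypothetical sample data standing in for the real businesses table
businesses <- copy_to(
  sc,
  data.frame(biz_name = c("test shop", "protest supplies", "other")),
  "businesses"
)

# rlike is translated to Spark SQL's RLIKE, so the partial match
# runs on the Spark side rather than in R
businesses %>%
  filter(biz_name %rlike% "test") %>%
  head()
```

The key point is that the expression inside filter() is translated to Spark SQL rather than evaluated as R, which is why R-only functions like grepl fail with "Undefined function" while rlike and like, which map directly to Spark SQL operators, work.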