
How to check if a string column in a PySpark DataFrame is all numeric

I have a PySpark DataFrame with a column of strings. How can I check which rows in it are numeric? I could not find any function for this in PySpark's official documentation.

values = [('25q36',),('75647',),('13864',),('8758K',),('07645',)]
df = sqlContext.createDataFrame(values,['ID',])
df.show()
+-----+
|   ID|
+-----+
|25q36|
|75647|
|13864|
|8758K|
|07645|
+-----+

In plain Python, there is a string method .isdigit() which returns True if the string contains only digits and False otherwise.
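
For example, on two of the sample values:

'75647'.isdigit()   # True
'25q36'.isdigit()   # False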

Expected DataFrame:

+-----+-------+
|   ID| Value |
+-----+-------+
|25q36| False |
|75647| True  |
|13864| True  |
|8758K| False |
|07645| True  |
+-----+-------+

I would like to avoid creating a UDF.

asked Dec 12 '18 by cph_sto

1 Answer

Filtering with Regex

I enjoyed the creative solution provided by Steven, but here is a much simpler suggestion for this kind of situation:

df.filter(~df.ID.rlike(r'\D+')).show()   # keep only rows whose ID has no non-digit character

First, rlike(r'\D+') matches every row whose ID contains a non-digit character, and the ~ at the start of the filter then excludes those rows, leaving only the all-numeric IDs.
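
If you want the True/False column from the expected output rather than a filtered DataFrame, the same rlike expression can be used with withColumn. A minimal sketch, assuming the column name Value from the question's expected output (note that Spark's show() prints booleans as true/false rather than Python's True/False):

from pyspark.sql import functions as F

# Value is True when ID contains no non-digit character, i.e. is all numeric
df.withColumn('Value', ~F.col('ID').rlike(r'\D+')).show()

An equivalent variant is F.col('ID').rlike(r'^\d+$'), which matches the whole string directly instead of negating a non-digit match.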

answered Sep 22 '22 by Mohammad Reza Malekpour