 

Spark sql how to execute sql command in a loop for every record in input DataFrame

I have a DataFrame with the following schema:

     %> input.printSchema
        root
        |-- _c0: string (nullable = true)
        |-- id: string (nullable = true)

I have another DataFrame on which I need to execute an SQL command:

        testDf.registerTempTable("mytable")

        %>testDf.printSchema
            root
            |-- _1: integer (nullable = true)

        sqlContext.sql(s"SELECT * from mytable WHERE _1=$id").show()

Here `$id` should come from the input DataFrame, and the SQL command should be executed for every id in the input table.

asked Apr 11 '26 08:04 by G G

1 Answer

Assuming you can work with a single new DataFrame containing all the rows of testDf that match the values in the id column of input, you can use an inner join, as Alberto stated:

val result = input.join(testDf, input("id") === testDf("_1"))
result.show()
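If you prefer to stay in SQL, as in the question, the same inner join can be expressed against temp tables. A sketch, assuming input is also registered as a temp table (the CAST handles input's string-typed id column):

```scala
input.registerTempTable("input")
sqlContext.sql(
  "SELECT t.* FROM mytable t JOIN input i ON t._1 = CAST(i.id AS INT)"
).show()
```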

Now, if you want a separate DataFrame for each distinct id in input, the problem is considerably harder. In that case, first make sure your lookup table is small enough to be collected as a local array; you can then loop through its values and create a new DataFrame for each one, as you intended (though this is not recommended):

val localArray: Array[Int] = input.map { case Row(_, id: String) => id.toInt }.collect
val result: Array[DataFrame] = localArray.map {
  i => testDf.where(testDf("_1") === i)
}
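The shape of that pattern, independent of Spark, is: collect the distinct keys locally, then map each key to a filtered view of the other table. A minimal plain-Scala analogy (hypothetical local data standing in for the two DataFrames, no Spark required):

```scala
// Hypothetical stand-ins: input's collected id column and testDf's _1 column
val inputIds: Array[Int] = Array(1, 3)
val testRows: Seq[Int]   = Seq(1, 2, 3, 3)

// One "result table" per id, mirroring localArray.map { i => testDf.where(...) }
val results: Array[Seq[Int]] = inputIds.map { i => testRows.filter(_ == i) }
// results(0) is the rows matching id 1; results(1) the rows matching id 3
```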

Anyway, unless the lookup table is very small, I suggest that you adapt your logic to work with the single joined DataFrame of my first example.

answered Apr 13 '26 00:04 by Daniel de Paula

