How to set a blob column in the where clause using spark-connector-api?

I am trying to figure out how to use a blob column in a where clause. Any ideas?

For example, the following query works when I run it in cqlsh:

select * from hello where id=0xc1c1795a0b;

// id is a blob column in Cassandra

I tried the following:

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", "0xc1c1795a0b");

This gave me a type converter exception.

Then I tried this:

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", "0xc1c1795a0b".getBytes());

This gave no error, but it returned no results, even though the same query in cqlsh returns plenty of rows. So I am not sure how to pass a blob value to the where clause. I am using Java. Any ideas?
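
My guess (I may be wrong) is that the second attempt matches nothing because String.getBytes() encodes the twelve characters of the literal "0xc1c1795a0b", not the five bytes the blob actually contains. A quick standalone check of what getBytes() really produces (class name is just for illustration):

import java.util.Arrays;

public class GetBytesCheck {
    public static void main(String[] args) {
        // Encodes the characters '0', 'x', 'c', ... as their ASCII values,
        // not the hex value 0xc1c1795a0b itself.
        byte[] textBytes = "0xc1c1795a0b".getBytes();
        System.out.println(textBytes.length);            // 12
        System.out.println(Arrays.toString(textBytes));  // [48, 120, 99, 49, 99, 49, 55, 57, 53, 97, 48, 98]
    }
}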

1 Answer

Use the Bytes utility from the DataStax Java driver to turn the hex string into a byte[]:

import com.datastax.driver.core.utils.Bytes;

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", Bytes.getArray(Bytes.fromHexString("0xc1c1795a0b")));
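
Why this works (a short sketch of the reasoning, not part of the original answer): Bytes.fromHexString parses the same 0x... literal that cqlsh accepts into a ByteBuffer, and Bytes.getArray extracts the underlying byte[], which the connector's type converter can bind against the blob column. You can verify the conversion on its own (class name is just for illustration):

import com.datastax.driver.core.utils.Bytes;

import java.nio.ByteBuffer;
import java.util.Arrays;

public class HexToBlobCheck {
    public static void main(String[] args) {
        // Parse the cqlsh-style hex literal into a ByteBuffer, then copy out the raw bytes.
        ByteBuffer buffer = Bytes.fromHexString("0xc1c1795a0b");
        byte[] raw = Bytes.getArray(buffer);
        System.out.println(Arrays.toString(raw));      // [-63, -63, 121, 90, 11]
        System.out.println(Bytes.toHexString(buffer)); // 0xc1c1795a0b
    }
}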