We have an instance of Spark 1.2.0 where we can run the command show tables like 'tmp*'; without issue, using beeline connected to the Thrift server port. We are testing Spark 1.4.0 on the same machine, but when we run the same command against 1.4.0, we get the following error:
0: jdbc:hive2://localhost:10001> show tables like 'tmp*';
Error: java.lang.RuntimeException: [1.13] failure: ``in'' expected but identifier like found
show tables like 'tmp*'
^ (state=,code=0)
0: jdbc:hive2://localhost:10001>
I also pulled down Spark 1.3.0 on this machine, and it gives the same error as above when running show tables like 'tmp*';.
Does anyone know if there is a similar command in Spark SQL 1.3.0+ that will allow the use of wild cards to return tables with a given pattern?
This was done on a machine running CDH 5.3.0; the Hive version is Hive 0.13.1-cdh5.3.0, if that matters.
You can use the following command in the Spark shell. It goes through the DataFrame API instead of the SQL parser, so it works on 1.3.0+; note that filter takes a SQL LIKE pattern, which uses % rather than *:
sqlContext.tables().filter("tableName LIKE '%tmp%'").collect()
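One thing to watch when switching between the two forms: Hive's SHOW TABLES pattern uses * as the wildcard, while the filter above expects a SQL LIKE pattern with %. A minimal sketch of the translation in plain Python (the helper name hive_glob_to_like is made up for illustration, not part of Spark or Hive):

```python
def hive_glob_to_like(pattern: str) -> str:
    """Translate a Hive SHOW TABLES glob into a SQL LIKE pattern.

    Hive treats '*' as "any run of characters"; SQL LIKE uses '%'
    for the same thing. Literal '%' and '_' in the input are escaped
    so LIKE does not interpret them as wildcards.
    """
    out = []
    for ch in pattern:
        if ch == "*":
            out.append("%")        # glob wildcard -> LIKE wildcard
        elif ch in ("%", "_"):
            out.append("\\" + ch)  # escape LIKE metacharacters
        else:
            out.append(ch)
    return "".join(out)

# e.g. 'tmp*' becomes 'tmp%', suitable for "tableName LIKE 'tmp%'"
```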
Alternatively, from the Spark shell (on versions whose SQL parser accepts the LIKE clause):
$ spark-shell
scala> sql("show tables like 'tmp*'").show()