How to escape column names with hyphen in Spark SQL

I have imported a JSON file in Spark and converted it into a table as

myDF.registerTempTable("myDF") 

I then want to run SQL queries on this resulting table

val newTable = sqlContext.sql("select column-1 from myDF") 

However, this gives me an error because of the hyphen in the column name column-1. How do I resolve this in Spark SQL?

asked Jun 17 '15 11:06 by sfactor

People also ask

How do you escape special characters in Spark SQL?

Use backticks (`) to delimit identifiers containing special characters, e.g. `column-1`.

How do I escape a column name in SQL?

Quotation Mark (") The SQL:1999 standard specifies that the double quote (") is used to delimit identifiers. Oracle, PostgreSQL, MySQL, MSSQL and SQLite all support " as the identifier delimiter.

How do I replace special characters in Spark SQL?

By using the regexp_replace() Spark function you can replace a column's string value with another string/substring. regexp_replace() uses Java regex for matching; if the regex does not match, the value is returned unchanged. A typical example replaces the street suffix Rd with Road in an address column.
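Since regexp_replace() applies Java regex, the same matching behaviour can be sketched in plain Scala with String.replaceAll (the address value below is illustrative, not from the original post):

```scala
// A minimal plain-Scala sketch of the regexp_replace() idea:
// both use Java regular expressions under the hood.
def replaceStreetSuffix(address: String): String =
  address.replaceAll("\\bRd\\b", "Road") // word-boundary match on "Rd"

println(replaceStreetSuffix("123 Jump Rd")) // prints: 123 Jump Road
```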

How do I escape a single quote in Spark SQL?

Generally, when you deal with an apostrophe, you replace the single quote (') with two single quotes ('').
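That doubling rule can be sketched as a small Scala helper (the function name and sample values are illustrative, not a Spark API):

```scala
// Double every single quote so the value is safe inside a
// single-quoted SQL string literal.
def escapeSingleQuotes(value: String): String =
  value.replace("'", "''")

val title = "O'Brien's guide"
println(s"select * from books where title = '${escapeSingleQuotes(title)}'")
// select * from books where title = 'O''Brien''s guide'
```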


1 Answer

Backticks (`) appear to work, so

val newTable = sqlContext.sql("select `column-1` from myDF") 

should do the trick, at least in Spark v1.3.x.
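The backtick-quoting can be wrapped in a small helper (hypothetical, not a Spark API). Spark SQL represents a literal backtick inside a quoted identifier by doubling it, so the sketch below handles that case too:

```scala
// Hypothetical helper: wrap an identifier in backticks for Spark SQL,
// doubling any embedded backticks (Spark's escape convention).
def quoteIdentifier(name: String): String =
  "`" + name.replace("`", "``") + "`"

val query = s"select ${quoteIdentifier("column-1")} from myDF"
println(query) // prints: select `column-1` from myDF
```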

answered Sep 22 '22 12:09 by PermaFrost