 

BigQuery: TABLE_QUERY but columns differ between tables

We are loading csv files into BigQuery. Each file will create a separate table.

When we select from these tables, we mostly use a table query like so:

SELECT foo, bar
FROM TABLE_QUERY(name_stub, 'table_id CONTAINS "_something" AND msec_to_timestamp(creation_time) > date_add(current_timestamp(), -90, "day")');

Now we have added new fields to the newer files. So instead of just having the fields "foo" and "bar", we now additionally have "baz".

When I run the following query, I get an error saying that the field "baz" does not exist in one of the older tables.

SELECT foo, bar, baz
FROM TABLE_QUERY(name_stub, 'table_id CONTAINS "_something" AND msec_to_timestamp(creation_time) > date_add(current_timestamp(), -90, "day")');

Is there a way to select "baz" and just have a default value for tables that don't have the column?

asked Sep 28 '22 by drunken_monkey
1 Answer

Adding new columns to an existing table is possible (all historic rows will automatically have NULLs), but you are creating a new table every day, and TABLE_QUERY is just shortcut syntax for a UNION. The only idea I have is to create a view over each old table that adds the "baz" column. Then you should be able to use TABLE_QUERY across such views and the newer tables.
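As a sketch of such a view (the dataset name `mydataset` and table name `events_20150101_something` are hypothetical placeholders, in legacy SQL bracket syntax to match TABLE_QUERY):

```sql
-- Hypothetical view definition over an old table that lacks "baz":
-- select the existing columns and supply NULL (or any literal default)
-- under the name "baz" so the schema matches the newer tables.
SELECT
  foo,
  bar,
  NULL AS baz  -- or e.g. 0 AS baz for a non-NULL default
FROM [mydataset.events_20150101_something]
```

Saving one such view per old table (named so it still matches the TABLE_QUERY filter) gives every table in the wildcard set the same three columns, so the original `SELECT foo, bar, baz` query should then run without the missing-field error.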

answered Oct 05 '22 by Mosha Pasumansky