 

BigQuery export table to csv file

I'm trying to export a BigQuery table to Google Cloud Storage from the UI, but I'm getting this error:

Errors: Table gs://mybucket/delta.csv.gz too large to be exported to a single file. Specify a uri including a * to shard export. (error code: invalid)

When trying to export the query result directly instead, I got:

Download Unavailable This result set contains too many rows to download. Please use "Save as Table" and then export the resulting table.

asked Apr 07 '16 by Ali SAID OMAR

People also ask

How can I export more than 16000 rows in BigQuery?

If your data has more than 16,000 rows you'd need to save the result of your query as a BigQuery Table. Afterwards, export the data from the table into Google Cloud Storage using any of the available options (such as the Cloud Console, API, bq or client libraries).
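The two steps above (save the result as a table, then export it) can be sketched as the corresponding `bq` CLI invocations. The commands are built as argument lists here so their shape is easy to inspect; the dataset, table, and bucket names are placeholders, not taken from the question.

```python
# Sketch: build the two bq CLI commands for "save as table, then export".
# All dataset/table/bucket names below are illustrative placeholders.

def save_query_as_table(query: str, dest_table: str) -> list[str]:
    """bq command that materializes a query result into a destination table."""
    return ["bq", "query", "--destination_table", dest_table,
            "--use_legacy_sql=false", query]

def export_table_to_gcs(table: str, uri: str) -> list[str]:
    """bq command that exports a table to GCS as gzipped CSV."""
    return ["bq", "extract", "--destination_format", "CSV",
            "--compression", "GZIP", table, uri]

save_cmd = save_query_as_table("SELECT * FROM mydataset.big_source",
                               "mydataset.query_result")
# The wildcard (*) in the destination URI lets BigQuery shard results
# larger than 1 GB across multiple files.
export_cmd = export_table_to_gcs("mydataset.query_result",
                                 "gs://mybucket/delta-*.csv.gz")
```

Running these against a real project requires the Cloud SDK to be installed and authenticated; the sketch only shows the command structure.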


2 Answers

Finally found how to do it: you must use "*" (a wildcard) in the blob name.


BigQuery will then create as many files as needed.


It's odd that I can import large files (~GB) but can't export a large table to a single file. :(
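When a single wildcard URI is used, BigQuery replaces the `*` with a zero-padded 12-digit shard counter in each output filename. A minimal sketch of that expansion, with an illustrative shard count of 3:

```python
# Sketch: expand a single-wildcard export URI the way BigQuery names its
# output shards (the * becomes a zero-padded 12-digit counter).
# The shard count (3) is illustrative; BigQuery decides the real number.

def sharded_names(uri_pattern: str, num_shards: int) -> list[str]:
    """Expand exactly one '*' wildcard into sequential shard filenames."""
    assert uri_pattern.count("*") == 1, "exactly one wildcard expected"
    return [uri_pattern.replace("*", f"{i:012d}") for i in range(num_shards)]

names = sharded_names("gs://mybucket/delta-*.csv.gz", 3)
# names[0] == "gs://mybucket/delta-000000000000.csv.gz"
```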

answered Dec 03 '22 by Ali SAID OMAR

BigQuery can export up to 1 GB of data per file.
For results larger than 1 GB, BigQuery supports exporting to multiple files.

See Single wildcard URI and Multiple wildcard URIs in Exporting data into one or more files
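The two wildcard forms mentioned in the docs can be sketched side by side: a single wildcard URI produces one series of sharded files, while multiple wildcard URIs (each containing exactly one `*`) spread the shards across several path prefixes. The bucket and prefix names below are made up for illustration.

```python
# Sketch: the two wildcard-URI forms for BigQuery exports.
# Names are illustrative assumptions, not from the question.

# Single wildcard URI: one shard series under one prefix.
single_wildcard = "gs://mybucket/export/delta-*.csv.gz"

# Multiple wildcard URIs: one URI per partition prefix,
# each with exactly one wildcard.
num_partitions = 4
multiple_wildcards = [
    f"gs://mybucket/export/part-{p}/delta-*.csv.gz"
    for p in range(num_partitions)
]
```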

answered Dec 03 '22 by Mikhail Berlyant