 

BigQuery "copy table" not working for small tables

I am trying to copy a BigQuery table using the API, from one table to another in the same dataset. Copying big tables works just fine, but when copying small tables with only a few rows (1-10), the destination table comes out empty (it is created, but has 0 rows). I get the same result using the API and the BigQuery management console.

The issue reproduces for any table in any dataset I have. It looks like either a bug or designed behavior.

I could not find any "minimum rows" restriction in the docs... am I missing something?

EDIT: Screenshots

Original table: video_content_events with 2 rows

Copy table: copy111 with 0 rows
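
For context, a table copy like this is issued as a standard copy job. Below is a minimal sketch of the JSON body such a `jobs.insert` copy job takes; the dataset and project names are hypothetical, while the table names follow the screenshots above:

```python
# Build the configuration body for a BigQuery copy job (jobs.insert).
# The project id "my-project" and dataset "my_dataset" are hypothetical;
# the table names follow the screenshots above.
def make_copy_job_config(project, dataset, source_table, dest_table):
    """Return the JSON body for a jobs.insert copy job."""
    return {
        "configuration": {
            "copy": {
                "sourceTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": source_table,
                },
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": dest_table,
                },
                # Create the destination table if it does not exist.
                "createDisposition": "CREATE_IF_NEEDED",
                # Overwrite any existing contents.
                "writeDisposition": "WRITE_TRUNCATE",
            }
        }
    }

config = make_copy_job_config(
    "my-project", "my_dataset", "video_content_events", "copy111")
```

This body would then be POSTed to the `jobs.insert` endpoint (or passed through a client library); the copy itself runs asynchronously as a job.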

Idan asked Feb 08 '16 14:02



1 Answer

How are you populating the small tables? Are you perchance using streaming insert (`bq insert` from the command-line tool, or the `tabledata.insertAll` API method)? If so, per the documentation, data can take up to 90 minutes to become available for copying/exporting:

https://cloud.google.com/bigquery/streaming-data-into-bigquery#dataavailability

I won't get super detailed, but the reason is that our copy and export operations are optimized to work on materialized files. Data within our streaming buffers is stored in a completely different system, and thus isn't picked up until the buffers are flushed into the traditional storage mechanism. That said, we are working on removing the copy/export delay.
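
One way to check whether a table still has rows sitting in the streaming buffer is to inspect the `streamingBuffer` section of the metadata returned by the `tables.get` API: if it is present, a copy job will not yet see those rows. A small sketch, where the metadata dicts mirror the shape of a `tables.get` response (the values are hypothetical):

```python
# Estimate how many of a table's rows are still in the streaming buffer,
# based on the metadata returned by the BigQuery tables.get API. A table
# whose response includes a "streamingBuffer" section still has rows that
# a copy or export job will not pick up.
def rows_pending_in_buffer(table_metadata):
    """Return the estimated number of rows still in the streaming buffer."""
    buffer_info = table_metadata.get("streamingBuffer")
    if not buffer_info:
        return 0
    return int(buffer_info.get("estimatedRows", 0))

# Example tables.get response shapes (values hypothetical):
flushed = {"numRows": "2"}  # no streamingBuffer key: safe to copy
buffered = {"numRows": "0", "streamingBuffer": {"estimatedRows": "2"}}

print(rows_pending_in_buffer(flushed))   # 0: a copy sees all rows
print(rows_pending_in_buffer(buffered))  # 2: a copy would come out empty
```

Polling `tables.get` until `streamingBuffer` disappears is one way to know when a freshly streamed table has become copyable.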

If you aren't using streaming insert to populate the table, then definitely contact support or file a bug here.

Sean Chen answered Sep 29 '22 02:09