
List all the tables in a dataset in BigQuery using the bq CLI and store them in Google Cloud Storage

I have around 108 tables in a dataset. I am trying to extract all those tables using the following bash script:

# get the list of tables (skip the two header lines of `bq ls` output)
tables=$(bq ls "$project:$dataset" | awk '{print $1}' | tail -n +3)

# extract each table to Cloud Storage as gzipped newline-delimited JSON
for table in $tables
do
    bq extract --destination_format "NEWLINE_DELIMITED_JSON" --compression "GZIP" "$project:$dataset.$table" "gs://$bucket/$dataset/$table.json.gz"
done

But it seems that bq ls only shows around 50 tables at once, and as a result I cannot extract all of them to Cloud Storage.

Is there any way I can list all 108 tables using the bq ls command?

Syed Arefinul Haque asked Dec 08 '22 13:12
1 Answer

By default, bq ls displays at most 100 tables. You can raise that limit with the command-line option --max_results (or its short form -n).
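With the flag in place, only the listing line of the script changes, e.g. bq ls --max_results 1000 "$project:$dataset" (1000 is an arbitrary value above the 108 tables). The parsing pipeline itself can be checked locally without BigQuery; a sketch using simulated bq ls output, with made-up table names:

```shell
# Simulated `bq ls` output: a header row, a separator row, then table names.
listing='  tableId   Type
 --------- -------
  table_a   TABLE
  table_b   TABLE'

# Same parsing as in the question: take the first column, then skip the
# two header lines. `tail -n +3` is the portable POSIX spelling.
tables=$(printf '%s\n' "$listing" | awk '{print $1}' | tail -n +3)
echo "$tables"
```

Against the real service, the same pipeline would be fed by bq ls --max_results 1000 instead of the simulated listing.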

You can also set the default values for bq in $HOME/.bigqueryrc.
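For instance, a $HOME/.bigqueryrc that raises the listing limit for every bq ls invocation might look like the following sketch. The [ls] section name follows the bigqueryrc convention of putting command-specific flags under a [command] header, and 1000 is an assumed limit:

```
[ls]
--max_results=1000
```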

Adding flags to .bigqueryrc

John Hanley answered Dec 26 '22 12:12