I have around 108 tables in a dataset. I am trying to extract all those tables using the following bash script:
# get list of tables (skip the two header lines of bq ls output)
tables=$(bq ls "$project:$dataset" | awk '{print $1}' | tail -n +3)
# extract each table into Cloud Storage
for table in $tables
do
  bq extract --destination_format "NEWLINE_DELIMITED_JSON" --compression "GZIP" "$project:$dataset.$table" "gs://$bucket/$dataset/$table.json.gz"
done
But it seems that bq ls only shows around 50 tables at a time, so I cannot extract all of them to Cloud Storage. Is there any way I can list all 108 tables with the bq ls command?
By default, bq ls lists only 100 tables. You can change this with the command-line flag --max_results or its short form -n.
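For example, to list up to 1000 tables (reusing the $project and $dataset variables from your script; 1000 is just an arbitrary value above your table count):

# raise the listing limit above the default of 100
tables=$(bq ls --max_results 1000 "$project:$dataset" | awk '{print $1}' | tail -n +3)
# equivalent short form
tables=$(bq ls -n 1000 "$project:$dataset" | awk '{print $1}' | tail -n +3)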
You can also set default values for bq flags in $HOME/.bigqueryrc.
Adding flags to .bigqueryrc
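As a rough sketch (assuming the standard .bigqueryrc layout, where flags under a [command] section apply only to that command), adding this to $HOME/.bigqueryrc would make the higher limit the default for bq ls:

[ls]
--max_results=1000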