I'm trying to copy a BigQuery table (Table1) stored within a Google Cloud Project (Project1) to another Google Cloud Project (Project2). The table is on the order of TBs. What's the best way to do this so that I don't have to export the table locally? Should I export the table from Project1 to Google Cloud Storage, and then to Project2? Or is there a better way?
Use the bq command-line tool to copy a table from one project to another. Here is a sample command:
Source: 123456789123:dataset1.table1
Destination: 0987654321098:dataset2.table2
Command:
bq cp 123456789123:dataset1.table1 0987654321098:dataset2.table2
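For readability, here is the same copy sketched with placeholder names (project1, project2 and the dataset/table names are hypothetical). The -f flag overwrites an existing destination table without prompting:

bq cp -f project1:dataset1.table1 project2:dataset2.table2

Note that bq cp requires the source and destination datasets to be in the same location, and the copy runs entirely server-side, so nothing is exported locally.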
If the source and destination datasets are in the same location, you can also just use the Copy Table option in the BigQuery console, even between different projects.
If you want to copy an entire dataset from one project to another, you can use the command below to create a transfer job:
bq mk --transfer_config --project_id=[PROJECT_ID] --data_source=[DATA_SOURCE] --target_dataset=[DATASET] --display_name=[NAME] --params='[PARAMETERS]'
where
PROJECT_ID: the destination project ID
DATA_SOURCE: cross_region_copy
DATASET: the target dataset
NAME: the display name of your transfer job
PARAMETERS: the source project ID, source dataset ID, and other parameters (overwrite destination table, etc.), as shown in the example after this list
You can go through the BigQuery documentation on copying datasets for a detailed explanation.