If I already have a schema file, for example schema.json, how can I load it to create a table or job schema using the google-cloud-python API?
With batch loading, you load the source data into a BigQuery table in a single batch operation. For example, the data source could be a CSV file, an external database, or a set of log files; a load-job sketch follows the table-creation code below.
You can try this solution:
import json
from google.cloud import bigquery

# Read the schema file and turn each column definition
# into a bigquery.SchemaField object.
bigquerySchema = []
with open('schema.json') as f:
    bigqueryColumns = json.load(f)
    for col in bigqueryColumns:
        bigquerySchema.append(bigquery.SchemaField(col['name'], col['type']))

# Create the table using the schema built above.
bigqueryClient = bigquery.Client()
tableRef = "myproject.mydataset.mytable"
table = bigquery.Table(tableRef, schema=bigquerySchema)
table = bigqueryClient.create_table(table)
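The loop assumes schema.json holds a JSON array of column definitions with at least "name" and "type" keys, which is the format emitted by bq show --schema, for example:

[
    {"name": "id", "type": "INTEGER", "mode": "REQUIRED"},
    {"name": "first_name", "type": "STRING", "mode": "NULLABLE"}
]

Any "mode" key is ignored by the snippet above, so every field falls back to SchemaField's default mode, NULLABLE.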
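Once the table exists, the same schema list can be reused for the batch load mentioned above. Continuing from the snippet (it reuses bigquery, bigqueryClient, bigquerySchema, and tableRef), here is a minimal sketch that assumes a CSV file at a hypothetical Cloud Storage URI; adjust the URI, source format, and options to match your data.

# Minimal batch-load sketch; the GCS URI is a placeholder.
job_config = bigquery.LoadJobConfig()
job_config.schema = bigquerySchema              # reuse the schema parsed from schema.json
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1                # skip the CSV header row

load_job = bigqueryClient.load_table_from_uri(
    "gs://mybucket/mydata.csv",                 # hypothetical source file
    tableRef,
    job_config=job_config,
)
load_job.result()                               # block until the load job completes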