I've recently started getting the following BigQuery error when using the Python API:
google.api_core.exceptions.BadRequest: 400 configuration.query.destinationTable cannot be set for scripts
This is the function I use:
from google.cloud import bigquery

def execute_bigquery_sql(query, dataset_id, table_id, use_legacy_sql=True, write_disposition='WRITE_TRUNCATE'):
    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig()
    job_config.use_legacy_sql = use_legacy_sql
    print("table_id: {table_id}".format(table_id=table_id))
    print("dataset_id: {dataset_id}".format(dataset_id=dataset_id))
    if table_id:
        table_ref = client.dataset(dataset_id).table(table_id)
        print("table_ref: {table_ref}".format(table_ref=table_ref))
        job_config.destination = table_ref
        job_config.write_disposition = write_disposition
        job_config.allow_large_results = True
        job_config.create_disposition = "CREATE_IF_NEEDED"  # note: snake_case; "createDisposition" would be silently ignored
    query_job = client.query(query, job_config=job_config)
    results = query_job.result()  # Waits for job to complete.
    return results
Does anyone know what might be happening, and is there a workaround?
Thanks for the responses; the comments were indeed pointing in the right direction. In BigQuery, "scripting" refers to https://cloud.google.com/bigquery/docs/reference/standard-sql/scripting, and that is what is not allowed with a destination table. My query had:
DECLARE capital int64 default 10000000;
So, removing the line above was the fix in my case.
Interestingly, the same error appears even in the web interface when you set a destination table for a script.

The error is self-descriptive: scripts do not allow a destination table to be set. Instead, you should use DML/DDL statements inside the script.
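Building on the DML/DDL suggestion, here is a sketch of what the rewritten query could look like: the DECLARE stays, and the script itself materializes the result with CREATE OR REPLACE TABLE instead of relying on configuration.query.destinationTable. The dataset, table, and column names below are invented for illustration; you would pass this string to client.query() with a plain QueryJobConfig.

```python
# Sketch: let the script write its own output table via DDL instead of a
# destination table. All dataset/table/column names here are made up.
script = """
DECLARE capital INT64 DEFAULT 10000000;

CREATE OR REPLACE TABLE my_dataset.my_table AS
SELECT name, valuation
FROM my_dataset.companies
WHERE valuation > capital;
"""

print(script)
```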
A workaround is to reset job_config with no destination table in it.
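A minimal sketch of that workaround, assuming you want to keep one function that handles both scripts and plain queries: detect a script heuristically and only set the destination fields for plain queries. The keyword check below is my own assumption (BigQuery offers no official "is this a script?" client-side API), so treat it as a heuristic.

```python
import re

# Hypothetical helper, not part of the BigQuery API: scripts usually begin
# with a scripting statement such as DECLARE, BEGIN, IF, LOOP, or WHILE.
# This can false-positive (e.g. a query starting with IF), so it is only
# a heuristic.
_SCRIPT_KEYWORDS = re.compile(r"^\s*(DECLARE|BEGIN|IF|LOOP|WHILE)\b", re.IGNORECASE)

def looks_like_script(sql: str) -> bool:
    """Guess whether `sql` is a BigQuery script rather than a single query."""
    return bool(_SCRIPT_KEYWORDS.match(sql))

def build_job_config_fields(sql: str, table_ref=None) -> dict:
    """Return the QueryJobConfig attributes to set.

    For scripts we return an empty dict (no destination), because
    configuration.query.destinationTable cannot be set for scripts.
    """
    fields = {}
    if table_ref and not looks_like_script(sql):
        fields["destination"] = table_ref
        fields["write_disposition"] = "WRITE_TRUNCATE"
    return fields

print(build_job_config_fields("SELECT 1 AS x", table_ref="ds.tbl"))
print(build_job_config_fields("DECLARE capital INT64 DEFAULT 10000000; SELECT 1", table_ref="ds.tbl"))
```

You would then apply the returned fields with setattr onto a fresh bigquery.QueryJobConfig() before calling client.query().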