
BigQuery API limit exceeded error

We got the following error during a tabledata.list call:

API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.

It is not listed at https://cloud.google.com/bigquery/troubleshooting-errors#errortable.

This error occurs every time.

We can export this table to GCS normally, and the result looks normal (there are no extremely large rows).

We managed to retrieve several result pages before the error occurs.

com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
    "reason" : "apiLimitExceeded"
  } ],
  "message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table."
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]

What does it mean? How can we resolve this error?

Pavel Ajtkulov asked May 31 '16



2 Answers

Sorry about the inconvenience.

This is a known issue with the tabledata.list method. The problem is that, due to some infrastructure limitations, it is currently not possible to return very large rows from tabledata.list.

"Large" is relative here. Unfortunately, a row can be small when represented as JSON yet consume a lot of memory when represented in our internal format.

The current workaround is the one mentioned in the error message: export the table.
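As a sketch of that workaround, assuming the google-cloud-bigquery Java client library (the project, dataset, table, and bucket names below are placeholders, not values from the question), an export to GCS might look like this:

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.TableId;

public class ExportTable {
    public static void main(String[] args) throws InterruptedException {
        // Requires application-default credentials with BigQuery and GCS access.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Placeholder identifiers -- substitute your own dataset, table, and bucket.
        Table table = bigquery.getTable(TableId.of("my_dataset", "my_table"));

        // Extract to GCS as newline-delimited JSON; the wildcard lets
        // BigQuery shard the output into multiple files for large tables.
        Job job = table.extract("NEWLINE_DELIMITED_JSON", "gs://my-bucket/export-*.json");

        // Block until the extract job completes, then check for errors.
        job = job.waitFor();
        if (job.getStatus().getError() != null) {
            throw new RuntimeException("Export failed: " + job.getStatus().getError());
        }
    }
}
```

Unlike tabledata.list, the export path is not subject to the per-row API response limits, which is why the error message recommends it.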

For the long-term, we are actively working on improving our system to overcome this limitation. Stay tuned :)

Cheng Miezianko answered Oct 03 '22


There are two row-related limits for tabledata.list:

  1. The byte size of the proto message has to be less than 10M. If one row is larger than this size, we cannot retrieve it.

  2. The number of fields in the row has to be less than 350,000, where this counts the leaf fields of the row.

If you hit this problem, it usually means only that the first row in your request is too large to return; if you skip that row, retrieval of the following rows might work. You may want to look more closely at that particular row to see why.
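To illustrate the second limit, here is a hedged sketch of how the leaf-field count described above could be computed for a row that has been decoded into nested Maps and Lists (e.g. from its JSON representation). The 350,000 threshold comes from the answer above; the class and method names are hypothetical:

```java
import java.util.List;
import java.util.Map;

public class LeafFieldCounter {
    // Per the answer above, tabledata.list caps a row at 350,000 leaf fields.
    static final int MAX_LEAF_FIELDS = 350_000;

    // Recursively count scalar leaf values in a row represented as
    // nested Maps (records) and Lists (repeated fields).
    static int countLeaves(Object value) {
        if (value instanceof Map) {
            int n = 0;
            for (Object v : ((Map<?, ?>) value).values()) {
                n += countLeaves(v);
            }
            return n;
        }
        if (value instanceof List) {
            int n = 0;
            for (Object v : (List<?>) value) {
                n += countLeaves(v);
            }
            return n;
        }
        return 1; // any scalar counts as one leaf field
    }

    static boolean exceedsLeafLimit(Map<String, Object> row) {
        return countLeaves(row) > MAX_LEAF_FIELDS;
    }
}
```

A row failing this check would be a candidate for the "too large to return" case, and skipping past it (or exporting the table) would be the way around the error.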

In the future, it is likely that the field-count limit can be removed, but we will still have the 10M size limit due to API server limitations.

yiru answered Oct 03 '22