We got an error during a tabledata.list call with the following message:
API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.
This error is not listed at https://cloud.google.com/bigquery/troubleshooting-errors#errortable.
This error occurs every time.
We can export this table to GCS normally, and the result looks fine (there are no extremely large rows).
We manage to retrieve several result pages before the error occurs.
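Roughly, our listing loop looks like the following sketch (simplified; the Bigquery client setup is omitted, and the project, dataset, and table ids are placeholders):

import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.TableDataList;
import com.google.api.services.bigquery.model.TableRow;

// "bigquery" is an already-initialized Bigquery client.
String pageToken = null;
do {
    TableDataList page = bigquery.tabledata()
        .list("my-project", "my_dataset", "my_table")
        .setMaxResults(1000L)
        .setPageToken(pageToken)
        .execute();   // throws the 403 below after a few successful pages
    if (page.getRows() != null) {
        for (TableRow row : page.getRows()) {
            // process the row (placeholder)
        }
    }
    pageToken = page.getPageToken();
} while (pageToken != null);

The call fails with: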
com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"reason" : "apiLimitExceeded"
} ],
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table."
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321) ~[com.google.api-client.google-api-client-1.21.0.jar:1.21.0]
What does this error mean, and how can we resolve it?
Sorry about the inconvenience.
This is a known issue with the tabledata.list method. The problem is that, due to some infrastructure limitations, it is currently not possible to return very large rows from tabledata.list.
"Large" is a relative term here. Unfortunately, some rows are small when represented in JSON but consume a lot of memory when represented in our internal format.
The current workaround is the one mentioned in the error message: export the table.
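For example, a minimal sketch of that workaround with the same Java API client (assuming an initialized Bigquery client named "bigquery"; the bucket, project, dataset, and table names are placeholders):

import java.util.Collections;
import com.google.api.services.bigquery.model.Job;
import com.google.api.services.bigquery.model.JobConfiguration;
import com.google.api.services.bigquery.model.JobConfigurationExtract;
import com.google.api.services.bigquery.model.TableReference;

// Submit an extract job that writes the table to GCS as newline-delimited JSON.
Job extractJob = new Job().setConfiguration(new JobConfiguration()
    .setExtract(new JobConfigurationExtract()
        .setSourceTable(new TableReference()
            .setProjectId("my-project")
            .setDatasetId("my_dataset")
            .setTableId("my_table"))
        .setDestinationUris(Collections.singletonList("gs://my-bucket/my_table-*.json"))
        .setDestinationFormat("NEWLINE_DELIMITED_JSON")));
Job submitted = bigquery.jobs().insert("my-project", extractJob).execute();
// Poll the job via bigquery.jobs().get(...) until its status is DONE,
// then read the exported files from GCS instead of calling tabledata.list.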
For the long term, we are actively working on improving our system to overcome this limitation. Stay tuned :)
There are two row-related limits for tabledata.list:
The byte size of the proto message has to be less than 10 MB. If one row is larger than this size, we cannot retrieve it.
The number of field values in the row has to be less than 350,000; the field-value count is the number of leaf fields of a row.
If you hit this problem, it usually just means that the first row in your request is too large to return; if you skip that row, retrieval of the following rows may work. You may want to take a closer look at that particular row to see why.
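For example, a rough sketch of resuming past the offending row using startIndex (assuming the same Java client, where rowsReadSoFar is the number of rows you had already retrieved before the 403; table ids are placeholders):

import java.math.BigInteger;

// Skip one extra row (the one that cannot be returned) and continue from there.
BigInteger resumeAt = rowsReadSoFar.add(BigInteger.ONE);
TableDataList nextPage = bigquery.tabledata()
    .list("my-project", "my_dataset", "my_table")
    .setStartIndex(resumeAt)
    .setMaxResults(1000L)
    .execute();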
In the future, it is likely that the field-value limit can be removed, but we will still have the 10 MB size limit due to API server limitations.