
BigQuery maximum row size

Recently we've started to get errors about "Row larger than the maximum allowed size".

Although the documentation states the limit is 2 MB for JSON, we have also successfully loaded 4 MB (and larger) records (see job job_Xr8vR3Fyp6rlH4zYaZFbZSyQsyI for an example of a 4.6 MB record).

Has there been any change in the maximum allowed row size? The failing job is job_qt_sCwokO2PWKNZsGNx6mK3cCWs. Unfortunately, the error message produced doesn't specify which record(s) are the problematic ones.

Lior asked Dec 10 '25

1 Answer

There hasn't been a change in the maximum row size (I double checked and went back through change lists and didn't see anything that could affect this). The maximum is computed from the encoded row, rather than the raw row, which is why you sometimes can get larger rows than the specified maximum into the system.

From looking at your failed job in the logs, it looks like the error was on line 1. Did that information not get returned in the job errors? Or is that line not the offending one?

It did look like there was a repeated field with a lot of entries that looked like "Person..durable".

Please let me know if you think that you received this in error or what we can do to make the error messages better.
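Since the error message doesn't identify the offending record, a local pre-screen of the newline-delimited JSON file can help narrow it down. Below is a minimal sketch that flags rows whose raw JSON size approaches the documented 2 MB limit. Note the caveat from the answer above: BigQuery checks the *encoded* row size, not the raw JSON size, so this is only a heuristic, and the function name and threshold here are illustrative, not part of any BigQuery API.

```python
# Heuristic pre-screen for BigQuery NDJSON loads: flag rows whose raw JSON
# size is at or above the documented 2 MB limit. The server-side check uses
# the encoded row size, so rows slightly under this threshold may still fail
# and slightly larger raw rows may still load.
import io

LIMIT_BYTES = 2 * 1024 * 1024  # documented 2 MB JSON row limit


def oversized_rows(path, limit=LIMIT_BYTES):
    """Yield (line_number, size_in_bytes) for NDJSON rows at or above limit."""
    with io.open(path, "rb") as f:
        for lineno, line in enumerate(f, start=1):
            size = len(line.rstrip(b"\r\n"))  # exclude the line terminator
            if size >= limit:
                yield lineno, size
```

Running this over the file before the load job would have pointed at line 1 directly, matching what the logs showed for the failed job.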

Jordan Tigani answered Dec 13 '25

