Google says BigQuery can handle billions of rows.
For my application I estimate a usage of 200,000,000 * 1000 rows, i.e. about 200 billion rows, well over a few billion.
I can partition the data into 200,000,000 rows per partition, but the only support for this in BigQuery seems to be separate tables (please correct me if I am wrong).
The total data size will be around 2TB.
In the examples I saw some large data sizes, but the row counts were all under a billion.
Can BigQuery support the number of rows I am dealing with in a single table?
If not, can I partition it in any way besides multiple tables?
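For context, by "different tables" I mean sharding the data and unioning the shards at query time; a rough legacy SQL sketch of what I have in mind, with hypothetical shard names:

-- In legacy SQL a comma between tables means UNION ALL,
-- so this scans two hypothetical shards together:
SELECT COUNT(*)
FROM [mydataset.events_000], [mydataset.events_001]

-- TABLE_QUERY picks up every shard matching a name pattern:
SELECT COUNT(*)
FROM TABLE_QUERY(mydataset, 'table_id CONTAINS "events_"')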
The query below should answer your question.
I ran it against one of our datasets.
As you can see, the table sizes are close to 10 TB, with around 1.3-1.6 billion rows each:
SELECT
  ROUND(size_bytes/1024/1024/1024/1024) AS TB,
  row_count AS ROWS
FROM [mydataset.__TABLES__]
ORDER BY row_count DESC
LIMIT 10
I think the largest table we have dealt with so far was around 5-6 billion rows, and everything worked as expected.
Row  TB    ROWS
1    10.0  1582903965
2    11.0  1552433513
3    10.0  1526783717
4     9.0  1415777124
5    10.0  1412000551
6    10.0  1410253780
7    11.0  1398147645
8    11.0  1382021285
9    11.0  1378284566
10   11.0  1369109770
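For reference, the same check also works in standard SQL if you backtick the meta-table; a minimal sketch (mydataset is a placeholder, and the alias avoids the reserved word ROWS):

SELECT
  ROUND(size_bytes / POW(1024, 4), 1) AS tb,  -- bytes to TB
  row_count AS num_rows
FROM `mydataset.__TABLES__`
ORDER BY row_count DESC
LIMIT 10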