Is there a better method for dealing with large integers than casting them as strings when querying data from BigQuery through R via the API?
Here's an MVE showing the problem, with the integer coming back as NA:
> library(bigrquery)
>
> bq_str <- "
+ SELECT
+ 206932402797274623 AS big_pk
+ ,SAFE_CAST(206932402797274623 AS string) AS string_pk
+ "
>
> my_df <- bigrquery::query_exec(query = bq_str,
+ project = 'XXXXXXXXXXX',
+ use_legacy_sql = FALSE,
+ bigint = "integer64")
0 bytes processed
Warning message:
In converter[[type]](data_m[i, ]) :
NAs introduced by coercion to integer range
> head(my_df)
big_pk string_pk
1 NA 206932402797274623
Here's the code:
library(bigrquery)
bq_str <- "
SELECT
206932402797274623 AS big_pk
,SAFE_CAST(206932402797274623 AS string) AS string_pk
"
my_df <- bigrquery::query_exec(query = bq_str,
project = 'XXXXXX',
use_legacy_sql = FALSE,
bigint = "integer64")
head(my_df)
I am using version 1.1.1 of bigrquery.
BigQuery has a data type called NUMERIC, similar to DECIMAL in other databases, which stores values with 38 digits of precision and 9 decimal digits of scale. It is suitable for exact calculations, and BigQuery supports casting to NUMERIC.
Regarding query_exec: it has been deprecated, so try using bq_query instead.
If you only want to avoid casting to string, I recommend casting to NUMERIC instead.
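As a sketch, the query from the question could cast the large integer to NUMERIC rather than STRING (note that when pulled into R, NUMERIC columns are converted to doubles, which only hold about 15-16 significant digits, so an 18-digit primary key may still lose precision):

```r
library(bigrquery)

# Same query as in the question, but casting the large integer
# to NUMERIC instead of STRING ('XXXXXXXXXXX' is a placeholder).
bq_str <- "
SELECT
  206932402797274623 AS big_pk
  ,CAST(206932402797274623 AS NUMERIC) AS numeric_pk
"
```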
Otherwise, you can also use bq_table_download; keep in mind to map bigint to "integer64".
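A minimal sketch of that approach with the current API, replacing the deprecated query_exec ('my-project' is a placeholder project ID; running it requires BigQuery credentials):

```r
library(bigrquery)

# Run the query with the current API; bq_project_query() returns
# a bq_table reference rather than a data frame.
tb <- bq_project_query(
  "my-project",
  "SELECT 206932402797274623 AS big_pk"
)

# Download the result, mapping BigQuery INT64 columns to
# bit64::integer64 so large primary keys survive intact
# instead of being coerced to NA.
my_df <- bq_table_download(tb, bigint = "integer64")
head(my_df)
```

With bigint = "integer64", big_pk comes back as a bit64::integer64 column, which holds the full 64-bit value exactly.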