I am storing cryptocurrency transaction balances in a NUMERIC column. Since cryptocurrency values vary over far more extreme ranges than traditional currencies, I am using the NUMERIC(60,20) type to cover all use cases. As this feels a little extreme, I am curious:
What are the performance (CPU) penalties of increasing a NUMERIC column's precision and scale?
What are the storage cost penalties of increasing a NUMERIC column's precision and scale?
The declared precision and scale of a NUMERIC column serve as a constraint on your inserted values, but they do not directly affect storage requirements; 1::NUMERIC(1,0), 1::NUMERIC(99,98) and 1::NUMERIC all have identical underlying representations. With this in mind, your best option might be to use an unconstrained NUMERIC column, and cast your values to a suitable precision/scale on a currency-by-currency basis.
NUMERIC values are stored as variable-length arrays of base-10,000 digits, each represented by a 16-bit integer, so the storage cost is 2 bytes per 4 decimal digits, plus a 6-byte header for each value. The integer and fractional portions are stored separately, so 11 consumes 8 bytes, while 1.1 requires 10. You can check the storage requirements for a given value using e.g. SELECT pg_column_size(1.1::NUMERIC).
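You can check several values at once to see the formula in action; the comments below show the sizes I'd expect from the 2-bytes-per-4-digits rule, but verify on your own server:

```sql
SELECT pg_column_size(11::numeric)       AS two_digit_int,  -- one base-10,000 digit:  6 + 2 = 8 bytes
       pg_column_size(1.1::numeric)      AS mixed,          -- one digit per portion:  6 + 4 = 10 bytes
       pg_column_size(12345678::numeric) AS eight_digit_int;-- two base-10,000 digits: 6 + 4 = 10 bytes
```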
As for the CPU overhead, I would expect that the cost of most operations scales linearly with the number of digits. However, this is generally dwarfed by the I/O cost of fetching the value in the first place, so it's likely a non-issue. You'd have to try your queries on your own hardware to know for sure.
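If you do want to measure the CPU cost on your own hardware, one rough approach (table names are made up for this sketch) is to aggregate over large numbers of short and long NUMERIC values and compare timings with psql's \timing:

```sql
-- Populate two tables with a million short and long NUMERIC values.
CREATE TEMP TABLE short_vals AS
    SELECT 1.1::numeric AS v FROM generate_series(1, 1000000);
CREATE TEMP TABLE long_vals AS
    SELECT (10::numeric ^ 40 + 1.23456789) AS v FROM generate_series(1, 1000000);

-- psql meta-command: report execution time for each query.
\timing on
SELECT sum(v) FROM short_vals;
SELECT sum(v) FROM long_vals;
```

Any difference between the two timings approximates the per-digit arithmetic overhead, since the row counts and I/O patterns are otherwise similar.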