 

Performance of NUMERIC type with high precision and scale in PostgreSQL

Tags:

postgresql

I am storing cryptocurrency transaction balances in a NUMERIC column. Because cryptocurrency values vary across far more extreme ranges than traditional currencies, I am using the NUMERIC(60,20) type to cover all use cases. As this feels a little extreme, I am curious:

  • What are the performance (CPU) penalties of increasing a NUMERIC column's precision and scale?

  • What are the storage cost penalties of increasing a NUMERIC column's precision and scale?

Mikko Ohtamaa asked Dec 05 '22 16:12


1 Answer

The declared scale and precision of a NUMERIC column serve as a constraint on your inserted values, but they do not directly affect storage requirements; 1::NUMERIC(1,0), 1::NUMERIC(99,98) and 1::NUMERIC all have identical underlying representations. With this in mind, your best option might be to use an unconstrained NUMERIC column, and cast your values to a suitable scale/precision on a currency-by-currency basis.
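You can confirm this identical-storage claim, and sketch the per-currency rounding approach, in a psql session. The table and column names in the INSERT are illustrative, not from the question:

```sql
-- All three representations occupy the same space: the declared
-- precision/scale is a constraint, not a storage hint.
SELECT pg_column_size(1::NUMERIC(1,0)),
       pg_column_size(1::NUMERIC(99,98)),
       pg_column_size(1::NUMERIC);

-- With an unconstrained NUMERIC column, constrain per currency at
-- write time instead, e.g. round a BTC amount to 8 fractional digits:
INSERT INTO balances (currency, amount)
VALUES ('BTC', round(0.123456789012::NUMERIC, 8));
```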

NUMERIC values are stored as variable-length arrays of base-10,000 digits, each represented by a 16-bit integer, so the storage cost is 2 bytes per 4 decimal digits, plus a 6-byte header for each value. The fractional and integer portions are stored separately, so 11 consumes 8 bytes, while 1.1 requires 10. You can check the storage requirements for a given value using e.g. SELECT pg_column_size(1.1::NUMERIC).
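A quick way to see this arithmetic in practice (the byte counts below follow from the formula above; exact results can be checked against your own server):

```sql
-- 11  : one base-10000 digit            -> 6 + 2     = 8 bytes
-- 1.1 : one integer + one fractional    -> 6 + 2 + 2 = 10 bytes
-- 12345678: two base-10000 digits (1234, 5678) -> 6 + 4 = 10 bytes
SELECT pg_column_size(11::NUMERIC)       AS two_decimal_digits,
       pg_column_size(1.1::NUMERIC)      AS split_int_and_frac,
       pg_column_size(12345678::NUMERIC) AS eight_decimal_digits;
```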

As for the CPU overhead, I would expect that the cost of most operations scales linearly with the number of digits. However, this is generally dwarfed by the I/O cost of fetching the value in the first place, so it's likely a non-issue. You'd have to try your queries on your own hardware to know for sure.
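A minimal benchmark sketch for comparing the two on your own hardware (table and column names are illustrative; timings depend entirely on your machine and settings):

```sql
-- Synthetic data: the same values stored at a wide and a narrow
-- declared scale. Casting from double precision limits significant
-- digits, so this is only a rough comparison.
CREATE TEMP TABLE bench AS
SELECT (random() * 1e6)::NUMERIC(60,20) AS big_scale,
       (random() * 1e6)::NUMERIC(12,2)  AS small_scale
FROM generate_series(1, 1000000);

\timing on
SELECT sum(big_scale)   FROM bench;  -- more digits per value
SELECT sum(small_scale) FROM bench;  -- fewer digits per value
```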

Nick Barnes answered May 21 '23 22:05