I understand that Perl 6 implements decimals as rationals wherever possible in order to avoid floating point issues that are present in most other languages.
Has anybody done benchmarking or have an understanding of the performance penalty of doing this?
I think the most useful overall answer is "no, not really, but let me elaborate a bit".
If you'd like P6 to just automatically keep decimal calculations reasonably accurate and fast, then just use Rats.
If the denominator of the overall result of computing an expression involving one or more P6 Rats overflows 64 bits, then P6 will automatically switch the result to an approximate Num (P6's implementation of a double float) to maintain reasonable performance.
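A quick sketch of both behaviors (the exact point at which the denominator overflows may vary between Rakudo versions):

```raku
# Decimal literals are Rats by default, so decimal math is exact:
say 0.1 + 0.2 == 0.3;    # True (unlike in float-based languages)
say (0.1 + 0.2).^name;   # Rat
say (0.1 + 0.2).nude;    # (3 10) — exact numerator and denominator

# Repeated multiplication grows the denominator; once it no longer
# fits in 64 bits, the result degrades to an approximate Num:
my $x = 0.1;
$x *= $x for ^5;         # denominator reaches 10**32, past 64 bits
say $x.^name;            # Num on current Rakudo
```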
You can directly use a floating point data type for decimals.
This yields floating point performance which is generally faster than rational performance.
For example, 3.141592e0 is not a rational/decimal literal but instead a Num/float literal.
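You can confirm how literals are typed by asking each value for its type name:

```raku
say 3.141592.^name;      # Rat — no exponent, parsed as a rational
say 3.141592e0.^name;    # Num — the e0 exponent makes it a float
say pi.^name;            # Num — built-in float constant
```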
If you use one of the built-in rational data types (the limited-precision Rat class, the arbitrary-precision FatRat class, or the underlying parameterizable-precision Rational role), then numeric operations on that data will be 100% accurate if the calculation fits in the denominator.
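A small sketch of the Rat/FatRat distinction; the particular denominator chosen here is just for illustration:

```raku
# Rat arithmetic falls back to Num when its denominator overflows
# 64 bits, but FatRat keeps arbitrary precision at a performance cost:
my $f = FatRat.new(1, 10**40);   # FatRat.new(numerator, denominator)
say $f.^name;                    # FatRat
say ($f * $f).^name;             # FatRat — stays exact, never degrades
say ($f * $f).denominator == 10**80;   # True — held exactly
```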
Read the wikipedia page on rational data types for a starting point on understanding the performance implications of using a rational type.
Has anybody done benchmarking or have an understanding of the performance penalty of doing this?
Larry Wall made the design decisions that led to decimals being, by default, processed as rationals. He has discussed the performance implications on #perl6. Some useful searches:
'decimal'
'rational'
'float'
'denominator'
If I find links to useful benchmarks I'll add them later.
One specific issue I ran into with rationals in the current version of Perl 6 (Rakudo Star 2017.07) is what appears to be a performance problem when converting them to strings.
I noticed this while debugging a script that takes a CSV file and outputs a graph in SVG format, which of course contains lots of coordinates. It was slow, and when I got around to profiling it I saw that the script spent an inordinate amount of time in src/core/Rational.Str, and that the time spent was not proportional to the number of values output (nearly 6000 ms for nearly 11000 Rats). Simply converting them to floating point numbers with the Num method immediately before outputting them resulted in a significant speedup (though this also gave a different, but similar, result).
I don't know whether this is a general problem or one specific to my script. It is obviously a fixable implementation shortcoming, but until it is fixed it is something to be aware of if you output a lot of numbers.
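A hedged sketch of the workaround described above; the coordinate data here is made up for illustration:

```raku
# Hypothetical coordinate data: Int / Int division yields Rats
my @xs = (^10).map({ $_ / 7 });

# Stringifying a Rat goes through Rational.Str (the slow path observed):
my $exact = @xs.map(*.Str).join(',');

# Workaround: convert to Num first, then stringify — faster in the
# profiled script, but the output is a float approximation rather
# than an exact decimal rendering:
my $approx = @xs.map(*.Num.Str).join(',');
say $approx;
```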