
C# Decimal datatype performance

I'm writing a financial application in C# where performance (i.e. speed) is critical. Because it's a financial app I have to use the Decimal datatype intensively.

I've optimized the code as much as I could with the help of a profiler. Before using Decimal, everything was done with the Double datatype and the speed was several times faster. However, Double is not an option because of its binary nature, causing a lot of precision errors over the course of multiple operations.

Is there any decimal library that I can interface with C# that could give me a performance improvement over the native Decimal datatype in .NET?

Based on the answers I already got, I noticed I was not clear enough, so here are some additional details:

  • The app has to be as fast as it can possibly go (getting back to the speed it had when using Double instead of Decimal would be a dream). Double was about 15x faster than Decimal, as its operations are hardware based.
  • The hardware is already top-notch (I'm running on a dual Xeon quad-core) and the application uses threads, so CPU utilization on the machine is always 100%. Additionally, the app runs in 64-bit mode, which gives it a measurable performance advantage over 32-bit.
  • I've optimized past the point of sanity (more than a month and a half of optimizing; believe it or not, it now takes approximately 1/5000 of the time the same reference calculations took initially). The optimization covered everything: string processing, I/O, database access and indexes, memory, loops, changing the way some things were done, and even using "switch" over "if" everywhere it made a difference. The profiler now clearly shows that the remaining performance culprit is the Decimal datatype's operators. Nothing else adds up to a considerable amount of time.
  • You have to believe me here: I've gone as far as I could possibly go in the realm of C#/.NET to optimize the application, and I'm really amazed at its current performance. I'm now looking for a good idea to bring Decimal performance closer to Double. I know it's probably only a dream, but I want to make sure I've thought of everything possible. :)

Thanks!

asked Dec 14 '08 by tempw

2 Answers

You can use the long datatype. Sure, you won't be able to store fractions in it, but if you code your app to store pennies instead of pounds, you'll be OK. Accuracy is 100% for integer types, and unless you're working with truly vast numbers, a 64-bit long will be enough.

If you can't mandate storing pennies, then wrap an integer in a class and use that.
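
For illustration, here's a minimal sketch of that wrapper idea, assuming a value type that stores an exact count of pennies in a long; the Money name and the particular operators are mine, not part of the original answer:

    public struct Money
    {
        private readonly long pennies;   // exact count of pennies, no fractions

        private Money(long pennies) { this.pennies = pennies; }

        public static Money FromPennies(long pennies) { return new Money(pennies); }

        // Integer arithmetic only - these are the operations the hot path uses.
        public static Money operator +(Money a, Money b) { return new Money(a.pennies + b.pennies); }
        public static Money operator -(Money a, Money b) { return new Money(a.pennies - b.pennies); }
        public static Money operator *(Money a, long factor) { return new Money(a.pennies * factor); }

        // Convert to decimal only at the edges (display, persistence).
        public decimal ToPounds() { return pennies / 100m; }

        public override string ToString() { return ToPounds().ToString("0.00"); }
    }

The division by 100m only happens when converting back for display or persistence, so the arithmetic on the hot path stays in hardware integer instructions.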

answered Sep 20 '22 by gbjbaanb


You say it needs to be fast, but do you have concrete speed requirements? If not, you may well optimise past the point of sanity :)

As a friend sitting next to me has just suggested, can you upgrade your hardware instead? That's likely to be cheaper than rewriting code.

The most obvious option is to use integers instead of decimals - where one "unit" is something like "a thousandth of a cent" (or whatever you want - you get the idea). Whether that's feasible or not will depend on the operations you're performing on the decimal values to start with. You'll need to be very careful when handling this - it's easy to make mistakes (at least if you're like me).
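
As a rough sketch of that idea, assuming a unit of one thousandth of a cent (so 100,000 units per pound or dollar) and made-up helper names:

    static class ScaledUnits
    {
        // One "unit" = 1/1000 of a cent, so 100,000 units per whole currency unit.
        private const long UnitsPerCurrencyUnit = 100000;

        // Round exactly once, at the boundary, when converting in.
        public static long ToUnits(decimal amount)
        {
            return (long)decimal.Round(amount * UnitsPerCurrencyUnit);
        }

        // Convert back to decimal only for display or storage.
        public static decimal FromUnits(long units)
        {
            return units / (decimal)UnitsPerCurrencyUnit;
        }
    }

With that in place, ScaledUnits.ToUnits(12.34m) + ScaledUnits.ToUnits(0.005m) is an ordinary long addition, and FromUnits turns the result back into 12.345m exactly.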

Did the profiler show particular hotspots in your application that you could optimise individually? For instance, if you need to do a lot of calculations in one small area of code, you could convert from decimal to an integer format, do the calculations and then convert back. That could keep the API in terms of decimals for the bulk of the code, which may well make it easier to maintain. However, if you don't have pronounced hotspots, that may not be feasible.
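
Here's a hedged sketch of that pattern: the public method still takes and returns decimal, but the expensive inner loop runs on longs. The method name, the scale factor and the pairwise calculation are purely illustrative:

    public static decimal TotalOfAllPairs(decimal[] amounts)
    {
        const long Scale = 100000;                 // 1/1000 of a cent per unit

        // Convert once at the boundary...
        long[] units = new long[amounts.Length];
        for (int i = 0; i < amounts.Length; i++)
            units[i] = (long)decimal.Round(amounts[i] * Scale);

        // ...then the O(n^2) hot loop runs on hardware integer arithmetic only.
        long total = 0;
        for (int i = 0; i < units.Length; i++)
            for (int j = 0; j < units.Length; j++)
                total += units[i] + units[j];

        // Convert back to decimal once, at the end.
        return total / (decimal)Scale;
    }

The conversions happen only O(n) times at the boundary, while the O(n^2) additions in the middle are plain integer instructions; for very large inputs you'd also want to watch for long overflow.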

+1 for profiling and telling us that speed is a definite requirement, btw :)

answered Sep 17 '22 by Jon Skeet