Is it more efficient to use hexadecimal instead of decimal?

Hello. I am using Visual Studio 2015 with .NET Framework 4.5, if it matters, and ReSharper keeps suggesting that I switch from decimal numbers to hex. Why is that? Is there any performance benefit to using hex?

asked Dec 11 '22 by KOPEUE
1 Answer

There is absolutely no performance difference between the formats of numeric literals in a source language, because the conversion to binary is done by the compiler; both spellings produce identical compiled code. The only reason to switch from one representation to another is the readability of your code.
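You can convince yourself of this directly. Here is a minimal C# sketch (the class and constant names are made up for illustration) showing that the two spellings denote the very same constant:

    using System;

    class LiteralDemo
    {
        // Both constants denote the same Int32 value; the compiler
        // converts each literal to the same binary representation,
        // so the emitted code is identical either way.
        const int FromHex = 0xFF00FF;
        const int FromDecimal = 16711935;

        static void Main()
        {
            Console.WriteLine(FromHex == FromDecimal); // True
        }
    }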

Two common cases for using hexadecimal literals are representing colors and bit masks. Since a color representation is usually split at byte boundaries, reading the number 0xFF00FF is much easier than reading 16711935: the hex format tells you at a glance that the red and blue components are maxed out while the green component is zero. The decimal format, on the other hand, requires you to perform the conversion yourself.
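For instance, pulling the channels out of such a literal is just a matter of shifting and masking by whole bytes. A minimal sketch, assuming a 0xRRGGBB packing (the names Magenta, red, green, and blue are illustrative):

    using System;

    class ColorDemo
    {
        const int Magenta = 0xFF00FF; // red = FF, green = 00, blue = FF

        static void Main()
        {
            // Each channel occupies one byte, so shifting and masking
            // by whole bytes recovers the individual components.
            int red   = (Magenta >> 16) & 0xFF; // 255
            int green = (Magenta >> 8)  & 0xFF; // 0
            int blue  =  Magenta        & 0xFF; // 255

            Console.WriteLine($"R={red} G={green} B={blue}");
        }
    }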

Bit masks are similar: when you use hex or octal representation, it is very easy to see which bits are one and which bits are zero. All you need to learn is a short table of sixteen bit patterns corresponding to the hex digits 0 through F. You can immediately tell that 0xFF00 has the upper eight bits set to 1 and the lower eight bits set to 0. Doing the same with 65280 is much harder for most programmers.
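Here is a minimal sketch of using 0xFF00 as a mask (the names HighByteMask and value are made up for illustration):

    using System;

    class MaskDemo
    {
        // 0xFF00: the upper eight bits are set, the lower eight are clear.
        // Each hex digit maps to exactly four bits, so the pattern is
        // visible in the literal itself; 65280 hides it.
        const int HighByteMask = 0xFF00;

        static void Main()
        {
            int value = 0x12AB;

            // Test whether any of the masked bits are set, then
            // extract the high byte by masking and shifting.
            bool anySet = (value & HighByteMask) != 0;
            int highByte = (value & HighByteMask) >> 8; // 0x12

            Console.WriteLine($"{anySet}, 0x{highByte:X2}");
        }
    }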

answered Jan 04 '23 by Sergey Kalinichenko