
Hex 0x0001 vs 0x00000001

Tags:

decimal

hex

Often in code that does permission checking, I see some folks use hex 0x0001 and others use 0x00000001. These both look equivalent to a decimal 1, if I'm not mistaken.

Why use one over the other? Is it just a matter of preference?

asked Nov 14 '09 by randombits



2 Answers

Assuming that this is C, C++, Java, C# or something similar, they are the same. 0x0001 suggests a 16-bit value while 0x00000001 suggests a 32-bit one, but the actual width of the literal is determined by the compiler from the language's rules when it evaluates the constant, not by the number of digits written. This is a question of coding style, and it makes no difference in the compiled code.

answered Oct 08 '22 by Tamas Czinege

What's going on here is that this is a bitmask, and it is traditional to pad bitmask constants with leading zeros out to the full width of the mask. I would furthermore guess the width of the mask was widened at some point to make room for more specialized permissions.

answered Oct 08 '22 by Joshua