Let's assume I have an array like
char foo[] = { 0, 1, 1, 0 };
In gdb, on an x86 machine, if I say
p (short[2])*foo
I get
{256, 1}
that is, each pair of bytes is interpreted as a short
in little-endian order.
Is there a convenient way (e.g. a macro) to make gdb
display a byte array as big-endian shorts (or whatever type) instead?
Use set endian big. Use set endian auto
to switch back to automatic endianness selection.
(gdb) p (short[2])*foo
$1 = {256, 1}
(gdb) set endian big
The target is assumed to be big endian
(gdb) p (short[2])*foo
$2 = {1, 256}
(gdb) set endian auto
The target endianness is set automatically (currently little endian)
(gdb) p (short[2])*foo
$3 = {256, 1}
(gdb)
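Since the question asks for a macro: gdb's define can wrap this switching into a user-defined command, so you don't have to toggle the endianness by hand each time. A minimal sketch (the command name pbig is arbitrary; $arg0 is gdb's placeholder for the first argument):

```
(gdb) define pbig
Type commands for definition of "pbig".
End with a line saying just "end".
>set endian big
>p $arg0
>set endian auto
>end
(gdb) pbig (short[2])*foo
```

Note that the last step restores automatic endianness selection; if you had manually set the target endianness beforehand, adjust the final set endian accordingly. User-defined command arguments are whitespace-separated, so this works as long as the expression contains no spaces.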