 

Test big endian [duplicate]

Tags: c, gcc, endianness

Possible Duplicate:
Little vs Big Endianess: How to interpret the test

Is there an easy way to test code for big-endian behavior with gcc or an online compiler like ideone? I don't want to use qemu or a virtual machine.

EDIT

Can someone explain the behavior of this piece of code on a big-endian system?

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    int32_t i;
    unsigned char u[4] = {'a', 'b', 'c', 'd'};

    memcpy(&i, u, sizeof(u));   /* reinterpret the four bytes as an int32_t */
    printf("%d\n", i);
    memcpy(u, &i, sizeof(i));   /* copy the same bytes straight back */
    for (i = 0; i < 4; i++) {
        printf("%c", u[i]);
    }
    printf("\n");
    return 0;
}
asked Feb 02 '13 by David Ranieri



1 Answer

As a program?

#include <stdio.h>
#include <stdint.h>

int main(void) {
    union {
        uint32_t word;
        uint8_t bytes[4];
    } test_struct;

    test_struct.word = 0x1;
    /* the 1 lands in the lowest-addressed byte only on little-endian */
    if (test_struct.bytes[0] != 0)
        printf("little-endian\n");
    else
        printf("big-endian\n");
    return 0;
}

On a little-endian architecture, the least significant byte is stored first; on a big-endian architecture, the most significant byte is stored first. So by overlaying a uint32_t with a uint8_t[4], I can check which byte comes first. See: http://en.wikipedia.org/wiki/Big_endian
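An equivalent check (a minimal sketch, not part of the original answer) reads the first byte through an unsigned char pointer, which C always permits:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t x = 1;
    unsigned char *p = (unsigned char *)&x;   /* byte-wise view of x */

    /* little-endian stores the 1 in the first byte; big-endian stores it last */
    printf("%s-endian\n", (*p == 1) ? "little" : "big");
    return 0;
}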

GCC in particular defines the __BYTE_ORDER__ macro as an extension. You can test against __ORDER_BIG_ENDIAN__, __ORDER_LITTLE_ENDIAN__, and __ORDER_PDP_ENDIAN__ (which I didn't know existed!) -- see http://gcc.gnu.org/onlinedocs/cpp/Common-Predefined-Macros.html
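A minimal compile-time sketch using those predefined macros (assuming a GCC-compatible compiler; other compilers may not define them):

#include <stdio.h>

int main(void)
{
#if defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
    printf("big-endian\n");
#elif defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
    printf("little-endian\n");
#else
    printf("unknown or PDP-endian\n");
#endif
    return 0;
}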



As for running code under an endianness that doesn't match your machine's native one: you'll have to compile and run it on an architecture with that endianness, which means cross-compiling and then running the binary in an emulator or virtual machine.
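For instance, on a Debian/Ubuntu-style system you could cross-compile for 32-bit PowerPC (a big-endian target) and run the result under QEMU's user-mode emulation; the package names below are assumptions and vary by distribution:

sudo apt-get install gcc-powerpc-linux-gnu qemu-user
powerpc-linux-gnu-gcc -static -o endian_test endian_test.c   # static: no target libraries needed at run time
qemu-ppc ./endian_test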


edit: ah, I didn't see the first printf().

The first printf will print "1633837924", since a big-endian machine interprets the 'a' character (0x61) as the most significant byte of the int: the four bytes 'a' 'b' 'c' 'd' read back as 0x61626364 = 1633837924. (A little-endian machine would read them as 0x64636261 = 1684234849.)

The second printf will just print "abcd", since the bytes of u were copied into i and back again unchanged, so they come out in their original order regardless of endianness.

answered Sep 28 '22 by sheu