
Why subtract null pointer in offsetof()?

Linux's stddef.h defines offsetof() as:

#define offsetof(TYPE, MEMBER) ((size_t) &((TYPE *)0)->MEMBER)

whereas the Wikipedia article on offsetof() (http://en.wikipedia.org/wiki/Offsetof) defines it as:

#define offsetof(st, m) \
    ((size_t) ( (char *)&((st *)(0))->m - (char *)0 ))

Why subtract (char *)0 in the Wikipedia version? Is there any case where that would actually make a difference?
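For context, a minimal sketch of how either macro is used (the struct and member names here are illustrative):

#include <stdio.h>
#include <stddef.h>

struct example {
    char a;   /* offset 0 */
    int  b;   /* offset depends on int's alignment; typically 4 */
};

int main(void)
{
    /* offsetof(type, member) yields the byte offset of member
       within type, as a size_t */
    printf("offsetof(struct example, b) = %zu\n",
           offsetof(struct example, b));
    return 0;
}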

asked Apr 02 '10 by Bruce Christensen

2 Answers

The first version converts a pointer directly into an integer with a cast, which is not portable.

The second version is portable across a wider variety of compilers because it obtains the integer result through pointer arithmetic rather than through a pointer-to-integer cast.

BTW, I was the editor who added the original code, the Linux form, to the Wikipedia entry. Later editors changed it to the more portable version.
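To make the distinction concrete, here is a sketch of the Wikipedia form expanded step by step (struct s and member m are hypothetical; note that both forms are still technically undefined behavior in strict ISO C, which is why compilers such as GCC and Clang implement their own <stddef.h> offsetof with a builtin like __builtin_offsetof):

#include <stddef.h>   /* for size_t */

struct s { char pad[12]; double m; };   /* hypothetical struct */

size_t off = (size_t)(
    (char *)&((struct s *)0)->m   /* "address" of m relative to a null base */
    - (char *)0                   /* pointer subtraction yields a ptrdiff_t */
);                                /* integer result without a direct
                                     pointer-to-integer cast */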

answered by David R Tribble


The standard does not require a null pointer to have the bit pattern of all zeros; it can have a platform-specific representation.

Doing the subtraction guarantees that, when the result is converted to an integer, the null pointer contributes 0 regardless of its underlying representation.
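A sketch of that reasoning (the 0xFFFF0000 representation is hypothetical, and arithmetic on null pointers is itself undefined in strict ISO C; an implementation with a non-zero null representation would define it this way):

#include <stddef.h>   /* for size_t */

/* On a hypothetical platform where a null pointer is represented as
   0xFFFF0000, a direct pointer-to-integer cast could expose that value: */
size_t direct = (size_t)(char *)0;               /* could be 0xFFFF0000 */

/* Pointer subtraction is defined in terms of element counts, not bit
   patterns, so the representation cancels out of the result: */
size_t diff   = (size_t)((char *)0 - (char *)0); /* 0 regardless */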

answered by R Samuel Klatchko