While walking through a sample code application for the iPhone, I found a pair of send/receive methods whose signatures were effectively identical, except for the type of one parameter.
From within the header:
- (void)receivedData: (unsigned char *)data length:(NSUInteger)len;
- (void)sendData: (uint8_t*) data length:(NSUInteger) len;
These methods are wrappers for a sending/receiving process that effectively passes around a pointer to a byte array being written to and read from data streams. I found these method signatures a bit curious, and since I'm new to Cocoa/Cocoa Touch development, I decided to check out the definition of the uint8_t type. I discovered that uint8_t is defined as an unsigned char within stdint.h and, therefore, the data parameters for these methods are exactly the same type. At least, that is the case for the stdint.h that is being linked within Xcode 4.2.
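As a quick sanity check on my toolchain (this relies on __builtin_types_compatible_p, a Clang/GCC extension, so it's only a toolchain-specific sketch), the following small program compiles and prints 1 for me:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* On Apple's toolchain uint8_t is a typedef of unsigned char, so this
       Clang/GCC builtin reports the two types as compatible (prints 1). */
    printf("%d\n", __builtin_types_compatible_p(uint8_t, unsigned char));
    return 0;
}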
However, doing a bit of further research on the uint8_t type, I found this question regarding uint8_t vs. unsigned char usage. The consensus seems to be that these two types are quite often exactly the same, but that with some implementations of the C standard library they might differ. Ergo, one should not trust that they will be the same type when writing portable code.
With that said, is it safe to assume, within an Apple/Objective-C programming environment, that uint8_t will be the same as unsigned char, or should I follow the same advice given in the above-mentioned question?
This may seem like a picky question, but since I may be integrating libraries in which this kind of interchangeable type usage appears to be fairly prevalent into a personal codebase that could be used across multiple Apple environments (for quite a few years to come), I wanted further commentary.
Ignoring questions of portability (as you have implicitly asked us to do), it seems exceptionally unlikely that char will ever be anything other than an eight-bit value under Mac OS X and its derivatives such as iOS. I think you can safely assume that unsigned char and uint8_t will be the same forever.
That said, as a form of documentation for the programmers who come after you, when you are dealing with a value meant to hold a byte of binary data rather than a character, it seems smarter to use "byte" or "uint8_t" or some similar means of indicating to future readers that the intent of the function is to treat the value as a byte and not as a character per se.
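For instance (hypothetical declarations, purely to illustrate the naming point):

// uint8_t tells the reader this buffer holds raw binary data...
- (void)appendBytes:(const uint8_t *)bytes length:(NSUInteger)len;
// ...while char signals that this buffer holds textual data.
- (void)appendCString:(const char *)string;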
It is safe to assume uint8_t is a typedef of unsigned char on Mac OS X.
For systems that follow POSIX (as Mac OS X does), POSIX requires the char type to be exactly 8 bits wide:
(2.12.2 The char Type) "The type char is defined as a single byte; see XBD Definitions (Byte and Character)."
and
(3.84 Byte) "An individually addressable unit of data storage that is exactly an octet, used to store a character or a portion of a character; see also Character. A byte is composed of a contiguous sequence of 8 bits."
While the C99 standard allows uint8_t and unsigned char to be different, in practice every mainstream operating system (especially UNIX-like systems such as Apple's OS family) defines them to be the same, for the simple reason that there is just way too much code, both pre- and post-C99, that assumes unsigned char is exactly 8 bits wide.
That being said, uint8_t is naturally safer from a theoretical standpoint, so there is not much reason not to use it. You don't need to fear any library using unsigned char, though.
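As a concrete illustration (a minimal sketch around the real -[NSInputStream read:maxLength:] method, which is declared with a uint8_t * parameter), an unsigned char buffer passes straight through with no cast on Apple platforms:

#import <Foundation/Foundation.h>

// -[NSInputStream read:maxLength:] takes a uint8_t *, but because uint8_t
// is a typedef of unsigned char here, this compiles cleanly without a cast.
NSInteger ReadChunk(NSInputStream *stream, unsigned char *buffer, NSUInteger capacity) {
    return [stream read:buffer maxLength:capacity];
}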