I've been looking at the command line generated by Visual Studio, and for one of my projects it defines two symbols: _UNICODE and UNICODE. Now if I understand this rather old document, the _UNICODE symbol is a VC++ thing that causes certain standard functions to use wchar_t instead of char in their interfaces. But what does UNICODE, without the underscore, mean?
In text processing, Unicode takes the role of providing a unique code point—a number, not a glyph—for each character. In other words, Unicode represents a character in an abstract way and leaves the visual rendering (size, shape, font, or style) to other software, such as a web browser or word processor.
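As a quick illustration of that idea, here is a minimal C sketch (my own example, not from the original post) that prints a character's code point, the number Unicode assigns, rather than its glyph:

    #include <stdio.h>
    #include <wchar.h>

    int main(void)
    {
        wchar_t ch = L'\u00E9';            /* 'e with acute' as an abstract code point */
        printf("U+%04X\n", (unsigned)ch);  /* prints U+00E9: a number, not a glyph */
        return 0;
    }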
You can go to Project Properties --> Configuration Properties --> General --> Project Defaults, and there change "Character Set" from "Unicode" to "Not Set".
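For reference, that IDE setting only changes which symbols get passed on the compiler command line; setting it to "Unicode" corresponds roughly to defining both symbols yourself (main.cpp is a placeholder file name):

    cl /DUNICODE /D_UNICODE main.cpp

whereas "Not Set" defines neither of them.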
Raymond Chen explains it here: TEXT vs. _TEXT vs. _T, and UNICODE vs. _UNICODE:
The plain versions without the underscore affect the character set the Windows header files treat as default. So if you define UNICODE, then GetWindowText will map to GetWindowTextW instead of GetWindowTextA, for example. Similarly, the TEXT macro will map to L"..." instead of "...".

The versions with the underscore affect the character set the C runtime header files treat as default. So if you define _UNICODE, then _tcslen will map to wcslen instead of strlen, for example. Similarly, the _TEXT macro will map to L"..." instead of "...".
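To make the two halves of that concrete, here is a minimal sketch (the function name, the window handle, and the buffer size are assumptions of the example) in which each macro controls a different mapping:

    #include <windows.h>  /* GetWindowText, TEXT: UNICODE selects the W or A variant */
    #include <tchar.h>    /* _tcslen, _TEXT, TCHAR: _UNICODE selects wide or narrow  */

    void show_title(HWND hwnd)
    {
        TCHAR title[256];                /* wchar_t[256] if _UNICODE, else char[256]        */
        GetWindowText(hwnd, title, 256); /* GetWindowTextW if UNICODE, else GetWindowTextA  */
        size_t len = _tcslen(title);     /* wcslen if _UNICODE, else strlen                 */
        const TCHAR *tag = _TEXT("got title");  /* L"..." if _UNICODE, else "..."           */
        (void)len;
        (void)tag;
    }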
Looking into the Windows SDK, you will find things like this:
    #ifdef _UNICODE
    #ifndef UNICODE
    #define UNICODE
    #endif
    #endif
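Since the two macros are normally defined together (or not at all), one possible safeguard, my own suggestion rather than anything from the SDK headers, is a compile-time guard that fails the build if they ever disagree:

    #if defined(UNICODE) != defined(_UNICODE)
    #error "UNICODE and _UNICODE should be defined together (or not at all)"
    #endif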