I have an instance of std::u16string. Can I pass its c_str() to a Win32 API that expects LPCWSTR, without any kind of conversion? For example, can I safely do this:
auto u16s = std::u16string(u"Hello");
::SetWindowTextW(hWnd, reinterpret_cast<LPCWSTR>(u16s.c_str()));
Update: MSDN says here that wchar_t is UTF-16LE, while char16_t is just UTF-16 with no endianness specified. Is it safe to assume, then, that char16_t is also always UTF-16LE on Windows? Or would that be MSVC compiler specific, so that it could possibly be UTF-32LE (or maybe UTF-16BE) if I compile with GCC, for example?
I would like to add to @jamesdlin's answer, which is correct.
Before C++11, there were only char and wchar_t, and std::basic_string<> was accordingly specialized as std::string and std::wstring. However, the width in bits of wchar_t is platform-specific: on Windows it is 16-bit, while on other platforms it's 32-bit.
With the advent of C++11, the standard added char16_t to represent 16-bit wide characters; thus on Windows, std::u16string happens to be interchangeable with std::wstring in most contexts, because both are able to represent 16-bit wide characters.
The wchar_t type is an implementation-defined wide character type. In the Microsoft compiler, it represents a 16-bit wide character used to store Unicode encoded as UTF-16LE, the native character type on Windows operating systems.
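If you want to avoid relying on that interchangeability, one portable option is to copy the code units instead of casting. The helper below is my own sketch (the name to_wstring_units is not a standard API): on Windows, where wchar_t is 16-bit, it is a unit-for-unit copy; on platforms with 32-bit wchar_t it merely widens each UTF-16 unit, which is only correct for BMP characters, so a real port would also have to convert surrogate pairs.

```cpp
#include <string>

// Hypothetical helper: build a std::wstring from a std::u16string by
// copying code units instead of using reinterpret_cast. Each char16_t
// is converted to wchar_t by the std::wstring range constructor.
// NOTE: if wchar_t is 32-bit, surrogate pairs are NOT recombined here;
// this sketch only handles BMP characters correctly on such platforms.
std::wstring to_wstring_units(const std::u16string& s) {
    return std::wstring(s.begin(), s.end());
}
```

With this, the original call becomes ::SetWindowTextW(hWnd, to_wstring_units(u16s).c_str()) at the cost of one copy, with no reinterpret_cast.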
But newer MSDN documentation adds a note for code that uses std::wstring yet intends to be portable:
The size of wchar_t is implementation-defined. If your code depends on wchar_t to be a certain size, check your platform's implementation (for example, with sizeof(wchar_t)). If you need a string character type with a width that is guaranteed to remain the same on all platforms, use string, u16string, or u32string.
As for LE (little-endian), that is architecture-specific, IIRC, and most architectures today are little-endian.
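The byte order of char16_t on a given platform can be checked directly. This is a small sketch (the helper name is my own) that inspects the byte layout of a known code unit at runtime:

```cpp
#include <cstring>

// Hypothetical helper: returns true if char16_t code units are stored
// little-endian on this architecture, by examining the bytes of U+0041.
bool u16_is_little_endian() {
    char16_t c = u'A';                  // single code unit, value 0x0041
    unsigned char bytes[sizeof c];
    std::memcpy(bytes, &c, sizeof c);   // view the object representation
    return bytes[0] == 0x41;            // low byte first => little-endian
}
```

On x86/x64 and typical ARM configurations this returns true, matching the UTF-16LE assumption Windows makes.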