The fonts I draw in OpenGL using wglUseFontBitmaps take equal space (width) for every letter, so a "." needs as much space as an "M", for example. I have changed the pitch parameter in the font creation to VARIABLE_PITCH, but it doesn't change anything. Can variable-width rendering actually be done with wglUseFontBitmaps, or is it in its nature that every generated bitmap takes equal space?
Furthermore, I am querying the width/height of the rasterized bitmap text with GetTextExtentPoint32W(). The returned width is fine, but the height is always too big. I draw a bright rectangle behind the black text to keep it readable within the 3D scene. Why is the queried height so large? Is space reserved for taller characters such as "É"?
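A sketch of why the height looks inflated, assuming the device context and font from the setup below: GetTextExtentPoint32W() returns the font's full cell height (ascent plus descent), not the tight bounds of the particular string, so "." and "É" report the same height. GetTextMetricsW() exposes the breakdown:
TEXTMETRICW tm;
GetTextMetricsW(hDeviceContext, &tm);
// tm.tmHeight == tm.tmAscent + tm.tmDescent; GetTextExtentPoint32W()
// reports this same cell height (cy) for any string in this font.
SIZE extent;
GetTextExtentPoint32W(hDeviceContext, L".", 1, &extent);
// extent.cy == tm.tmHeight even though "." is tiny; tmAscent/tmDescent
// can be used to size a tighter background rectangle.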
My Setup:
HFONT font;                        // Windows font handle
font = CreateFont(-12,             // Height Of Font
    0,                             // Width Of Font
    0,                             // Angle Of Escapement
    0,                             // Orientation Angle
    FW_EXTRALIGHT,                 // Font Weight
    FALSE,                         // Italic
    FALSE,                         // Underline
    FALSE,                         // Strikeout
    ANSI_CHARSET,                  // Character Set Identifier
    OUT_TT_PRECIS,                 // Output Precision
    CLIP_DEFAULT_PRECIS,           // Clipping Precision
    ANTIALIASED_QUALITY,           // Output Quality
    FF_DONTCARE | VARIABLE_PITCH,  // Family And Pitch
    L"Arial");                     // Font Name
SelectObject(this->hDeviceContext, font);

// init display lists for text drawing (lists 1000..1255 for characters 0..255)
BOOL createFontLists = wglUseFontBitmaps(this->hDeviceContext, 0, 255, 1000);
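One step not shown in this snippet: before glCallLists() can draw the right glyphs, the display-list base must be set to the same listBase value passed to wglUseFontBitmaps() (1000 above). Presumably this happens elsewhere in the drawing code; as a sketch:
glListBase(1000); // glCallLists() adds each character value to this base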
I solved the problem myself: it only looks as if the font draws every letter with equal spacing (monospaced), but actually OpenGL drew a space " " between every character. The problem was how std::wstring was used when calling the display lists generated with wglUseFontBitmaps.
The incorrect draw call looked like this:
int length = wcslen(text.c_str());
glCallLists(sizeof(wchar_t) * length, GL_UNSIGNED_BYTE, text.c_str());
The first parameter (the number of lists to call) is wrong: it must be just length. The second parameter, which tells OpenGL how to interpret each element of text.c_str() as an offset selecting the next display list, is also wrong: it has to be GL_UNSIGNED_SHORT, which reads 2 bytes per element (0-65535). With GL_UNSIGNED_BYTE, the zero high byte of every 2-byte wchar_t was interpreted as its own offset and drawn as a blank glyph. For example, L"Hi" is stored as the bytes 48 00 69 00 on little-endian Windows, so with GL_UNSIGNED_BYTE glCallLists sees four offsets, two of them 0. The correct call is:
glCallLists(length, GL_UNSIGNED_SHORT, text.c_str());
This works for my case, but it is not a general way to use Unicode characters with wglUseFontBitmaps. It only safely covers the characters 0-255 for which display lists were built here; code points beyond that range in the wstring would index display lists that do not exist.
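Putting it together, a minimal drawing helper under the assumptions above (display lists for characters 0-255 built with base 1000; the function name and position parameters are illustrative):
#include <string>
#include <windows.h>
#include <GL/gl.h>

void drawBitmapText(float x, float y, const std::wstring& text)
{
    glRasterPos2f(x, y);  // anchor the bitmap glyphs in the scene
    glListBase(1000);     // must match the listBase from wglUseFontBitmaps
    glCallLists((GLsizei)text.length(), GL_UNSIGNED_SHORT, text.c_str());
}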