I have a short question. Why does OpenGL come with its own data types for standard types like int, unsigned int, char, and so on? And do I have to use them instead of the built-in C++ data types?
For example, the OpenGL equivalent of unsigned int is GLuint, and for a C string there is GLchar* instead of char*.
OpenGL (Open Graphics Library) is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics.
OpenGL is platform independent, rather than cross platform. It is just a specification for the interface of a graphics library. It has no concern for the platform it is being implemented on. It just describes functions, what they're called and what they do.
"For example, the OpenGL equivalent of unsigned int is GLuint"

No it isn't, and that's exactly why you should use OpenGL's data types when interfacing with OpenGL.

GLuint is not "equivalent" to unsigned int. GLuint is required to be 32 bits in size. It is always 32 bits in size. unsigned int might be 32 bits in size. It might be 64 bits. You don't know, and C isn't going to tell you (outside of sizeof).
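You can see this for yourself with a compile-time check. This is a minimal sketch, assuming a Linux-style include path (the GL header location varies; macOS uses <OpenGL/gl.h>, and Windows needs <windows.h> included first):

    // Minimal size check; the header path <GL/gl.h> is an assumption.
    #include <GL/gl.h>
    #include <cstdio>

    // The OpenGL spec fixes GLuint at exactly 32 bits.
    static_assert(sizeof(GLuint) == 4, "GLuint must be 32 bits");

    int main() {
        // unsigned int is only guaranteed to be at least 16 bits;
        // its real width depends on the platform and compiler.
        std::printf("sizeof(GLuint)       = %zu bytes\n", sizeof(GLuint));
        std::printf("sizeof(unsigned int) = %zu bytes\n", sizeof(unsigned int));
        return 0;
    }

The static_assert will fire at compile time if the platform's GL header ever defined GLuint with the wrong width, whereas sizeof(unsigned int) is something you can only observe, not rely on.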
These data types are defined per platform, and they may be defined differently on different platforms. You use them because, even if the underlying definitions differ, they always come out to the sizes the OpenGL API expects and requires.
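For illustration only, a platform's GL header might contain typedefs along these lines. This is a hypothetical excerpt, not taken from any real header; actual implementations differ:

    /* Hypothetical excerpt of a platform's GL header. Real headers
       differ; the point is that each platform picks whatever base
       type gives the exact size the OpenGL spec requires. */
    typedef unsigned int   GLuint;   /* a platform where int is 32 bits   */
    /* typedef unsigned long GLuint;    a platform where long is 32 bits  */
    typedef int            GLint;    /* signed 32-bit integer             */
    typedef signed char    GLbyte;   /* signed 8-bit integer              */
    typedef char           GLchar;   /* 8-bit character, used for strings */
    typedef float          GLfloat;  /* 32-bit IEEE 754 float             */

Code written against GLuint, GLint, and so on compiles unchanged on every platform, because each header resolves the names to whatever local type has the required size.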