I am trying to compile a program that compiles perfectly fine on my desktop, but on my laptop it compiles and then gives me this error whenever it is run:
Windows has triggered a breakpoint in RR.exe.
This may be due to a corruption of the heap, which indicates a bug in RR.exe or any of the DLLs it has loaded.
This may also be due to the user pressing F12 while RR.exe has focus.
The output window may have more diagnostic information.
I've commented out lines until I found the line that triggers the error, which is:
if(glfwOpenWindow(width_, height_, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) {
    throw std::runtime_error("Unable to open GLFW window");
}
The weird thing is that if I replace width_ and height_ with constants, e.g. 800 and 600 respectively, the heap corruption stops. It also doesn't crash if I just use the default values set by the constructor instead of passing values.
Here's the complete code. The lines above are in the Window constructor.
window.h
#pragma once
#include <iostream>
#include <sstream> // needed for std::stringstream used in window.cpp
#include <GL\glew.h>
#include <GL\glfw.h>
#pragma comment(lib, "opengl32.lib")
#pragma comment(lib, "glu32.lib")
#pragma comment(lib, "glew32.lib")
#pragma comment(lib, "GLFW.lib")

class Window {
public:
    Window(unsigned width = 800, unsigned height = 600);
    ~Window();
    void clear();
    inline void display() { glfwSwapBuffers(); }
    inline bool exit() { return !glfwGetWindowParam(GLFW_OPENED); }
private:
    unsigned width_, height_;
};
window.cpp
#include "window.h"

Window::Window(unsigned width, unsigned height) : width_(width), height_(height) {
    if(glfwInit() != GL_TRUE) {
        throw std::runtime_error("Unable to initialize GLFW");
    }
    if(glfwOpenWindow(width_, height_, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) { //crash
    //if(glfwOpenWindow(800, 600, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) { //no crash
        throw std::runtime_error("Unable to open GLFW window");
    }
    GLenum result = glewInit();
    if(result != GLEW_OK) {
        std::stringstream ss;
        ss << "Unable to initialize glew: " << glewGetErrorString(result);
        throw std::runtime_error(ss.str());
    }
}

Window::~Window() {
    glfwTerminate();
}

void Window::clear() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}
main.cpp
#include "window.h"

int main() {
    Window wind(1024, 800);   //crash
    //Window wind(800, 600);  //crash
    //Window wind();          //works
    return 0;
}
The problem seems to lie with GLFW. I assume you are trying to use the dynamically linked GLFW. Note this block in the GLFW header:
#if defined(_WIN32) && defined(GLFW_BUILD_DLL)
/* We are building a Win32 DLL */
#define GLFWAPI __declspec(dllexport)
#define GLFWAPIENTRY __stdcall
#define GLFWCALL __stdcall
#elif defined(_WIN32) && defined(GLFW_DLL)
/* We are calling a Win32 DLL */
#if defined(__LCC__)
#define GLFWAPI extern
#else
#define GLFWAPI __declspec(dllimport)
#endif
#define GLFWAPIENTRY __stdcall
#define GLFWCALL __stdcall
#else
/* We are either building/calling a static lib or we are non-win32 */
#define GLFWAPIENTRY
#define GLFWAPI
#define GLFWCALL
#endif
GLFW_BUILD_DLL was apparently defined while the DLL was being built, so its API functions were compiled with the __stdcall calling convention. But when using the library you haven't defined GLFW_DLL, so your code assumed the __cdecl calling convention. The difference between __cdecl and __stdcall is that with __cdecl the caller cleans up the stack, while with __stdcall the callee does. So here the stack was cleaned twice, which is why you got the corruption.
After I defined GLFW_DLL before including glfw in your program, it started working correctly. Also note that I used MinGW and, after defining GLFW_DLL, had to link against glfwdll.a instead of glfw.a.
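For reference, here is a sketch of how the top of window.h could look with the fix applied. The define must come before the GLFW header is included (it could equally be passed as a compiler flag, e.g. /DGLFW_DLL). The import-library name GLFWDLL.lib below is an assumption for an MSVC build; it depends on how your GLFW binaries were packaged, just as glfwdll.a was the name under MinGW:

```cpp
#pragma once
#include <iostream>
#include <sstream>

// Tell the GLFW header we are *calling* a Win32 DLL, so GLFWAPI expands to
// __declspec(dllimport) and the declarations use __stdcall, matching the DLL.
#define GLFW_DLL
#include <GL\glew.h>
#include <GL\glfw.h>

#pragma comment(lib, "opengl32.lib")
#pragma comment(lib, "glu32.lib")
#pragma comment(lib, "glew32.lib")
#pragma comment(lib, "GLFWDLL.lib") // DLL import library (name is an assumption)
```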