
OpenGL EXC_BAD_ACCESS when calling glDrawElements in Swift but not in Objective-C

I'm working through the OpenGL for iOS tutorial by Ray Wenderlich in an attempt to convert his code from Objective-C to Swift.

I am very new to OpenGL and to Swift and believe my problem has to do with how I have translated the Objective-C. Here's why:

In my Swift file that sets up the view containing the OpenGL content, the app crashes with an EXC_BAD_ACCESS alert at the final logical step (the call to glDrawElements). If, however, I move this portion of the code to an Objective-C file, the app works as expected.

Swift version of this code:

var positionDataOffset: Int = 0
glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), 
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), 
    VertexDataSource.sizeOfVertex(), &positionDataOffset)

var colorDataOffset = (sizeof(Float) * 3) as AnyObject
glVertexAttribPointer(self.positionSlot, 4 as GLint, GL_FLOAT.asUnsigned(), 
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), 
    VertexDataSource.sizeOfVertex(), VertexDataSource.vertexBufferOffset())

var vertexOffset: Int = 0
glDrawElements(GL_TRIANGLES.asUnsigned(), VertexDataSource.vertexCount(),
    GL_UNSIGNED_BYTE.asUnsigned(), &vertexOffset)

And here is the Objective-C version:

glVertexAttribPointer(position, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
glVertexAttribPointer(color, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), 
    (GLvoid*) (sizeof(float) * 3));

glDrawElements(GL_TRIANGLES, sizeof(Indices)/sizeof(Indices[0]), GL_UNSIGNED_BYTE, 0);

As you can see, the Swift is much more verbose... I'm new to this like everyone else. :)

One other note: in the Swift version, you'll see several calls to class methods on the class VertexDataSource. Essentially, I couldn't for the life of me determine how to convert some portions of the Objective-C to Swift, so I decided (for now) to create a small class in Objective-C that could supply the Swift code with those values. Here are those methods in Objective-C:

+ (GLint)sizeOfVertex {
    return sizeof(Vertex);
}

+ (GLint)sizeOfIndices {
    return sizeof(Indices);
}

+ (GLint)sizeOfIndicesAtPositionZero {
    return sizeof(Indices[0]);
}

+ (GLint)sizeOfVertices {
    return sizeof(Vertices);
}

+ (GLvoid *)vertexBufferOffset {
    return (GLvoid *)(sizeof(float) * 3);
}

+ (GLint)vertexCount {
    return self.sizeOfIndices / sizeof(GLubyte);
}

Any help translating those lines to Swift would be amazing.
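(For readers on later toolchains: in current Swift these sizeof-style helpers can be expressed directly with `MemoryLayout`, so the Objective-C shim becomes unnecessary. A minimal sketch, where `Vertex` and `Indices` are illustrative stand-ins for the tutorial's actual definitions, not copies of them:)

```swift
// Sketch only: `Vertex` and `Indices` stand in for the tutorial's definitions.
struct Vertex {
    var position: (Float, Float, Float)        // x, y, z
    var color: (Float, Float, Float, Float)    // r, g, b, a
}

let Indices: [UInt8] = [0, 1, 2]               // GLubyte index data

// Equivalents of the Objective-C helper methods:
let sizeOfVertex = Int32(MemoryLayout<Vertex>.stride)                 // sizeof(Vertex)
let sizeOfIndices = Int32(Indices.count * MemoryLayout<UInt8>.stride) // sizeof(Indices)
let vertexBufferOffset = UnsafeRawPointer(bitPattern: MemoryLayout<Float>.stride * 3)
let vertexCount = Int32(Indices.count)         // sizeOfIndices / sizeof(GLubyte)
```

Note the use of `stride` rather than `size`: for interleaved buffer layouts, `stride` matches what C's `sizeof` reports for array elements.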

EDIT #1

As Reto Koradi pointed out, the Swift code above references self.positionSlot twice rather than using the colorSlot. This was a mistake I made when posting the code here and not actually a mistake in my code.

So the problem still exists.

Updated Swift:

var positionDataOffset: Int = 0
glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(),
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)),
    VertexDataSource.sizeOfVertex(), &positionDataOffset)
var colorDataOffset = (sizeof(Float) * 3) as AnyObject
glVertexAttribPointer(self.colorSlot, 4 as GLint, GL_FLOAT.asUnsigned(), 
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), 
    VertexDataSource.sizeOfVertex(), VertexDataSource.vertexBufferOffset())

var vertexOffset: Int = 0
glDrawElements(GL_TRIANGLES.asUnsigned(), VertexDataSource.vertexCount(), 
    GL_UNSIGNED_BYTE.asUnsigned(), &vertexOffset)

EDIT #2: Solved

I ended up solving this. The problem in my case was that my conversion of the Objective-C to Swift was incorrect in several cases. For brevity I'll post the final version of the Swift code for the portion I was originally concerned about, but you can view the full source code of the working result here in this example GitHub repo.

The final Swift code:

let positionSlotFirstComponent: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))
glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), 
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), Int32(sizeof(Vertex)), 
    positionSlotFirstComponent)
let colorSlotFirstComponent: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(sizeof(Float) * 3))
glVertexAttribPointer(self.colorSlot, 4 as GLint, GL_FLOAT.asUnsigned(),
    GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), Int32(sizeof(Vertex)), 
    colorSlotFirstComponent)


let vertexBufferOffset: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))
glDrawElements(GL_TRIANGLES.asUnsigned(), Int32(GLfloat(sizeofValue(Indices)) / 
    GLfloat(sizeofValue(Indices.0))), GL_UNSIGNED_BYTE.asUnsigned(), vertexBufferOffset)
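(Note for readers on newer toolchains: `CConstVoidPointer` and `COpaquePointer` were later removed from the language. On Swift 3 and later, the same byte offsets are usually written with the failable `UnsafeRawPointer(bitPattern:)` initializer; a sketch, with the GL calls shown as comments since they require a live GL context:)

```swift
// Offset 0: bitPattern 0 yields nil, which GL treats as byte offset zero.
let positionOffset = UnsafeRawPointer(bitPattern: 0)
// Offset past the three position floats, to where the color data starts.
let colorOffset = UnsafeRawPointer(bitPattern: MemoryLayout<Float>.stride * 3)
// glVertexAttribPointer(colorSlot, 4, GLenum(GL_FLOAT), GLboolean(GL_FALSE),
//                       GLsizei(MemoryLayout<Vertex>.stride), colorOffset)
```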

I'm going to go ahead and accept Reto Koradi's answer, as it certainly got me on the right track.

asked Oct 21 '22 by bradleygriffith

2 Answers

For people who are using Swift 2.1.1 with Xcode 7.2 like me, the pointer syntax has changed. Here is an example of how to use it:

https://github.com/asymptotik/asymptotik-rnd-scenekit-kaleidoscope/blob/master/Atk_Rnd_VisualToys/ScreenTextureQuad.swift

Related code is quoted below:

// bind VBOs for vertex array and index array
// for vertex coordinates

let ptr = UnsafePointer<GLfloat>(bitPattern: 0)

glBindBuffer(GLenum(GL_ARRAY_BUFFER), self.vertexBufferObject)

glEnableVertexAttribArray(self.positionIndex)
glVertexAttribPointer(self.positionIndex, GLint(3), GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(GLfloat) * 8), ptr)
glEnableVertexAttribArray(self.textureCoordinateIndex)
glVertexAttribPointer(self.textureCoordinateIndex, GLint(2),  GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(GLfloat) * 8), ptr.advancedBy(6))

Hope this helps. I also fixed the OP's example GitHub project in my fork here: https://github.com/zwang/iOSSwiftOpenGL

answered Nov 03 '22 by Zhao

I don't really know the languages, but one obvious difference is that the Objective-C version sets up two different vertex attributes, while the Swift version sets up the same attribute twice:

glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), ...)
glVertexAttribPointer(self.positionSlot, 4 as GLint, GL_FLOAT.asUnsigned(), ...)

The first argument, which determines the attribute location, is the same in both cases.

It also looks kind of odd to me that you're passing what looks like the address of a variable as the last argument of glVertexAttribPointer() in the second call, and as the last argument of glDrawElements(). But maybe the & operator means something different in Swift than in the languages I'm used to.
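(That suspicion is very likely the crash. In Swift, as in C, `&vertexOffset` passes the address of the local variable. With an element array buffer bound, glDrawElements interprets its last argument as a byte offset into that buffer, so a stack address becomes an enormous offset and GL reads far outside the buffer. A sketch of the difference, written against later Swift pointer syntax, with the GL calls commented out since they need a live context:)

```swift
// Crashing version: passes the *address* of a local Int as the offset.
//   var vertexOffset = 0
//   glDrawElements(GLenum(GL_TRIANGLES), count, GLenum(GL_UNSIGNED_BYTE), &vertexOffset)

// Intended version: pass the byte offset itself as a pointer-typed value.
let indexOffset = UnsafeRawPointer(bitPattern: 0)   // nil, i.e. offset 0
//   glDrawElements(GLenum(GL_TRIANGLES), count, GLenum(GL_UNSIGNED_BYTE), indexOffset)
```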

answered Nov 03 '22 by Reto Koradi