iOS - Parameter type UnsafePointer<Void> in glDrawElements


I am trying to render a context (EAGLContext) in Swift; for days I haven't been able to get glDrawElements to work.

I have read a couple of similar questions here on Stack Overflow, to no avail.

My glDrawElements call is as follows:

glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), &offset)

I am having a problem with the last parameter, offset, which expects an UnsafePointer<Void>.

I have tried the following:

let offset: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))

The above no longer works, because CConstVoidPointer doesn't seem to be available anymore in Swift 1.2.

And:

var offset = UnsafePointer<Int>(other: 0)

The above gives a warning that I should use bitPattern: instead.

Although I don't believe bitPattern: should be used here (because the parameter expects a word), I decided to give it a try according to the suggestion provided, as follows:

var offset = UnsafePointer<Int>(bitPattern: 0)
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), &offset)

I get an EXC_BAD_ACCESS (code=EXC_I386_GPFLT) error.
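The crash is easier to see once you notice what & actually passes: the address of the variable itself, not the (null) pointer value stored in it. A minimal sketch, assuming modern Swift's withUnsafeBytes API (Swift 1.2 spelled these pointer types differently):

```swift
// `offset` stores a nil pointer (bitPattern 0).
var offset = UnsafePointer<Int>(bitPattern: 0)

withUnsafeBytes(of: &offset) { raw in
    // raw.baseAddress is the stack address OF the variable `offset`,
    // which is never nil. Passing `&offset` hands glDrawElements this
    // stack address, so it starts reading "index data" from the stack
    // instead of from your index array -> EXC_BAD_ACCESS.
    print(raw.baseAddress!)
}
```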

In vain, I also tried something simpler, using:

var offsetZero: Int = 0

and subsequently feeding it to the last parameter of glDrawElements like so:

glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), &offsetZero)

I am getting the same EXC_BAD_ACCESS in the case above.

How can I form a type suitable for the last parameter of glDrawElements, which expects a type of UnsafePointer<Void>?


@robmayoff

Update (adding the code for the VBO set-up and the variable declarations and definitions):

struct Vertex {
    var position: (CFloat, CFloat, CFloat)
    var color: (CFloat, CFloat, CFloat, CFloat)
}

var vertices = [
    Vertex(position: (-1, -1, 0), color: (1, 1, 0, 1)),
    Vertex(position: (1, -1, 0), color: (1, 1, 1, 1)),
    Vertex(position: (-1, 0, 1), color: (0, 1, 0, 1)),
    Vertex(position: (-1, 1, 1), color: (1, 0, 0, 1))
]

let indices: [UInt8] = [0, 2, 3, 3, 1, 0]

class MainGLView: UIView {
    var layer: CAEAGLLayer!
    var context: EAGLContext!
    var cBuffer: GLuint = GLuint()
    var pos: GLuint = GLuint()
    var color: GLuint = GLuint()
    var iBuffer: GLuint = GLuint()
    var vBuffer: GLuint = GLuint()
    var vao: GLuint = GLuint()

    override class func layerClass() -> AnyClass {
        return CAEAGLLayer.self
    }

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        // Setting up context, buffers, shaders, etc.

        self.configureVBO()
        self.setupRendering()
    }

    func configureVBO() {
        glGenVertexArraysOES(1, &vao)
        glBindVertexArrayOES(vao)

        glGenBuffers(GLsizei(1), &vBuffer)
        glBindBuffer(GLenum(GL_ARRAY_BUFFER), vBuffer)
        glBufferData(GLenum(GL_ARRAY_BUFFER), vertices.size(), vertices, GLenum(GL_STATIC_DRAW))

        glEnableVertexAttribArray(pos)

        var ptr = COpaquePointer(UnsafePointer<Int>(bitPattern: 0))
        glVertexAttribPointer(GLuint(pos), GLint(3), GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(Vertex)), &ptr)

        glEnableVertexAttribArray(color)

        let fColor = UnsafePointer<Int>(bitPattern: sizeof(Float) * 3)
        glVertexAttribPointer(GLuint(color), 4, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(Vertex)), fColor)

        glGenBuffers(1, &iBuffer)
        glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), iBuffer)
        glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), indices.size(), indices, GLenum(GL_STATIC_DRAW))

        glBindBuffer(GLenum(GL_ARRAY_BUFFER), 0)
        glBindVertexArrayOES(0)
    }

    func setupRendering() {
        glBindVertexArrayOES(vao)
        glViewport(0, 0, GLint(self.frame.size.width), GLint(self.frame.size.height))

        indices.withUnsafeBufferPointer { (pointer: UnsafeBufferPointer<UInt8>) -> Void in
            glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), UnsafePointer<Void>(pointer.baseAddress))
            Void()
        }

        self.context.presentRenderbuffer(Int(GL_RENDERBUFFER))

        glBindVertexArrayOES(0)
    }
}
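As an aside, the attribute offsets used in the set-up above (0 for position, sizeof(Float) * 3 for color) only work if they match the actual memory layout of Vertex. A quick layout check, assuming modern Swift's MemoryLayout in place of Swift 1.2's sizeof:

```swift
// Same shape as the Vertex struct in the question: 7 Floats total.
struct Vertex {
    var position: (Float, Float, Float)      // 12 bytes
    var color: (Float, Float, Float, Float)  // 16 bytes
}

// stride is the byte distance from one vertex to the next in the
// array (what glVertexAttribPointer's stride argument wants): 28.
let stride = MemoryLayout<Vertex>.stride

// color begins right after the three position Floats: byte offset 12.
let colorOffset = MemoryLayout<Float>.size * 3

print(stride, colorOffset)
```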

Because of your choices for the count and type arguments, the indices argument (the last argument) needs to be a pointer to the first element of an array, of (at least) length indices.count, of unsigned bytes. The pointer needs to be converted to UnsafePointer<Void> in Swift.

You didn't show the declaration of indices. Let's assume it's declared like this:

let indices: [UInt8] = [0, 1, 2, 3, 4, 5, 6, 7, 8]

Then you can call glDrawElements by jumping through these hoops:

indices.withUnsafeBufferPointer { (pointer: UnsafeBufferPointer<UInt8>) -> Void in
    glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count),
        GLenum(GL_UNSIGNED_BYTE),
        UnsafePointer<Void>(pointer.baseAddress))
    Void()
}
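The same pattern can be exercised without OpenGL. A minimal sketch, assuming modern Swift (where UnsafeRawPointer plays the role of Swift 1.2's UnsafePointer<Void>, and baseAddress is optional) and a hypothetical stand-in function sumBytes in place of glDrawElements:

```swift
// Stand-in for a C-style API that receives its data as an untyped
// pointer plus a count, the way glDrawElements receives index data.
// `sumBytes` is hypothetical, for illustration only.
func sumBytes(_ data: UnsafeRawPointer, _ count: Int) -> Int {
    let bytes = data.assumingMemoryBound(to: UInt8.self)
    return (0..<count).reduce(0) { $0 + Int(bytes[$1]) }
}

let indices: [UInt8] = [0, 2, 3, 3, 1, 0]

let total = indices.withUnsafeBufferPointer { buf in
    // buf.baseAddress points at the first element and is only valid
    // inside this closure -- exactly why the glDrawElements call
    // belongs inside withUnsafeBufferPointer too.
    sumBytes(UnsafeRawPointer(buf.baseAddress!), buf.count)
}
// total == 9
```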
