My vertices are interleaved in a numpy array (dtype = float32) like this: … tu, tv, nx, ny, nz, vx, vy, vz, …
When rendering, I'm calling gl*Pointer() like this (the client arrays are already enabled):
stride = (2 + 3 + 3) * 4
glTexCoordPointer( 2, GL_FLOAT, stride, self.vertArray )
glNormalPointer( GL_FLOAT, stride, self.vertArray + 2 )
glVertexPointer( 3, GL_FLOAT, stride, self.vertArray + 5 )
glDrawElements( GL_TRIANGLES, len( self.indices ), GL_UNSIGNED_SHORT, self.indices )
The result is that nothing renders. However, if I reorder the array so that the vertex position comes first ( … vx, vy, vz, tu, tv, nx, ny, nz, … ), the vertex positions render correctly but the texture coordinates and normals don't.
This leads me to believe that I'm not setting the pointer offsets correctly. How should I set them? Nearly identical code works in my C++ app.
In Python, you can't do pointer arithmetic. What you're trying to do with `self.vertArray + 2` only works in C/C++ — in Python, adding a number to a list or a numpy array means something else entirely.
With normal Python list:
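A quick sketch of what `+ 2` actually does to a plain Python list (the variable name is just for illustration):

```python
# Adding an integer to a plain Python list is a TypeError:
# lists only support "+" as concatenation with other lists.
verts = [0.0, 1.0, 2.0, 3.0]
try:
    verts + 2
except TypeError as e:
    print("TypeError:", e)  # "+" is concatenation, not pointer arithmetic
```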
With numpy arrays:
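With a numpy array, `+ 2` is broadcast element-wise addition, which silently corrupts the data instead of offsetting the pointer:

```python
import numpy as np

# "+ 2" adds 2 to every element of the array -- it does not
# produce an array starting at element 2.
verts = np.array([0.0, 1.0, 2.0, 3.0], dtype=np.float32)
print(verts + 2)  # [2. 3. 4. 5.]
```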
See how neither does what you want: an array that starts at a certain position within the original data.
I think you have basically two options:

1. Keep separate (non-interleaved) arrays for texture coordinates, normals and positions, so each gl*Pointer() call gets its own array and no offset is needed.
2. Keep the interleaved array and use numpy slicing to get a view that starts at the right element. Like this:
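A minimal sketch of the slicing approach, assuming the question's layout (tu, tv, nx, ny, nz, vx, vy, vz) and names; the GL calls are shown in comments only:

```python
import numpy as np

# Interleaved layout from the question: 8 float32 values per vertex.
vertArray = np.zeros(8 * 4, dtype=np.float32)  # e.g. room for 4 vertices
stride = (2 + 3 + 3) * 4                       # bytes per vertex

# A basic slice gives a *view* into the same memory, beginning at the
# sliced element -- effectively the pointer offset you wanted:
texcoords = vertArray       # starts at element 0 (tu)
normals   = vertArray[2:]   # starts at element 2 (nx)
positions = vertArray[5:]   # starts at element 5 (vx)

# These views share memory with vertArray, so PyOpenGL calls like
#   glTexCoordPointer(2, GL_FLOAT, stride, texcoords)
#   glNormalPointer(GL_FLOAT, stride, normals)
#   glVertexPointer(3, GL_FLOAT, stride, positions)
# would read each attribute from the correct byte offset.
```

The key point is that slicing does not copy: `vertArray[2:]` is a window onto the original buffer, so writes through `vertArray` are visible through the views.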
Combine this with the correct stride and you can probably get it to work.