OpenGL: Zero active uniforms


I'm referring to the OpenGL SuperBible and using its framework to create my own program. I wanted to do something with an interface block (specifically a uniform block). If I call

glGetActiveUniformsiv(program, 1, uniformIndices, GL_UNIFORM_OFFSET, uniformOffsets);

I get an error, namely GL_INVALID_VALUE. But if I call the same function with 0 instead of 1, it doesn't produce that error. I then assumed that I have no active uniforms. However, I should have 3 of them. How do I activate them? Here's my shader:

    #version 450 core

    layout (location = 0) in vec4 position;
    layout (location = 1) in vec4 color;

    out vec4 vs_color;

    uniform TransformBlock {
        mat4 translation;
        mat4 rotation;
        mat4 projection_matrix;
    };

    void main(void)
    {
        mat4 mvp = projection_matrix * translation * rotation;
        gl_Position = mvp * position;
        vs_color = color;
    }

Here is the essential code snippet from my startup() method:


    vmath::mat4 transl_matrix = vmath::translate(0.0f, 0.0f, -3.0f);
    vmath::mat4 rot_matrix = vmath::rotate( 30.0f, 0.0f, 0.5f, 1.0f);
    vmath::mat4 proj_matrix = vmath::perspective(60.0f, (float)info.windowWidth / (float)info.windowHeight, 0.1f, 10.0f);


    static const GLchar * uniformNames[3] = {
        "TransformBlock.translation",
        "TransformBlock.rotation",
        "TransformBlock.projection_matrix"
    };
    GLuint uniformIndices[3];

    glGetUniformIndices(program, 3, uniformNames, uniformIndices);

    GLint uniformOffsets[3];
    GLint matrixStrides[3];
    glGetActiveUniformsiv(program, 3, uniformIndices, GL_UNIFORM_OFFSET, uniformOffsets);
    glGetActiveUniformsiv(program, 3, uniformIndices, GL_UNIFORM_MATRIX_STRIDE, matrixStrides);

    unsigned char * buffer1 = (unsigned char*)malloc(4096);
    //GLuint * buffer1 = (GLuint *) malloc(4096);

    int j;
    for (int i = 0; i < 4; ++i) {
        GLuint offset = uniformOffsets[0] + matrixStrides[0] * i;
        for (j = 0; j < 4; ++j) {
            *((float *)(buffer1 + offset)) = transl_matrix[i][j];
            offset += sizeof(GLfloat);
        }
    }

    for (int i = 0; i < 4; ++i) {
        GLuint offset = uniformOffsets[1] + matrixStrides[1] * i;
        for (j = 0; j < 4; ++j) {
            *((float *)(buffer1 + offset)) = rot_matrix[i][j];
            offset += sizeof(GLfloat);
        }
    }

    for (int i = 0; i < 4; ++i) {
        GLuint offset = uniformOffsets[2] + matrixStrides[2] * i;
        for (j = 0; j < 4; ++j) {
            *((float *)(buffer1 + offset)) = proj_matrix[i][j];
            offset += sizeof(GLfloat);
        }
    }

    GLuint block_index = glGetUniformBlockIndex(program, "TransformBlock");
    glUniformBlockBinding(program, block_index, 0);
    glBindBufferBase(GL_UNIFORM_BUFFER, 0, (GLuint)buffer1);

I'm following Chapter 5 ("Data") of the SuperBible mentioned above.

However, as a consequence of the function returning GL_INVALID_VALUE, there's an error in the writes of the form:

*((float *)(buffer1 + offset)) = ...

and the whole program crashes. Without adding the offset I don't get an error here, so I think this second error is a consequence of the first.

Thanks for any answers.


Update: the task I asked about now works. I have changed the uniformNames variable; the names no longer contain the TransformBlock prefix. However, I now get a different error, in my render() function, at glDrawArrays. Here is my render() function:

    virtual void render(double currentTime) {
        const GLfloat color[] = { (float)sin(currentTime) * 0.5f + 0.5f,
                                  (float)cos(currentTime) * 0.5f + 0.5f,
                                  0.0f, 1.0f };
        glClearBufferfv(GL_COLOR, 0, color);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }
The error message was something like this:

Access violation by reading at position 0x00000125.

Could that be because the rest isn't fully working yet, or do I have to call a different drawing function instead of glDrawArrays?

asked on Stack Overflow Jun 30, 2018 by F. Leuthold • edited Jun 30, 2018 by F. Leuthold

1 Answer


I think it goes wrong at glGetUniformIndices, because you prefixed your uniform names with TransformBlock. You don't use that prefix to access the uniforms in the GLSL code, either. If you wanted that, you would have to set an instance name for the uniform block; the block name is not relevant for accessing or naming the uniforms at all. It is only used for matching interfaces when you link together multiple shaders accessing the same interface block.

answered on Stack Overflow Jun 30, 2018 by derhass

User contributions licensed under CC BY-SA 3.0