OpenGL 4, translate and scale going wrong


I’m trying to do an ortho projection onto a plane, which represents a map – think “floor plan”. I’m running into trouble because OpenGL 4 is new to me (I last used 1.1, and the world has changed) and because what I’m trying to do isn’t much like the common examples online. My problem is scaling and translating.

The data that describes the map is a series of lines whose endpoints are in what I’ll call “dungeon coordinate units”. When I render the image I want a fixed rule of “1 unit is 1 pixel”.

My coordinates are all in the first quadrant, with (0,0) representing the lower left of the map. I’d like (0,0) to show up in the lower left of the screen.

Now for the tricky bits. When I render the “floor” in the fragment shader, I’m handed gl_FragCoord, which is ideal. It’s effectively a pixel location, which means for my purposes it’s equivalent to a dungeon coordinate. I can look up all the information I passed to the shader (also in dungeon coordinates) and figure out how to paint (or discard) that pixel. It works, except… it draws (0,0) in the center of the screen, not the lower left.
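To illustrate the idea, here’s a boiled-down sketch (not my real shader; floorData is a stand-in for however I actually pass the map info in):

#version 430
out vec4 FragColor;
uniform sampler2D floorData; // hypothetical: map info, 1 texel per dungeon unit

void main()
{
    // gl_FragCoord.xy is the pixel position, which I treat as a dungeon coordinate
    vec4 cell = texelFetch(floorData, ivec2(gl_FragCoord.xy), 0);
    if (cell.a == 0.0)
        discard;             // nothing to paint here
    FragColor = vec4(cell.rgb, 1.0);
}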

Worse, there are some things, like lines (“walls”), that I render with skinny triangles in dungeon coordinates in a second pass. They don’t show up where I want them. (In fact, I’m pretty sure the triangles I’m using to tile the floor are also wrong and are only covering the screen by coincidence.)

I really, really need OpenGL to use a coordinate system that puts (0,0) at the lower left of the image and lets me specify triangle vertices in my units, which happen to map straight to pixels.

This seems like a simple case of scaling and translating. But I’m obviously applying the scale and translate incorrectly.
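Spelling out what I think the mapping has to be (a sketch; W and H stand in for my map size): NDC runs from -1 to +1 on both axes, so a point at x in [0, W] should land at 2*x/W - 1.

// Intended mapping: x_ndc = 2*x/W - 1, y_ndc = 2*y/H - 1,
// i.e. scale by 2/W (not 1/W), then shift so 0 lands at -1.
const float W = 1024.0f, H = 1024.0f; // placeholder map size
glm::mat4 toNdc = glm::translate(glm::mat4(1.0f), glm::vec3(-1.0f, -1.0f, 0.0f))
                * glm::scale(glm::mat4(1.0f), glm::vec3(2.0f / W, 2.0f / H, 1.0f));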

The vertex code is simple:

#version 430
layout (location = 0) in vec3 Position;
uniform mat4 gWorld;
out vec4 Color; //unused; the fragment shader calculates all colors

void main()
{
    gl_Position = gWorld * vec4(Position, 1.0);
}

Building the 2 triangles for the map floor (a simple rectangle for now) seems simple:

Vector3f Vertices[4];
Vertices[0] = Vector3f(0.f, 0.f, 0.0f);
Vertices[1] = Vector3f(0.f, mapEdges.maxs.y, 0.0f);
Vertices[2] = Vector3f(mapEdges.maxs.x, 0.f, 0.0f);
Vertices[3] = Vector3f(mapEdges.maxs.x, mapEdges.maxs.y, 0.0f);
glGenBuffers(1, &VBO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
unsigned int Indices[] = { 0, 1, 2,
                           1, 2, 3 };
glGenBuffers(1, &IBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);

and I use an indexed draw for them.
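For completeness, the draw itself is the usual indexed pattern, roughly like this (error checks omitted):

glBindBuffer(GL_ARRAY_BUFFER, VBO);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0); // feeds "location = 0" in the shader
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, 0);   // 6 indices = 2 triangles
glDisableVertexAttribArray(0);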

The C++ code (using glm) sets up the world matrix:

glUseProgram(ShaderProgram); //this selects the shader
gWorldLocation = glGetUniformLocation(ShaderProgram, "gWorld");
assert(gWorldLocation != 0xFFFFFFFF);

...and when rendering…

//try to fix openGL’s desire to think my buffer is -1 to 1 across
float scale = 1/1024.f; //test map is about 1024 units across
glm::mat4 sm = glm::scale(
    glm::mat4( 1.0f ),             
    glm::vec3( scale, scale, 1.0f )
    );
glm::mat4 ts = glm::translate(
    sm,
    glm::vec3( -512.0f, -512.0f, 0.0f ) //shove left and down
    );

glUniformMatrix4fv(gWorldLocation, 1, GL_TRUE, &ts[0][0]);

Since my test map is about 1024 units across, I’d have thought this would shove things into position. But no. The floor (which, remember, uses gl_FragCoord to decide where and what to draw) is painted from the screen center up and to the right, though it otherwise looks as I’d expect. The walls, which are painted with skinny triangles in dungeon coordinates, are nowhere to be seen, probably scaled off into the aether somewhere.
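Tracing ts by hand doesn’t reassure me either. Since glm::translate(sm, v) computes sm * T, points get translated first and then scaled:

glm::vec4 origin = ts * glm::vec4(   0.0f,    0.0f, 0.0f, 1.0f); // -> (-0.5, -0.5, 0, 1)
glm::vec4 corner = ts * glm::vec4(1024.0f, 1024.0f, 0.0f, 1.0f); // -> ( 0.5,  0.5, 0, 1)
// So on paper the map should only cover the middle half of clip
// space, which is already not the full-screen mapping I’m after.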

Basically, I’m not convincing OpenGL that I want x=0 to be the left edge of the image, and my scaling is obviously completely wrong. Sadly, I had one version that (incorrectly) drew some walls on the screen at one point, but I don’t have that code anymore. Still, it tells me I’m not completely off in generating the walls, just in laying them down.

How do I get OpenGL to use my units?

c++ · opengl · glm-math
asked on Stack Overflow May 22, 2021 by Scott M • edited May 22, 2021 by genpfault

1 Answer


You transpose the matrix when you set the matrix uniform. Since the vector is multiplied by the matrix from the right in your shader program, this is wrong. See GLSL Programming/Vector and Matrix Operations.

glUniformMatrix4fv(gWorldLocation, 1, GL_TRUE, &ts[0][0]);   // wrong

glUniformMatrix4fv(gWorldLocation, 1, GL_FALSE, &ts[0][0]);  // correct
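glm stores matrices in column-major order, which is exactly the layout glUniformMatrix4fv expects when the transpose argument is GL_FALSE. An illustrative check:

glm::mat4 t = glm::translate(glm::mat4(1.0f), glm::vec3(5.0f, 6.0f, 7.0f));
// column-major: t[3] is the 4th column, which holds the translation
// under the M * v (column vector) convention used in the shader
assert(t[3] == glm::vec4(5.0f, 6.0f, 7.0f, 1.0f)); // passes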

Instead of scaling and translating the vertices, you can set up an orthographic projection matrix with glm::ortho:

glm::mat4 projection = glm::ortho(0.0f, 1024.0f, 0.0f, 1024.0f, -1.0f, 1.0f);
glUniformMatrix4fv(gWorldLocation, 1, GL_FALSE, glm::value_ptr(projection));
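Note that glm::value_ptr comes from the <glm/gtc/type_ptr.hpp> header. If the window is not exactly 1024×1024, compute the projection from the actual framebuffer size instead, so that “1 unit is 1 pixel” keeps holding (fbWidth and fbHeight here are placeholders for however you track the framebuffer dimensions):

// build the projection from the real framebuffer size
glm::mat4 projection = glm::ortho(0.0f, float(fbWidth), 0.0f, float(fbHeight), -1.0f, 1.0f);
glUniformMatrix4fv(gWorldLocation, 1, GL_FALSE, glm::value_ptr(projection));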
answered on Stack Overflow May 22, 2021 by Rabbid76 • edited May 22, 2021 by Rabbid76
