OpenGL Blending with textures converted from SDL_Surface


I wanted to try making a game with OpenGL and GLUT, but as it turns out, GLUT is not well suited to making games. So I switched to SDL 1.2 (this is for a sort of competition, so I can't use SDL 2). When I saw I could use OpenGL within SDL, I decided to do that, since I had already written the majority of my code with OpenGL. Now I'm having issues loading an image into an SDL_Surface and then converting it to an OpenGL texture, with OpenGL blending enabled. Here is the code I'm using (loadImage loads an SDL_Surface & loadTexture loads it into an OpenGL texture):

SDL_Surface * Graphics::loadImage(const char * filename) {
    SDL_Surface *loaded = nullptr;
    SDL_Surface *optimized = nullptr;

    loaded = IMG_Load(filename);

    if (loaded) {
        optimized = SDL_DisplayFormat(loaded);
        SDL_FreeSurface(loaded);
    }

    return optimized;
}

GLuint Graphics::loadTexture(const char * filename, GLuint oldTexId) {
    //return SOIL_load_OGL_texture(filename, SOIL_LOAD_AUTO, oldTexId, SOIL_FLAG_NTSC_SAFE_RGB | SOIL_FLAG_MULTIPLY_ALPHA);
    GLuint texId = 0;
    SDL_Surface *s = loadImage(filename);
    if (!s) return 0;

    if (oldTexId) glDeleteTextures(1, &oldTexId);

    glGenTextures(1, &texId);
    glBindTexture(GL_TEXTURE_2D, texId);

    int format;
    if (s->format->BytesPerPixel == 4) {
        if (s->format->Rmask == 0x000000ff)
            format = GL_RGBA;
        else
            format = GL_BGRA;
    } else if (s->format->BytesPerPixel == 3) {
        if (s->format->Rmask == 0x000000ff)
            format = GL_RGB;
        else
            format = GL_BGR;
    }

    glTexImage2D(GL_TEXTURE_2D, 0, s->format->BytesPerPixel, s->w, s->h, 0, format, GL_UNSIGNED_BYTE, s->pixels);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    SDL_FreeSurface(s);

    return texId;
}

I've been searching online for a solution to this issue quite a bit, and none of the solutions I found worked. This code actually works when I don't glEnable(GL_BLEND), but when I do enable it, nothing shows on screen anymore. I am fairly new to OpenGL, and I'm not sure I'm using glTexImage2D correctly.

The way I was loading images before I converted to SDL was with the SOIL library, and when I replace the loadTexture function's body with that commented-out first line, it actually works fine. But I'd rather have fewer external libraries and do everything graphics-side with SDL & OpenGL.

c++ · opengl · sdl
asked on Stack Overflow Mar 9, 2018 by Arthur Bouvier • edited Mar 10, 2018 by genpfault

1 Answer


The third argument of glTexImage2D is wrong:

glTexImage2D(GL_TEXTURE_2D, 0, s->format->BytesPerPixel, s->w, s->h, 0, format, GL_UNSIGNED_BYTE, s->pixels);

The third argument is internalFormat and must be one of the base internal formats:

GL_DEPTH_COMPONENT
GL_DEPTH_STENCIL
GL_RED
GL_RG
GL_RGB
GL_RGBA

Or one of the sized internal formats, which specifies the bits per channel.

So, in other words, for an 8-bit-per-channel texture your third argument should be one of:

GL_RGB
GL_RGB8
GL_RGBA
GL_RGBA8

Whereas the 7th argument, format, can be either RGB- or BGR-ordered (including the alpha variants GL_RGBA/GL_BGRA), the third argument, internalFormat, only accepts the RGB/RGBA forms, never the BGR ones.

So your red-mask check is still fine for the 7th argument, but the third argument (internalFormat) should be GL_RGBA when the surface has 4 bytes per pixel and GL_RGB when it has 3, or optionally the sized versions GL_RGBA8 or GL_RGB8.

glTexImage2D(GL_TEXTURE_2D, 0, /*GL_RGB or GL_RGBA*/, s->w, s->h, 0, format, GL_UNSIGNED_BYTE, s->pixels);
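Putting that together, one way to sketch the fix is a small helper that picks both values from the surface's pixel format. Note this is a minimal sketch: selectFormats is a hypothetical helper (not part of SDL or OpenGL), and the GL constants are copied from gl.h so the snippet compiles on its own.

```cpp
#include <cstdint>
#include <utility>

// GL enum values, inlined from <GL/gl.h> so this sketch is self-contained.
constexpr int GL_RGB  = 0x1907;
constexpr int GL_RGBA = 0x1908;
constexpr int GL_BGR  = 0x80E0;
constexpr int GL_BGRA = 0x80E1;

// Hypothetical helper: returns {internalFormat, format} for an
// 8-bit-per-channel surface. internalFormat never uses a BGR variant;
// only the pixel-transfer `format` reflects the surface's byte order.
std::pair<int, int> selectFormats(int bytesPerPixel, std::uint32_t rmask) {
    if (bytesPerPixel == 4)
        return { GL_RGBA, rmask == 0x000000ff ? GL_RGBA : GL_BGRA };
    return { GL_RGB, rmask == 0x000000ff ? GL_RGB : GL_BGR };
}
```

Then the upload becomes, assuming `s` is the loaded SDL_Surface:

```cpp
auto [internalFormat, format] =
    selectFormats(s->format->BytesPerPixel, s->format->Rmask);
glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, s->w, s->h, 0,
             format, GL_UNSIGNED_BYTE, s->pixels);
```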

Docs

answered on Stack Overflow Mar 10, 2018 by Zebrafish

User contributions licensed under CC BY-SA 3.0