OpenGL Textures with SDL

corey__cakes
ES Beta Backer
Posts: 23
Joined: Mon Dec 23, 2013 7:56 pm
Current Project: Canadian Simulator 2017
Favorite Gaming Platforms: GBA, DC, iOS, SNES, WS, 360, N64
Programming Language of Choice: C++, Lua
Location: Your VCR.

OpenGL Textures with SDL

Post by corey__cakes »

I've been learning how to program OpenGL with SDL in C++ and I've reached a point where my mind has been obliterated. I'm no computer scientist, so OpenGL feels like god-like powers for game development. What I'm trying to wrap my head around is the bits and bytes and depth size and buffer size, or at least that's what I believe I don't understand. Let me explain my problem.

I tried to load and display a texture loaded by SDL as an SDL_Surface*. To my amazement, it works! Kind of... it loads the image and displays it on a quad, but the colors aren't correct. The image is just 5 squiggly lines in red, green, blue, yellow, and rosy pink on a white background. When it's displayed in the program, the blue is red, the red is blue, the yellow is cyan, the pink is a washed-out lavender, and the green is a lighter shade of green.

I don't even know where to begin with debugging this. Is it the image format (a 24-bit bitmap)? Is it a problem with the flags I passed to glTexImage2D? This is what I have for the texture; the surface is called img and the texture is textur:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, img->w, img->h, 0, GL_RGB, GL_UNSIGNED_BYTE, img->pixels);

Someone please just tell me what's happening. Do I need to use an alternative texture loader? I'd like to use SDL as much as possible. Do I need to explain my code more?
What do you call a cow with no legs?

Ground beef.
bbguimaraes
Chaos Rift Junior
Posts: 294
Joined: Wed Apr 11, 2012 4:34 pm
Programming Language of Choice: c++
Location: Brazil
Contact:

Re: OpenGL Textures with SDL

Post by bbguimaraes »

Not an OpenGL expert either, but you have to make sure you pass the right format to glTexImage2D. It all depends on the way you're loading the image data, and since you didn't mention that, I can't help you much. Make sure you understand what the parameters are (docs) and check the documentation of the library you're using; it should mention the format the image data is stored in when it's loaded into memory.
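For example, SDL itself can tell you how the pixels of a loaded surface are laid out. A minimal sketch (assuming SDL2 and the surface named img from your post) that just prints the format name and channel masks:

Code: Select all

#include <SDL2/SDL.h>
#include <stdio.h>

// Sketch: dump the pixel layout of an already-loaded surface so you can
// see which byte order the data is really in.
void dump_surface_format(const SDL_Surface *img)
{
    printf("format: %s\n", SDL_GetPixelFormatName(img->format->format));
    printf("bytes per pixel: %d, pitch: %d\n",
           (int)img->format->BytesPerPixel, img->pitch);
    // The masks tell you which bits of a pixel hold each channel.
    printf("Rmask %08x  Gmask %08x  Bmask %08x  Amask %08x\n",
           (unsigned)img->format->Rmask, (unsigned)img->format->Gmask,
           (unsigned)img->format->Bmask, (unsigned)img->format->Amask);
}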
dandymcgee
ES Beta Backer
Posts: 4709
Joined: Tue Apr 29, 2008 3:24 pm
Current Project: https://github.com/dbechrd/RicoTech
Favorite Gaming Platforms: NES, Sega Genesis, PS2, PC
Programming Language of Choice: C
Location: San Francisco
Contact:

Re: OpenGL Textures with SDL

Post by dandymcgee »

Definitely a texture loading issue. Your bytes are either ordered incorrectly or aligned incorrectly.

Try replacing GL_RGB with GL_RGB8. If that doesn't work, refer to the docs linked above.
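If it turns out to be alignment rather than ordering, the usual fix looks something like this (a sketch, assuming desktop GL and the img surface from the first post):

Code: Select all

// GL assumes every row of pixel data starts on a 4-byte boundary by
// default; a 24-bit image whose width isn't a multiple of 4 will shear.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// If the surface pads its rows, tell GL the real row length in pixels
// instead of letting it assume w * 3 tightly packed bytes.
glPixelStorei(GL_UNPACK_ROW_LENGTH, img->pitch / img->format->BytesPerPixel);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, img->w, img->h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, img->pixels);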
Falco Girgis wrote:It is imperative that I can broadcast my narcissistic commit strings to the Twitter! Tweet Tweet, bitches! :twisted:
ChrisGorsky
ES Beta Backer
Posts: 7
Joined: Tue Jan 13, 2015 9:37 pm
Programming Language of Choice: C++
Location: Long Island, New York

Re: OpenGL Textures with SDL

Post by ChrisGorsky »

Not sure if this will help you, but I found this old segment of code I wrote to create OpenGL textures from SDL surfaces.

Code: Select all

   
    SDL_Surface* surface = IMG_Load(path);
    GLuint textureId;

    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_2D, textureId);

    // Set filtering; the default min filter expects mipmaps, which this
    // code never generates, and an incomplete texture won't display.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // 4 bytes per pixel means the surface has an alpha channel.
    GLenum mode = (surface->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, surface->w, surface->h, 0,
                 mode, GL_UNSIGNED_BYTE, surface->pixels);

    SDL_FreeSurface(surface);
    glDisable(GL_TEXTURE_2D);
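One caveat with picking the format from BytesPerPixel alone: it tells you how many channels there are, but not what order they're in. A hypothetical helper (names are mine, assuming SDL2 on a little-endian machine) that also checks the channel masks:

Code: Select all

// Hypothetical helper: choose the GL upload format from the surface's masks.
// On little-endian machines Rmask == 0x000000ff means the red byte comes
// first in memory, which is what GL_RGB / GL_RGBA expect; otherwise the
// data is in blue-first order and needs GL_BGR / GL_BGRA.
GLenum gl_format_for(const SDL_Surface *s)
{
    if (s->format->BytesPerPixel == 4)
        return (s->format->Rmask == 0x000000ff) ? GL_RGBA : GL_BGRA;
    return (s->format->Rmask == 0x000000ff) ? GL_RGB : GL_BGR;
}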
Programmer, Tae Kwon Do enthusiast, fashion expert.
corey__cakes
ES Beta Backer
Posts: 23
Joined: Mon Dec 23, 2013 7:56 pm
Current Project: Canadian Simulator 2017
Favorite Gaming Platforms: GBA, DC, iOS, SNES, WS, 360, N64
Programming Language of Choice: C++, Lua
Location: Your VCR.

Re: OpenGL Textures with SDL

Post by corey__cakes »

I tried using every possible flag (which probably isn't good) and GL_BGR WORKS. The question is, why does it work? Do I need to read the OpenGL Red Book to fully understand why?
What do you call a cow with no legs?

Ground beef.
bbguimaraes
Chaos Rift Junior
Posts: 294
Joined: Wed Apr 11, 2012 4:34 pm
Programming Language of Choice: c++
Location: Brazil
Contact:

Re: OpenGL Textures with SDL

Post by bbguimaraes »

That is... strange. I'd say your image loading code (whether it's your own or a library's) is loading the image data incorrectly, or at least in an unusual way. If you're not familiar with how image data is stored, I recommend this recent video from Computerphile.

GL_BGR is basically the same as GL_RGB, except the red and blue channels are swapped: the blue channel becomes the red, the red becomes the blue, and the green stays where it is. If you think about it, it fits your situation:

Code: Select all

"When it is displayed in the program, the blue is red"
intended color (RGB):   R: 0  G: 0  B: 1
bytes in memory (BGR):  B: 1  G: 0  R: 0   -> read as RGB: red

"the red is blue"
intended color (RGB):   R: 1  G: 0  B: 0
bytes in memory (BGR):  B: 0  G: 0  R: 1   -> read as RGB: blue

"the yellow is cyan"
intended color (RGB):   R: 1  G: 1  B: 0
bytes in memory (BGR):  B: 0  G: 1  R: 1   -> read as RGB: cyan
So the problem is a mismatch between the format the image data is actually in and the format you're telling OpenGL it's in. Using GL_BGR here is, IMHO, a hack. You should fix the image loading code so the data ends up in GL_RGB order, then you don't have to do this translation when passing it to OpenGL.
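With plain SDL2 that fix can be as simple as converting the surface before uploading it. A sketch (assuming SDL 2.x; SDL_PIXELFORMAT_RGB24 means byte order R, G, B on every platform, which is exactly what GL_RGB expects):

Code: Select all

// Convert whatever SDL loaded into plain R,G,B byte order, then upload
// with a matching format.
SDL_Surface *rgb = SDL_ConvertSurfaceFormat(img, SDL_PIXELFORMAT_RGB24, 0);
if (rgb) {
    // 3-byte pixels, so rows may not be 4-byte aligned.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, rgb->w, rgb->h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgb->pixels);
    SDL_FreeSurface(rgb);
}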
bbguimaraes
Chaos Rift Junior
Posts: 294
Joined: Wed Apr 11, 2012 4:34 pm
Programming Language of Choice: c++
Location: Brazil
Contact:

Re: OpenGL Textures with SDL

Post by bbguimaraes »

Just for fun, I swapped the channels on a random image to complement my explanation. I was actually surprised I could do that with image manipulation software (GIMP). Here is the original and here is the swapped one.
ChrisGorsky
ES Beta Backer
Posts: 7
Joined: Tue Jan 13, 2015 9:37 pm
Programming Language of Choice: C++
Location: Long Island, New York

Re: OpenGL Textures with SDL

Post by ChrisGorsky »

I ran into the same issue; for me it depended on whether I was on my Mac or my PC. This was my solution:

Code: Select all

#ifdef __APPLE__
#define IMG_MODE GL_BGR
#define IMG_MODE_A GL_BGRA
#endif

#ifdef _WIN32
#define IMG_MODE GL_RGB
#define IMG_MODE_A GL_RGBA
#endif

//Called in the texture load function
int mode = (surface->format->BytesPerPixel == 4) ? IMG_MODE_A : IMG_MODE;
Programmer, Tae Kwon Do enthusiast, fashion expert.
dandymcgee
ES Beta Backer
Posts: 4709
Joined: Tue Apr 29, 2008 3:24 pm
Current Project: https://github.com/dbechrd/RicoTech
Favorite Gaming Platforms: NES, Sega Genesis, PS2, PC
Programming Language of Choice: C
Location: San Francisco
Contact:

Re: OpenGL Textures with SDL

Post by dandymcgee »

ChrisGorsky wrote:I ran into the same issue, for me it depended on if I was on my Mac or PC this was my solution:
That sounds like some endianness bullshit.
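For anyone who hasn't bumped into that term, here's a tiny standalone sketch (not tied to any code in this thread) of what endianness does to a packed 32-bit pixel:

Code: Select all

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    // A packed 32-bit pixel written as 0xRRGGBBAA.
    uint32_t pixel = 0x11223344u;
    const uint8_t *bytes = (const uint8_t *)&pixel;

    // Little-endian machines print "44 33 22 11" (least significant byte
    // first), big-endian machines print "11 22 33 44". That byte order is
    // what decides whether a packed pixel format lines up with GL_RGBA or
    // GL_BGRA in memory.
    printf("%02x %02x %02x %02x\n", bytes[0], bytes[1], bytes[2], bytes[3]);
    return 0;
}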
Falco Girgis wrote:It is imperative that I can broadcast my narcissistic commit strings to the Twitter! Tweet Tweet, bitches! :twisted: