OpenGL Fullscreen Crashing

Whether you're a newbie or an experienced programmer, any questions, help, or just talk of any language will be welcomed here.

Moderator: Coders of Rage

RandomDever
Chaos Rift Regular
Posts: 198
Joined: Thu Mar 26, 2009 8:42 pm
Current Project: My Engine
Programming Language of Choice: C++

OpenGL Fullscreen Crashing

Post by RandomDever »

OpenGL seems to crash every time I toggle fullscreen.
Is there some GL function I need to call before setting up the screen again?
BTW here's my code:

Code:

if( screen->flags & SDL_FULLSCREEN )
{
	Setup( screen->w, screen->h, screen->format->BitsPerPixel, screen->flags ^ SDL_FULLSCREEN );
}
else
{
	Setup( screen->w, screen->h, screen->format->BitsPerPixel, screen->flags | SDL_FULLSCREEN );
}
And the Setup() code:

Code:

if( ( screen = SDL_SetVideoMode( width, height, bpp, flags ) ) == NULL )
{
	//Print Error
}

SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
	 
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_BUFFER_SIZE, 32 );
	 
SDL_GL_SetAttribute( SDL_GL_ACCUM_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_ALPHA_SIZE, 8 );

glClearColor( 0.75F, 0.75F, 1.0F, 0 );
glClearDepth( 1.0f );
	 
glViewport( 0, 0, width, height );
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
glOrtho( 0, width, height, 0, 1, -1 );
glMatrixMode( GL_MODELVIEW );
glEnable( GL_TEXTURE_2D );
glLoadIdentity();
NOTE: I'm using SDL for window management.
Any help is GREATLY appreciated. :)
Nokurn
Chaos Rift Regular
Posts: 164
Joined: Mon Jan 31, 2011 12:08 pm
Favorite Gaming Platforms: PC, SNES, Dreamcast, PS2, N64
Programming Language of Choice: Proper C++
Location: Southern California
Contact:

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

You need to define "crash." Are you getting an error message? Where does the problem crop up if you step through the code with a debugger?

For starters you might want to try fixing your GL attributes. They must be set before SDL_SetVideoMode, as per SDL_GL_SetAttribute's documentation.
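For reference, the corrected ordering would look something like this (a sketch using the same attributes as your Setup(), with error handling trimmed; it assumes your flags include SDL_OPENGL):

```cpp
// SDL 1.2: GL attributes only take effect at the NEXT SDL_SetVideoMode
// call that creates a GL context, so they must come first.
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

// THEN create (or recreate) the window and context.
if ( ( screen = SDL_SetVideoMode( width, height, bpp, flags ) ) == NULL )
{
    fprintf( stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError() );
}
```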

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Nokurn wrote:You need to define "crash." Are you getting an error message? Where does the problem crop up if you step through the code with a debugger?

For starters you might want to try fixing your GL attributes. They must be set before SDL_SetVideoMode, as per SDL_GL_SetAttribute's documentation.
OK, so firstly: I believe it was an access violation reading 0x000005, or something like that.
And secondly... it was due to my failure to put the attributes BEFORE SetVideoMode, so thank you, Nokurn.
I pretty much copied this code from an article online, so that didn't work out too well. :roll:
But anyhow, since you seem to know more about OpenGL than me, here's another question:
What is the most efficient way to deal with texture reloading?
When I go into fullscreen, my quads lose their textures, and if I remember correctly that is supposed to happen; you just have to load them back in.
So is it better to do so manually?
Or by using a std::list that keeps a record of every texture it loads, and automatically reloads them when it goes into fullscreen?
Thank you. :)

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

RandomDever wrote:What is the most efficient way to deal with texture reloading?
Usually you will want some form of central asset management. It can be something fancy, or it can be something really simple, like this:

Code:

class TextureManager {
public:
    static TextureManager& getInstance()
    {
        static TextureManager instance;
        return instance;
    }

    ~TextureManager()
    {
        for (TextureMap::iterator i = textures_.begin(); i != textures_.end(); ) {
            delete i->second; // delete the Texture*, not the map pair
            i = textures_.erase(i);
        }
    }

    Texture* getTexture(std::string const& name)
    {
        TextureMap::iterator i = textures_.find(name);
        if (i == textures_.end()) {
            Texture* texture = new Texture(name);
            textures_[name] = texture;
            return texture;
        }
        return i->second;
    }

    void reload()
    {
        for (TextureMap::iterator i = textures_.begin(); i != textures_.end(); ++i) {
            delete i->second;
            i->second = new Texture(i->first);
        }
    }

private:
    typedef std::map<std::string, Texture*> TextureMap;
    TextureMap textures_;

    TextureManager() {}
    TextureManager(TextureManager const&);            // non-copyable:
    TextureManager& operator=(TextureManager const&); // declared, never defined
};
Of course, you'll probably want to use some form of lifetime management (reference counting is a popular choice), and I wouldn't advise using a singleton (maybe you'll want multiple texture pools). So the short answer is yes, maintain a list of textures that are currently in use, and reload them when you lose your OpenGL context (changing window modes, etc).

If you're interested in how other people implement asset management, I would definitely recommend looking at the Resources module in Jan Haller's Thor library.
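If you want to see the reference-counting idea in isolation, here's a minimal sketch using std::shared_ptr with a stub Texture type (everything here is hypothetical, not code from any real engine; a real Texture would wrap a GL texture ID):

```cpp
#include <map>
#include <memory>
#include <string>

// Hypothetical stand-in for a real GL texture wrapper.
struct Texture {
    explicit Texture(std::string const& n) : name(n), generation(next++) {}
    std::string name;
    int generation;    // lets us observe that reload() recreated the object
    static int next;
};
int Texture::next = 0;

class TextureCache {
public:
    // Return the cached texture, loading it on first use.
    std::shared_ptr<Texture> get(std::string const& name)
    {
        std::map<std::string, std::shared_ptr<Texture> >::iterator i = textures_.find(name);
        if (i != textures_.end())
            return i->second;
        std::shared_ptr<Texture> t(new Texture(name));
        textures_[name] = t;
        return t;
    }

    // Recreate every cached texture, e.g. after the GL context is lost.
    // Note: callers still holding their own shared_ptr keep the OLD object
    // alive; a real engine would reload in place or hand out handles instead.
    void reload()
    {
        for (std::map<std::string, std::shared_ptr<Texture> >::iterator i = textures_.begin();
             i != textures_.end(); ++i)
            i->second.reset(new Texture(i->first));
    }

private:
    std::map<std::string, std::shared_ptr<Texture> > textures_;
};
```

The shared_ptr does the lifetime bookkeeping for you: the last owner to let go frees the texture, so there is no manual delete at all.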

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Nokurn wrote:
RandomDever wrote:What is the most efficient way to deal with texture reloading?
Usually you will want some form of central asset management. It can be something fancy, or it can be something really simple, like this:

[...]
Of course, you'll probably want to use some form of lifetime management (reference counting is a popular choice), and I wouldn't advise using a singleton (maybe you'll want multiple texture pools). So the short answer is yes, maintain a list of textures that are currently in use, and reload them when you lose your OpenGL context (changing window modes, etc).

If you're interested in how other people implement asset management, I would definitely recommend looking at the Resources module in Jan Haller's Thor library.
Would it be ill advised to build the asset manager directly into my video manager? (Seeing as all textures are loaded through it.)
I ask because I want this to be my final engine. (And by 'final' I mean something I can use for longer than one project, unlike the rest of my engines.)
I want my final engine to be efficient and clean.
That's why I spent a while making the debug and engine classes accessible from anywhere.
Anyway... Tangent... :)
Falco Girgis
Elysian Shadows Team
Posts: 10294
Joined: Thu May 20, 2004 2:04 pm
Current Project: Elysian Shadows
Favorite Gaming Platforms: Dreamcast, SNES, NES
Programming Language of Choice: C/++
Location: Studio Vorbis, AL
Contact:

Re: OpenGL Fullscreen Crashing

Post by Falco Girgis »

Make sure that you are deallocating and reallocating all textures if you are dynamically changing video resolutions. You can run into some obscure bullshit otherwise.
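Concretely, the toggle sequence would look something like this (a sketch only; DeleteAllTextures and ReloadAllTextures are hypothetical helpers standing in for whatever reload mechanism you end up with):

```cpp
// Sketch of a context-safe toggle (SDL 1.2, names matching the thread's Setup()).
void ToggleFullscreen()
{
    // 1. Free GL resources that belong to the old context.
    DeleteAllTextures();                 // hypothetical helper

    // 2. Recreate the window/context (attributes are set inside Setup,
    //    before its SDL_SetVideoMode call).
    Setup( screen->w, screen->h, screen->format->BitsPerPixel,
           screen->flags ^ SDL_FULLSCREEN );

    // 3. Re-upload everything the new context needs.
    ReloadAllTextures();                 // hypothetical helper
}
```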

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Falco Girgis wrote:Make sure that you are deallocating and reallocating all textures if you are dynamically changing video resolutions. You can run into some obscure bullshit otherwise.
Okay I will. Thanks. :)

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

RandomDever wrote:Would it be ill advised to build the asset manager directly into my video manager?
It depends on what exactly your video manager does. If you're loading textures through it, it probably is already some form of asset manager and you just need to extend it.

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Nokurn wrote:
RandomDever wrote:Would it be ill advised to build the asset manager directly into my video manager?
It depends on what exactly your video manager does. If you're loading textures through it, it probably is already some form of asset manager and you just need to extend it.
Here's a list of all the functions:

Code:

Setup //Sets up a window
SetCaption //Sets the caption of the window
LoadTexture //Loads a texture from a png
DeleteTexture //Deletes a texture
ReloadTexture //Deletes and reloads a texture
BlitTexture //Renders a texture on the screen to a quad
Clear //Clears the screen
Render //Swaps the buffers
Resize //Resizes the window
ToggleFullscreen //Self-explanatory
IsFullscreen //Returns true if screen->flags & SDL_FULLSCREEN
That's it right now, but it's early in development.

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

RandomDever wrote:
Here's a list of all the functions:

Code:

Setup //Sets up a window
SetCaption //Sets the caption of the window
LoadTexture //Loads a texture from a png
DeleteTexture //Deletes a texture
ReloadTexture //Deletes and reloads a texture
BlitTexture //Renders a texture on the screen to a quad
Clear //Clears the screen
Render //Swaps the buffers
Resize //Resizes the window
ToggleFullscreen //Self-explanatory
IsFullscreen //Returns true if screen->flags & SDL_FULLSCREEN
That's it right now, but it's early in development.
Personally, I would part that class out (see the single responsibility principle). It could be split into classes like Renderer and TextureManager. Also, if you're using OpenGL seriously, you probably shouldn't have a function named "BlitTexture" anywhere in your code. Unless you have a very specific reason for targeting video cards that lack support for VBOs (OpenGL 1.5) or VAOs (OpenGL 3.0), you shouldn't be rendering in immediate mode. You'll get MUCH better performance by using VBOs instead; this may or may not be an issue for your game, depending on how many quads you'll be rendering at once in the final version. If you're planning on reusing your code, however, I'd definitely recommend implementing a better system for rendering quads.
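As a rough sketch of that split, here is the class above parted out by responsibility (all names hypothetical, bodies stubbed out; the real versions would wrap the SDL/GL calls):

```cpp
#include <string>

class Window {                       // windowing / video-mode concerns
public:
    Window() : fullscreen_(false) {}
    void toggleFullscreen() { fullscreen_ = !fullscreen_; } // would wrap SDL_SetVideoMode
    bool isFullscreen() const { return fullscreen_; }
private:
    bool fullscreen_;
};

class TextureManager {               // asset lifetime concerns
public:
    TextureManager() : lastId_(0) {}
    int load(std::string const& pngPath) { return ++lastId_; } // stub: real code uploads to GL
    void reloadAll() {}                                        // called after a context loss
private:
    int lastId_;
};

class Renderer {                     // drawing concerns
public:
    void clear() {}
    void drawTexturedQuad(int texture, float x, float y, float w, float h) {}
    void present() {}                // would swap buffers
};
```

Each class can now change (or be replaced) without touching the others, which is the point of the single responsibility principle.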

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Nokurn wrote:
Personally, I would part that class out (see the single responsibility principle). It could be split into classes like Renderer and TextureManager. Also, if you're using OpenGL seriously, you probably shouldn't have a function named "BlitTexture" anywhere in your code. Unless you have a very specific reason for targeting video cards that lack support for VBOs (OpenGL 1.5) or VAOs (OpenGL 3.0), you shouldn't be rendering in immediate mode. You'll get MUCH better performance by using VBOs instead; this may or may not be an issue for your game, depending on how many quads you'll be rendering at once in the final version. If you're planning on reusing your code, however, I'd definitely recommend implementing a better system for rendering quads.
I may separate those out, then.
But what is 'immediate mode'? I looked it up, and it seems to simply mean not using something called 'display lists'.
I don't know what those are, but I'll probably implement them, because they're apparently a lot more efficient.
BTW, the whole 'BlitTexture' thing is just what I call it, because that's what I'm used to.
It may be the wrong name for it, but IDK.
I want this to be as good an engine as it can be, so thank you for all your advice. :)

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

RandomDever wrote:
I may separate those out, then.
But what is 'immediate mode'? I looked it up, and it seems to simply mean not using something called 'display lists'.
I don't know what those are, but I'll probably implement them, because they're apparently a lot more efficient.
BTW, the whole 'BlitTexture' thing is just what I call it, because that's what I'm used to.
It may be the wrong name for it, but IDK.
I want this to be as good an engine as it can be, so thank you for all your advice. :)
You are using "immediate mode" whenever you send the vertices you want to be rendered when you want them to be rendered. It's bad for a number of reasons, but the primary reason is that you can't render until your vertices (and other data) are done being sent to the video card. This takes time, and it's really inefficient to do every single frame. You can avoid this in OpenGL by using vertex buffer objects, which allow you to pre-load your vertices (and other data).

Display lists are an improvement, and are probably the simplest optimization to implement, but in my experience, they don't quite reach the level of performance or flexibility that you get from a well-tuned VBO system. With display lists, you define sequences of OpenGL calls that are stored on the GPU, which can later be executed directly on the GPU, therefore avoiding the overhead involved in sending each command to the GPU in real time. Display lists can only call a limited subset of OpenGL.
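For concreteness, the display-list pattern looks roughly like this (a sketch: record the commands once at load time, replay them each frame):

```cpp
// Record once, at load time.
GLuint quadList = glGenLists( 1 );
glNewList( quadList, GL_COMPILE );
    glBegin( GL_QUADS );
        glTexCoord2f( 0.0f, 0.0f ); glVertex2f( 0.0f,  0.0f );
        glTexCoord2f( 1.0f, 0.0f ); glVertex2f( 64.0f, 0.0f );
        glTexCoord2f( 1.0f, 1.0f ); glVertex2f( 64.0f, 64.0f );
        glTexCoord2f( 0.0f, 1.0f ); glVertex2f( 0.0f,  64.0f );
    glEnd();
glEndList();

// Replay every frame: one call instead of ten.
glCallList( quadList );

// Free it when you no longer need it.
glDeleteLists( quadList, 1 );
```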

I assume that you are using immediate mode in a function called BlitTexture because the name suggests that it operates directly on the Texture and has no existing knowledge of the geometry. With no knowledge of the geometry, the geometry must be generated on-the-fly, and rendered immediately.

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

Nokurn wrote:
You are using "immediate mode" whenever you send the vertices you want to be rendered when you want them to be rendered. It's bad for a number of reasons, but the primary reason is that you can't render until your vertices (and other data) are done being sent to the video card. This takes time, and it's really inefficient to do every single frame. You can avoid this in OpenGL by using vertex buffer objects, which allow you to pre-load your vertices (and other data).

Display lists are an improvement, and are probably the simplest optimization to implement, but in my experience, they don't quite reach the level of performance or flexibility that you get from a well-tuned VBO system. With display lists, you define sequences of OpenGL calls that are stored on the GPU, which can later be executed directly on the GPU, therefore avoiding the overhead involved in sending each command to the GPU in real time. Display lists can only call a limited subset of OpenGL.

I assume that you are using immediate mode in a function called BlitTexture because the name suggests that it operates directly on the Texture and has no existing knowledge of the geometry. With no knowledge of the geometry, the geometry must be generated on-the-fly, and rendered immediately.
Yes, 'BlitTexture', based on your description, does use immediate mode.
It would be fairly easy to make it use a display list, but I'll have to do some research on VBOs after I get some sleep.
Essentially I'm building this engine to be as fast as possible while trying to maintain some ease of use. (As well as finally implementing a bunch of cool debug features to make it a lot easier to find bugs.)
And also to make a 2D platformer. :)

Re: OpenGL Fullscreen Crashing

Post by Nokurn »

RandomDever wrote:Yes 'BlitTexture', based on your description, does use immediate mode.
It would be fairly easy to make it use a display list, but I'll have to do some research on VBOs, after I get some sleep.
Because essentially I am building this engine to be as fast as possible, while trying to maintain some ease of use. (As well as finally implementing a bunch of cool debug features to make it a lot easier to find bugs.)
And also to make a 2D platformer. :)
Keep in mind that premature optimization can often cause more problems than it solves. In this case, however, I don't think you will regret implementing a proper renderer.

Also, try to focus on the game aspect of the project. Too often people start out to write a game, then start writing an engine, then get sidetracked by the engine. Many engines grow naturally as you develop a game. Once the game is done, you can pull out the engine code and then work on it as an engine. Or, you could incorporate it in another game, let it grow, and continue the cycle. You'll end up with a much more mature engine than if you specifically wrote an engine.

Re: OpenGL Fullscreen Crashing

Post by RandomDever »

OK so I've looked a bit at VBOs and I have this pseudo-code:

Code:

GLuint colorBuffer = 0;    // buffer IDs are GLuints, not pointers; 0 means "no buffer"
GLuint positionBuffer = 0;

// One RGBA color per vertex: four vertices need four colors.
GLubyte colors[] = {
	255, 255, 255, 128,
	255, 255, 255, 128,
	255, 255, 255, 128,
	255, 255, 255, 128
};

glGenBuffers( 1, &colorBuffer ); // was &colors: pass the ID variable, not the array
glBindBuffer( GL_ARRAY_BUFFER, colorBuffer );
glBufferData( GL_ARRAY_BUFFER, sizeof( colors ), colors, GL_STATIC_DRAW ); // data pointer and usage hint were missing
glColorPointer( 4, GL_UNSIGNED_BYTE, 0, 0 );

GLfloat positions[] = { -1.0F, -1.0F, 1.0F, -1.0F, 1.0F, 1.0F, -1.0F, 1.0F };

glGenBuffers( 1, &positionBuffer ); // was &positions
glBindBuffer( GL_ARRAY_BUFFER, positionBuffer );
glBufferData( GL_ARRAY_BUFFER, sizeof( positions ), positions, GL_STATIC_DRAW );
glVertexPointer( 2, GL_FLOAT, 0, 0 );

glEnableClientState( GL_VERTEX_ARRAY );
glEnableClientState( GL_COLOR_ARRAY );

glDrawArrays( GL_QUADS, 0, 4 ); //Not entirely sure how this works

glDisableClientState( GL_COLOR_ARRAY );
glDisableClientState( GL_VERTEX_ARRAY );
I don't exactly know what the third parameter of glDrawArrays does, but besides that, is this code correct?
Also, the example I drew from to make the above code draws the square as two triangles (for some reason),
but it doesn't use a texture, so how would I add that?