Linux console display corrupted after exiting SDL application

Hello,

I wrote a very simple application that uses SDL. The application works great (thanks to SDL!), but after the application exits, the original text console display is corrupted.

I’m using SDL with an LFS build (6.2), so there’s no X11… just kernel support for framebuffer access.

After the application exits, the console text output appears as tiny pixels near the top of the screen (as if the entire console were packed into about 20 pixel rows). Each text character appears as a single pixel (or something close to it). The system is still fully functional (I can enter console commands and the system responds).

It’s like the video mode that existed prior to executing my application is not restored properly.

My console display is not corrupted if I run a simple application that calls SDL_Init() followed by SDL_Quit(). In other words, I have to call SDL_SetVideoMode() in order to trigger the screen corruption on exit.

Any ideas on this one? I think it’s something simple but I can’t seem to track it down.

I’m using Linux kernel 2.6.16.27 with the framebuffer driver enabled as a module (along with the new ATI Radeon framebuffer support). My version of SDL is 1.2.11 (the latest stable), compiled from source on my platform.

Any help is appreciated!

Kind Regards,

Jim

Do you have the “fbset” utility? Before running your application, play around with it and see if you can change your resolution/bpp and timings on the console. Then, after your display is corrupted, try manually restoring your resolution with this tool. If that works, then I would assume it’s a bug in SDL.
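
For example (a sketch; this assumes the console framebuffer is /dev/fb0 and that fbset is installed, and you should substitute whatever geometry fbset reports before the run):

# show the current console mode so you know what to restore
fbset

# after the corruption, force the geometry back (xres yres vxres vyres depth)
fbset -g 1024 768 1024 768 32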

Can you run the program a second time without corruption? I haven’t used the radeon driver since kernel 2.4.14.

Sean

After the application exits, the console text output appears as tiny pixels near the top of the screen (as if the entire console were packed into about 20 pixel rows).

SDL_Quit() is getting called, right?

It’s possible we broke this, though. fbcon doesn’t get a lot of attention, really.

–ryan.

Hi Sean,

I can run the application after the corruption without issue. When the application executes, the display renders perfectly via the SDL calls. After the application exits, the display returns to the corrupted console with the same visual effects. I can rerun the application with the same results (good display during execution, bad after exit).

I have an “atexit(SDL_Quit)” line in the code, which should guarantee the proper termination call. I do not free the surface returned by SDL_SetVideoMode()… there’s a line in the man page that says “don’t” because SDL_Quit() will take care of it.

I don’t have the fbset application. I’ll track down the source and build it… more results to follow :-).


Hello !

I have an “atexit(SDL_Quit)” line in the code, which should guarantee the proper termination call.

You should remove the atexit line and call SDL_Quit() explicitly when your app quits.
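
Something like this (a minimal sketch):

#include "SDL.h"

int main(int argc, char **argv)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;
    SDL_SetVideoMode(1024, 768, 32, SDL_FULLSCREEN);

    /* ... application main loop ... */

    SDL_Quit();  /* explicit cleanup instead of atexit(SDL_Quit) */
    return 0;
}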

CU

More updates…

I’m setting the video mode to 1024 x 768. For some strange reason only 3 modes are offered when inspecting the mode list. The other two are obscure resolutions (I’d type them in but I don’t recall exactly what they were).
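
For reference, the mode list can be dumped with SDL_ListModes(); a minimal sketch:

#include "SDL.h"
#include <stdio.h>

int main(int argc, char **argv)
{
    SDL_Rect **modes;
    int i;

    SDL_Init(SDL_INIT_VIDEO);

    /* NULL format: query modes for the current pixel format */
    modes = SDL_ListModes(NULL, SDL_FULLSCREEN);
    if (modes == (SDL_Rect **)-1) {
        printf("All resolutions available.\n");
    } else if (modes == NULL) {
        printf("No fullscreen modes available.\n");
    } else {
        for (i = 0; modes[i]; i++)
            printf("%d x %d\n", modes[i]->w, modes[i]->h);
    }

    SDL_Quit();
    return 0;
}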

I’ve determined that the resolution after SDL_Quit() is 4x larger in both horizontal and vertical dimensions. So that would be 4096 x 3072. Very strange.

In some failures I can see my original 1024x768 screen repeated 4 times across the top of the screen. Usually I get a very small representation of my original console text display… but with only a pixel or two for each text character.

Is there a way to force the screen resolution back after SDL_Quit() completes?
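
One way to do it by hand is to snapshot the framebuffer mode with the FBIOGET_VSCREENINFO ioctl before SDL_Init() and write it back after SDL_Quit(). A rough sketch (untested; it assumes the console framebuffer is /dev/fb0):

#include "SDL.h"
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fb.h>

int main(int argc, char **argv)
{
    struct fb_var_screeninfo saved;
    int fd = open("/dev/fb0", O_RDWR);

    /* snapshot the console mode before SDL touches it */
    if (fd >= 0)
        ioctl(fd, FBIOGET_VSCREENINFO, &saved);

    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(1024, 768, 32, SDL_FULLSCREEN);
    SDL_Quit();

    /* write the saved mode back after SDL is done */
    if (fd >= 0) {
        ioctl(fd, FBIOPUT_VSCREENINFO, &saved);
        close(fd);
    }
    return 0;
}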

Kind Regards,

Jim


I’ve determined that the resolution after SDL_Quit() is 4x larger in both horizontal and vertical dimensions. So that would be 4096 x 3072. Very strange.

Also, can you post a small program that triggers this bug? Or will this alone do it:

#include "SDL.h"
int main(int argc, char **argv)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(1024, 768, 0, SDL_FULLSCREEN);
    SDL_Quit();
    return 0;
}

Thanks,
–ryan.

Hi,

Here’s the exact code that will duplicate the problem:

#include "SDL.h"

int main(int argc, char **argv)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(1024, 768, 32, SDL_FULLSCREEN | SDL_ANYFORMAT | SDL_HWSURFACE | SDL_DOUBLEBUF);
    SDL_Quit();
    return 0;
}

Here’s output of “dmesg | grep radeon”:

radeonfb: Found Intel x86 BIOS ROM Image
radeonfb: Retrieved PLL infos from BIOS
radeonfb: Reference=27.00 MHz (RefDiv=12) Memory=378.00 Mhz, System=338.00 MHz
radeonfb: PLL min 20000 max 40000
radeonfb: Monitor 1 type CRT found
radeonfb: EDID probed
radeonfb: Monitor 2 type no found
radeonfb (0000:01:00.0): ATI Radeon NH

I’m using Linux kernel 2.6.16.27 (built using LFS 6.2 and jhalfs). I’ve added a few items from BLFS, like NFS and its dependencies, along with inetd servers.

As always, any help is appreciated… now on to my V4L Hauppauge ImpactVCB issue :-).

Kind Regards,

Jim


Here’s the exact code that will duplicate the problem:

Thanks! I’ll take a look. It’ll be a few days until I can play with this, though, so if you have an urgent need, you might have to dig in yourself.

In the meantime, the first thing to do is start dropping flags from SetVideoMode() and see if, say, SDL_DOUBLEBUF is a buggy codepath.
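
Something like this makes the bisection easy (a sketch; pass an index 0-3 on the command line to drop one flag per run):

#include "SDL.h"
#include <stdlib.h>

int main(int argc, char **argv)
{
    /* flag sets from most to least aggressive; check the console after each run */
    Uint32 trials[] = {
        SDL_FULLSCREEN | SDL_ANYFORMAT | SDL_HWSURFACE | SDL_DOUBLEBUF,
        SDL_FULLSCREEN | SDL_ANYFORMAT | SDL_HWSURFACE,
        SDL_FULLSCREEN | SDL_ANYFORMAT,
        SDL_FULLSCREEN,
    };
    int i = (argc > 1) ? atoi(argv[1]) : 0;
    if (i < 0 || i > 3)
        i = 0;

    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(1024, 768, 32, trials[i]);
    SDL_Quit();
    return 0;
}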

–ryan.

Also, I’ve added this issue to the bug tracker so it isn’t forgotten:
http://bugzilla.libsdl.org/show_bug.cgi?id=384

–ryan.