Window management (SDL 1.3)

Hi again,

below is another bit of (arguably pointless) code that I would expect to
do the following:

  1. create a window (and hide it)
  2. show that window after 2 seconds
  3. hide that window after yet another 2 seconds
  4. destroy the window and wait 2 seconds.
  5. shut down sdl

Compiling this with SDL 1.3 (zipped distribution, revision 4948) on
Ubuntu Linux 10.04 with a 2.6.32-27-generic kernel and NVIDIA driver
195.36.24, the SDL_ShowWindow and SDL_HideWindow functions have
absolutely no effect (other than setting the window flags appropriately,
that is). Strangely enough, even after calling SDL_DestroyWindow, the
window remains visible (it is actually even raised above all other
windows), and is only destroyed after SDL_Quit is called.

It seems that on Linux, SDL window management is still a little buggy.
Does anyone else here have these issues on Linux? As I do not have a
Windows environment set up, could someone test what the code below
actually does on Windows (and perhaps also Mac)?

Thanks in advance. Here’s the code:

#include <cstdlib>
#include <iostream>
#include <SDL/SDL.h>

struct Window
{
  Window() : m_Window(0), m_Context(0) {}
  ~Window() {}

  bool create(int px, int py, int dx, int dy, Uint32 flags = 0)
  {
    bool rval = false;

    if (!m_Window)
    {
      // Always create the window shown; hide it afterwards if the caller
      // did not ask for SDL_WINDOW_SHOWN.
      m_Window = SDL_CreateWindow("", px, py, dx, dy,
                                  flags | SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
      m_Context = SDL_GL_CreateContext(m_Window);

      if (m_Window && !(flags & SDL_WINDOW_SHOWN))
      {
        hide();
      }

      rval = (m_Window != 0);
    }

    return rval;
  }

  void destroy()
  {
    if (m_Window)
    {
      SDL_GL_DeleteContext(m_Context);
      SDL_DestroyWindow(m_Window);
      m_Window = 0;
      std::cerr << "destroy(): done\n";
    }
  }

  void show()
  {
    if (m_Window)
    {
      SDL_ShowWindow(m_Window);
      // Report whether the SHOWN flag is now set.
      bool shown = (SDL_GetWindowFlags(m_Window) & SDL_WINDOW_SHOWN) != 0;
      std::cerr << "show(): " << (shown ? "yes" : "no") << std::endl;
    }
  }

  void hide()
  {
    if (m_Window)
    {
      SDL_HideWindow(m_Window);
      // Report whether the SHOWN flag is now cleared.
      bool hidden = !(SDL_GetWindowFlags(m_Window) & SDL_WINDOW_SHOWN);
      std::cerr << "hide(): " << (hidden ? "yes" : "no") << std::endl;
    }
  }

  SDL_Window *  m_Window;
  SDL_GLContext m_Context;
};

int main(int, char **)
{
  if (SDL_Init(SDL_INIT_VIDEO) < 0)
  {
    exit(-1);
  }

  int posx = SDL_WINDOWPOS_CENTERED;
  int posy = SDL_WINDOWPOS_CENTERED;
  int winw = 320;
  int winh = 200;

  Window win;

  win.create(posx, posy, winw, winh);
  SDL_Delay(2000);
  win.show();
  SDL_Delay(2000);
  win.hide();
  SDL_Delay(2000);
  win.destroy();
  SDL_Delay(2000);

  SDL_Quit();

  // test if window is destroyed by SDL_Quit or program exit
  //while (true) {}

  return 0;
}

Quoth Matthias Schweinoch <matthias.schweinoch at gmx.de>, on 2011-01-18 22:31:48 +0100:

It seems that on Linux, SDL window management is still a little
buggy. Does anyone else here have these issues on Linux? As I do not
have a windows environment set up, could someone test what the code
below actually does on windows (perhaps also Mac)?

I’m hesitant to butt in without having read enough of the source or
tried any of this myself, but I strongly suspect you folks are running
into X being fundamentally a two-way stream underneath rather than
just synchronous procedure calls. Newbie X programmers often run into
this; requests such as those for making windows appear get buffered in
the client library unless you explicitly force them to be processed
now, much like with modern OpenGL rendering. There are implicit syncs
at enough points that a program using X in a “normal” fashion is fine,
but one that queues up a few requests and then hangs and doesn’t give
the client library a chance to push them will run into trouble.
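
If that is what is happening, one way to test it without touching SDL
might be to give the client library a chance to run after each window
call instead of only sleeping, e.g. by pumping events. A rough, untested
sketch against the test program above (assuming SDL_PumpEvents on X11
ends up reading the connection and thereby pushing the buffered requests):

  // Untested workaround sketch: pump events after each window-state change
  // so the X client library gets a chance to send its buffered requests,
  // rather than sleeping with the requests still queued inside the process.
  win.create(posx, posy, winw, winh);
  SDL_PumpEvents();   // may flush the X output buffer as a side effect
  SDL_Delay(2000);

  win.show();
  SDL_PumpEvents();
  SDL_Delay(2000);

  win.hide();
  SDL_PumpEvents();
  SDL_Delay(2000);

  win.destroy();
  SDL_Delay(2000);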

In an ideal world, I wouldn’t consider this a “bug” but rather a
platform difference that should be respected, but in practice I
suspect the most viable option might be to have SDL do more X flushes
if the above is true (otherwise more people will complain regardless).

Huff.

—> Drake Wilson

Is there anything wrong with SDL telling X to flush every time it changes something about the window?
------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/

Quoth Nathaniel J Fries , on 2011-01-18 14:20:28 -0800:

Is there anything wrong with SDL telling X to flush everytime it changes something about the window?

The buffering is just for performance; if all X requests took zero
resources it wouldn’t matter ever. I doubt it would make a huge
difference if it were restricted to synchronous window configuration;
it’s possible there are other X interactions I haven’t thought of.
The application might take slightly longer to spin up. It depends on
the use case, but I suspect SDL on high-latency X is rare.

I’m not sure what the impact would be on other platforms; are there
platforms where it’s valuable to do those operations asynchronously?
If so, the alternative of telling people that SDL operations that talk
to the video backend may not happen until you do [an explicit flush or
event pump] might be worthwhile. In particular, I’m imagining it being
a little more consistent with the defined behavior of OpenGL.

Again, this is largely educated speculation since I’m not in the right
context to actually examine the source to see what it’s doing.

—> Drake Wilson

Maybe… make a new function "void SDL_X11_Flush(SDL_Window*)" or similar?
------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/

Isn’t that exactly the sort of platform-specific implementation detail that SDL
is supposed to encapsulate?

One thing that has me a little worried is that the original post stated that after each command the program was told to wait two seconds. If this was a buffering issue inside of the window manager, and two seconds was not enough time for the window manager to process its event queue (or whatever it has), wouldn’t this cause most windowed applications to have severe lag?

If the program failed to consistently change the window state (as visible to the user) in a timely manner, I’d believe it to be a buffer and lag issue. But it sounds like these aren’t being processed at all. Or does X somehow tag some commands with a “process when, and only if, you feel like it”?

I think a SDL_X11_Flush(SDL_Window*) would be useful for testing, but I agree that it doesn’t follow the SDL encapsulation methodology.

Another possible solution might be to have something like SDL_WM_EnableFlush(SDL_Window*,int) (0 = disable), which flags the flush function to be called after every SDL_WM_* function, allowing a programmer to decide between snappy WM responses and faster overall application speed (when SDL_WM_* is used).
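
Purely as illustration, the kind of toggle meant here might look roughly
like the sketch below (none of these names exist in SDL; the backend
function is invented for illustration):

  // Hypothetical sketch only: neither SDL_WM_EnableFlush nor the backend
  // function below exists in SDL; the names are invented for illustration.
  #include <SDL/SDL.h>
  #include <X11/Xlib.h>

  static int g_wm_flush_enabled = 1;       // default: flush (snappy WM responses)

  void SDL_WM_EnableFlush_sketch(SDL_Window * window, int enable)
  {
      (void)window;                        // could also be tracked per window
      g_wm_flush_enabled = enable;         // 0 = disable, as suggested above
  }

  // Each X11 window-management function in the backend would then end with:
  void X11_ShowWindow_sketch(Display * display, Window xwindow)
  {
      XMapRaised(display, xwindow);
      if (g_wm_flush_enabled)
      {
          XFlush(display);                 // push the queued request to the server now
      }
  }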


Quoth MBrening <micah.brening at gmail.com>, on 2011-01-18 19:19:49 -0800:

One thing that has me a little worried is that the original post
stated that after each command the program was told to wait two
seconds. If this was a buffering issue inside of the window manager
and two seconds was not enough time for the window manager to
process its event queue (or whatever it has) wouldn’t this cause
most windowed applications to have some severe lag?

The behavior I’m describing isn’t inside the window manager or the X
server. Neither of those have gotten to see your window management
request yet. This is part of the X client library. At least in the
historical days of Xlib, making an X call would just queue the command
internally until you did an XFlush or XSync, requested input (think
SDL_PumpEvents), or did something else that would prod the X library into
running its send and receive logic. If you’re sleeping, and you have
the control flow of the only thread, the X client library never gets
to run and the data just stays in your process forever.
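
To make the mechanism concrete, here is a minimal plain-Xlib sketch
(independent of SDL; build with -lX11): without the XFlush below, the map
request can sit in the client-side buffer for the entire sleep and the
window never appears.

  #include <X11/Xlib.h>
  #include <unistd.h>

  int main()
  {
      Display * dpy = XOpenDisplay(0);
      if (!dpy) return 1;

      Window w = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 320, 200, 0, 0, 0);

      XMapWindow(dpy, w);   // only queued in the client library so far
      XFlush(dpy);          // without this (or an XSync, or reading events),
                            // the request may never reach the server
      sleep(2);             // the window is visible during the sleep

      XDestroyWindow(dpy, w);
      XCloseDisplay(dpy);   // closing the display flushes as well
      return 0;
  }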

Avoiding extra round-trips was really good back when an X terminal
might be running over a leased modem line, and can still be good for
networked X, but I don’t think it’s much of a concern for SDL.

I also don’t like the idea of making this sort of thing toggleable;
either SDL window management should act synchronously or it should act
asynchronously. The latter is nice in a way because it’s consistent
with OpenGL, which is commonly used alongside SDL. The former is nice
because it’s consistent with SDL 1.2 and is somewhat more predictable.
I tend to prefer synchronous window management in this context.

Incidentally, I have now looked at the source, though I haven’t tried
the test program above since I haven’t had time to try compiling SDL
1.3. SDL 1.2 from Debian does an XSync at the end of X11_SetVideoMode
in src/video/x11/SDL_x11video.c. SDL 1.3 from Mercurial doesn’t seem
to do anything similar for the fancier window implementation. So that
would lend credence to this being the cause of the behavior Matthias
observed.

—> Drake Wilson


Maybe… make a new function “void SDL_X11_Flush(SDL_Window*)” or similar ?

Isn’t that exactly the sort of platform-specific implementation detail that SDL
is supposed to encapsulate?
I think this is simple enough to understand (and describe, and write
the code for) to justify it in the API, though something like
"SDL_FlushVideo" would be a better name.> Date: Tue, 18 Jan 2011 18:27:54 -0800 (PST)
From: Mason Wheeler
To: sdl at lists.libsdl.org
Subject: Re: [SDL] Window management (SDL 1.3)

From: Nathaniel J Fries
Subject: Re: [SDL] Window management (SDL 1.3)

Incidentally, I have now looked at the source, though I haven’t tried
the test program above since I haven’t had time to try compiling SDL
1.3. SDL 1.2 from Debian does an XSync at the end of X11_SetVideoMode
in src/video/x11/SDL_x11video.c. SDL 1.3 from Mercurial doesn’t seem
to do anything similar for the fancier window implementation. So that
would lend credence to this being the cause of the behavior Matthias
observed.

I hadn’t thought that it could be an X11 issue - precisely because for SDL 1.2, I never had to “bother” with it. It’s a valuable tip. I’ll add an XSync to my test program and see if that fixes my problem.

If this actually fixes it, then I would also favor a solution like SDL_VideoFlush or similar, which could then do whatever is appropriate for the platform (i.e. nothing for those without asynchronous behaviour, and some kind of sync for those with).

It looks like the (missing?) XSync/XFlush really is the problem here: I
modified the SDL sources to provide a function SDL_VideoFlush, which for
x11 calls the XFlush function. Calling that SDL_VideoFlush() function
after changing window properties then actually does yield the
(requested) results.
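
For anyone who wants to try the same thing without patching SDL:
something like the following should have a similar effect from
application code. This is an untested sketch; it assumes
SDL_GetWindowWMInfo and SDL_SysWMinfo are available in this SDL 1.3
revision and look the way they do in the headers I have here.

  // Untested sketch: flush the X connection belonging to an SDL window from
  // application code (assumes the SDL 1.3 SDL_syswm.h interface; link -lX11).
  #include <SDL/SDL.h>
  #include <SDL/SDL_syswm.h>
  #include <X11/Xlib.h>

  void flush_window_display(SDL_Window * window)
  {
      SDL_SysWMinfo info;
      SDL_VERSION(&info.version);   // tell SDL which struct layout we expect

      if (SDL_GetWindowWMInfo(window, &info) && info.subsystem == SDL_SYSWM_X11)
      {
          XFlush(info.info.x11.display);   // push queued requests to the X server
      }
  }

  // usage: SDL_ShowWindow(window); flush_window_display(window);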

It also explains why I only got the window to become visible after I
created an OpenGL context: the function X11_GL_CreateContext actually
performs an XSync.

Sam: Would you consider this issue a bug regarding the X11 platforms
(i.e. is the XSync/XFlush just missing from those X11 functions that
change the window state), or is this some performance related feature?
Would you consider it worthwhile adding a SDL_VideoFlush function to
force display synchronization on platforms with non-synchronous
displays? Another option (which would probably make it a little bit more
comfortable for programmers working on X11 platforms) might be to have
some kind of “force synchronous” flag/state for SDL, which, if set,
would cause the XSync or XFlush to be called.

Yes, this would be a bug. The low level video functions are responsible for
making the changes visible as they are requested.

I’ll go ahead and take care of that. Thanks!


-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

Oh, never mind: It seems that X11 provides this functionality out of the
box with XSynchronize. However, in the current SDL sources, XSynchronize
is only set to true for X11_DEBUG builds. I assume the default is for it
to be disabled.
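
For reference, turning the whole connection synchronous is a one-liner in
Xlib; it is handy for debugging but makes every request a server round
trip, so it is far too slow for normal use. A minimal sketch:

  #include <X11/Xlib.h>

  // Debug aid: make the Xlib connection fully synchronous, so every request
  // is sent (and any X error reported) immediately instead of being buffered.
  void make_display_synchronous(Display * dpy)
  {
      XSynchronize(dpy, True);   // pass False to return to normal buffering
  }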

Sam, would it be possible to expose the function to set the display to
synchronized via SDL?

Okay, this is fixed in the repository:

BTW, you don’t have any error checking or message if creating the window
fails. :)
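
A minimal version of the missing check might look like this (sketch
against the create() function quoted at the top of the thread):

  // Sketch: report the failure via SDL_GetError() instead of silently
  // continuing with a null window.
  m_Window = SDL_CreateWindow("", px, py, dx, dy,
                              flags | SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
  if (!m_Window)
  {
      std::cerr << "SDL_CreateWindow failed: " << SDL_GetError() << std::endl;
      return false;
  }

  m_Context = SDL_GL_CreateContext(m_Window);
  if (!m_Context)
  {
      std::cerr << "SDL_GL_CreateContext failed: " << SDL_GetError() << std::endl;
  }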

See ya!


-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

Sam: perhaps it would be better to do as was suggested and allow the programmer to decide whether or not to automatically synchronize with the X server?
I think that in the long term, SDL could see use in distributed X environments where constant flushing/synchronization would be undesirable. Perhaps using an “SDL_EnableVideoFlush/SDL_DisableVideoFlush” API, with it defaulting to enabled?
------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/

I only added flushing for major window operations that are synchronous on
other platforms. Rendering remains the way it is. :)


-Sam Lantinga, Founder and President, Galaxy Gameworks LLC