Hide output of SDL_SetVideoMode

I’m wondering if, by SDL version 1.3, a new flag can be added to
SDL_SetVideoMode such that one can do all the output in a frame or other
application window, like this example:
http://code.technoplaza.net/wx-sdl/part1
without having a background window come up that isn’t being used at
all, and do all the work in just the SDL surfaces with an output
frame.

A new flag to go with the two existing flags, SDL_RESIZABLE and SDL_NOFRAME…

I’d suggest adding the flag SDL_NOVIDEOOUTPUT for this type of circumstance.
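
For illustration, the proposal would look something like this at the call site. Note that SDL_NOVIDEOOUTPUT is only the name suggested above and does not exist in any SDL release, so this is a sketch of the request rather than working code:

/* Hypothetical: SDL_NOVIDEOOUTPUT is the flag proposed in this thread,
   not a real SDL constant, so this does not compile against SDL as-is. */
SDL_Surface *buf = SDL_SetVideoMode(640, 480, 32,
                                    SDL_SWSURFACE | SDL_NOVIDEOOUTPUT);
/* No window would appear; rendering would go only to buf, which the
   application could then copy into its own GUI frame. */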

What about the null video driver?


How do I enable that… and also keep my hardware-based acceleration?

On 7/3/06, René Dudfield wrote:

What about the null video driver?

How do I enable that… and also keep my hardware-based acceleration?

The wxWidgets example program that was posted isn’t using an SDL
hardware surface at all. They could have stripped out the SDL code and
used a malloc for the SDL_CreateRGBSurface() call, since they requested
a SWSURFACE and needed to explicitly lock it to touch the pixels
anyhow… nor is it using a hardware-accelerated blit (it copies the SDL
software surface to a wxBitmap).

Frankly, that example code would probably run faster if they removed the
SDL dependency. :)

–ryan.
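
For context, here is a minimal sketch of the pattern being described: a plain software surface created with SDL_CreateRGBSurface(), locked, and written to directly, with no hardware surface or accelerated blit anywhere. The masks and pixel pattern are only illustrative.

#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *buf;
    Uint32 *pixels;
    int x, y;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* SDL_SWSURFACE: plain memory, much like a malloc'd w*h*4 buffer. */
    buf = SDL_CreateRGBSurface(SDL_SWSURFACE, 640, 480, 32,
                               0x00FF0000, 0x0000FF00, 0x000000FF, 0);
    if (buf == NULL) {
        SDL_Quit();
        return 1;
    }

    if (SDL_MUSTLOCK(buf))
        SDL_LockSurface(buf);

    pixels = (Uint32 *) buf->pixels;
    for (y = 0; y < buf->h; y++)
        for (x = 0; x < buf->w; x++)
            pixels[y * (buf->pitch / 4) + x] = (Uint32) ((x ^ y) & 0xFF);

    if (SDL_MUSTLOCK(buf))
        SDL_UnlockSurface(buf);

    /* The wx-sdl example then copies buf->pixels into a wxBitmap by hand
       instead of blitting to an SDL screen surface. */

    SDL_FreeSurface(buf);
    SDL_Quit();
    return 0;
}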

Yeah, well, I did a bit more and found an easier method (although I wish
there were a function that would reproduce the desired effect): replacing
this call…

SDL_SetVideoMode(0, 0, 0, SDL_SWSURFACE);

with…

SDL_VideoInit("dummy", SDL_SWSURFACE);

But it’s a very unorthodox and unusual method, and I would prefer to
instead call a function like…

void* SDL_SDL_VideoRun();

which automatically initializes the drivers such that it won’t fail.
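
Spelled out, the unorthodox workaround above would look roughly like this. SDL_VideoInit() does exist in SDL 1.2, but it is normally called internally by SDL_Init(), so treat this as a sketch of the approach described rather than a recommended API:

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    /* Bring up the video subsystem on the dummy target directly,
       so no window is ever created. */
    if (SDL_VideoInit("dummy", SDL_SWSURFACE) < 0) {
        fprintf(stderr, "SDL_VideoInit failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Software surfaces, blits and conversions work from here on,
       with no on-screen output at all. */

    SDL_VideoQuit();
    return 0;
}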


SDL_SetVideoMode(0, 0, 0, SDL_SWSURFACE);

with…

SDL_VideoInit("dummy", SDL_SWSURFACE);

I think Sam is adding (or already added) something like this for SDL
1.3, so you can enumerate and select a specific video driver.

Naturally, there’s a difference between SDL_Init and SDL_SetVideoMode,
even in 1.2 …

but it’s a very unorthodox and unusual method, and I would prefer to
instead call a function like…

void* SDL_SDL_VideoRun();

which automatically initializes the drivers such that it won’t fail.

Can’t you just call SDL_CreateRGBSurface() without calling
SDL_SetVideoMode()? Maybe it’s illegal, but I think it should work as
long as SDL_Init() succeeded. (I think…) This could be handy for
getting access to, say, the surface format converters, even if you don’t
specifically plan to use SDL to blit to the screen.

–ryan.
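
For what it’s worth, a minimal sketch of that idea: software surfaces plus SDL_ConvertSurface(), with SDL_Init() called but SDL_SetVideoMode() never called. Whether this is officially supported is exactly the open question above; the sizes and masks are just examples.

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *rgb, *rgba, *converted;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* A 24-bit surface and a 32-bit surface with alpha; no screen surface. */
    rgb  = SDL_CreateRGBSurface(SDL_SWSURFACE, 64, 64, 24,
                                0x00FF0000, 0x0000FF00, 0x000000FF, 0);
    rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, 64, 64, 32,
                                0x00FF0000, 0x0000FF00, 0x000000FF, 0xFF000000);

    /* Use SDL's format converter without ever touching the screen. */
    converted = SDL_ConvertSurface(rgb, rgba->format, SDL_SWSURFACE);
    if (converted == NULL)
        fprintf(stderr, "SDL_ConvertSurface failed: %s\n", SDL_GetError());

    SDL_FreeSurface(converted);
    SDL_FreeSurface(rgba);
    SDL_FreeSurface(rgb);
    SDL_Quit();
    return 0;
}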

It probably would have, but it was an example of how to initialize an
SDL surface in wxWidgets, not a tutorial on how to use the wxWidgets
client drawing features.

Anyway, some extra tweaks on that would have done a lot more (I have also
had to modify it to create a pause/play system, which complicated the
setup a lot).

Hello !

On 7/3/06, Ryan C. Gordon wrote:

Can’t you just call SDL_CreateRGBSurface() without calling
SDL_SetVideoMode()? Maybe it’s illegal, but I think it should work as
long as SDL_Init() succeeded. (I think…) This could be handy for getting
access to, say, the surface format converters, even if you don’t
specifically plan to use SDL to blit to the screen.

I do this. I call SDL_Init(SDL_INIT_VIDEO)
with the Dummy Video driver and then use the SDL
blitting functions for my app. I do not set any video mode with
SDL_SetVideoMode. I also use SDL_CreateRGBSurface
a lot in my app to create tiles:

http://www.turricane.net/diary/images/T4ENewEd.png

CU
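
A rough sketch of that workflow as described: select the dummy driver, call SDL_Init(), never call SDL_SetVideoMode(), and compose tiles with software surfaces and the ordinary blitter. The make_tile helper and the sizes are only illustrative.

#include "SDL.h"

/* Illustrative helper: allocate one solid-colored software tile. */
static SDL_Surface *make_tile(int w, int h, Uint32 color)
{
    SDL_Surface *tile = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32,
                                             0x00FF0000, 0x0000FF00,
                                             0x000000FF, 0);
    SDL_FillRect(tile, NULL, color);
    return tile;
}

int main(int argc, char *argv[])
{
    SDL_Surface *map, *tile;
    SDL_Rect dst;

    SDL_putenv("SDL_VIDEODRIVER=dummy");   /* pick the dummy driver */
    SDL_Init(SDL_INIT_VIDEO);              /* no SDL_SetVideoMode() anywhere */

    map  = SDL_CreateRGBSurface(SDL_SWSURFACE, 320, 240, 32,
                                0x00FF0000, 0x0000FF00, 0x000000FF, 0);
    tile = make_tile(32, 32, 0x00336699);

    /* Compose the tile into the map with the software blitter. */
    dst.x = 64;
    dst.y = 64;
    SDL_BlitSurface(tile, NULL, map, &dst);

    SDL_FreeSurface(tile);
    SDL_FreeSurface(map);
    SDL_Quit();
    return 0;
}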

Hello !

I do this. I call SDL_Init(SDL_INIT_VIDEO)
with the Dummy Video driver

But when using the Dummy Video driver you should
also be able to use SDL_SetVideoMode; I never
tested it, though.

CU

What is the Dummy Video driver?



Phantom Lord
Caelis Studios —> From Gods Hands To Yours

Hello !

What is the Dummy Video driver?

It is a video driver that does nothing; okay, it does a little bit. It
emulates the video functions using the software blitters, but it does
not open a window or output anything on the screen.

SDL_putenv("SDL_VIDEODRIVER=dummy");
or just set the variable on the OS you
are using and then start the app.

CU

What is the Dummy Video driver?

It’s an SDL video target that does nothing. SDL_SetVideoMode() will claim
to succeed, and it’ll supply you with a memory buffer that is supposed
to be the screen surface, but it doesn’t ever create a window, etc. It
has no platform-specific dependencies, since all it does is allocate a
memory buffer and not much else.

It’s partially meant for bootstrapping SDL on a new platform (you can
just use the dummy video driver so you know the library is working
before implementing any platform-specific code), but it’s also handy for
unusual cases like this wxWidgets thing we were discussing (using video
functions, but not using SDL to render to the screen), or benchmarking
your program, in cases when you care about your code’s performance and
not the screen blit overhead. It’s 2D only, and thus can’t be used with
OpenGL.

There is also a dummy audio target, which is good for the same reasons,
and also for games where the developer didn’t supply a means to turn off
the sound output.

You can explicitly choose the dummy drivers, if they were compiled in,
with an environment variable:

in bash:
SDL_VIDEODRIVER=dummy ./mygame

in Windows:
set SDL_VIDEODRIVER=dummy
mygame.exe

(or you can use SDL_putenv() in your app.)
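
For the dummy audio target mentioned above, the analogous variable is, I believe, SDL_AUDIODRIVER, selected the same way:

in bash:
SDL_AUDIODRIVER=dummy ./mygame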

We’ll have a real mechanism for selecting drivers that doesn’t use
environment variables in SDL 1.3.

–ryan.
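
One way to confirm which driver actually got picked is SDL_VideoDriverName(), which is available in SDL 1.2; a small sketch:

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    char name[32];

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Reports e.g. "dummy" when SDL_VIDEODRIVER=dummy was set. */
    if (SDL_VideoDriverName(name, sizeof(name)) != NULL)
        printf("video driver: %s\n", name);

    SDL_Quit();
    return 0;
}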

Hello !

We’ll have a real mechanism for selecting drivers that doesn’t use
environment variables in SDL 1.3.

Then there is no way to force a
specific video driver from the command line?

CU

Then there is no way to force a
specific video driver from the command line?

I presume we’ll leave the environment variables there, but you will also
be able to control it from the app via an API, without using them.

–ryan.