SDL_SetVideoMode not working

Hi Folks,

I’ve set up SDL according to a tutorial (I’ve used SDL before, but removed it from my system a while back) and now it doesn’t work.
A test program compiles fine, so all my libraries are in the right place and all settings are correct.

When I call SDL_Init there are no errors, but on calling

ScrDisplay = SDL_SetVideoMode( SCREEN_WIDTH, SCREEN_HEIGHT, SCREEN_BPP, SDL_HWSURFACE );

ScrDisplay remains NULL, so what’s going on?? My video card supports hw surfaces, 4 bpp and a standard 800x600 res, so I can’t see any problems on a hardware level…

I’ve recently installed the DirectX sdk and tools from Microsoft…could this have screwed things up??

Many Thanks
Ed
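[For anyone hitting this later: when SDL_SetVideoMode() returns NULL in SDL 1.2, SDL_GetError() usually says why. A minimal sketch of that check, assuming SDL 1.2 and mirroring the 800x600 request above; falling back to a software surface is one common recovery, not necessarily the right one for every program:]

```c
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Note: the third argument is BITS per pixel, not bytes. */
    SDL_Surface *screen = SDL_SetVideoMode(800, 600, 32, SDL_HWSURFACE);
    if (screen == NULL) {
        /* SDL_GetError() explains why the mode could not be set. */
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        /* Retrying with a software surface is a common fallback. */
        screen = SDL_SetVideoMode(800, 600, 32, SDL_SWSURFACE);
    }

    SDL_Quit();
    return screen ? 0 : 1;
}
```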


On 12/17/06, Edward Byard <e_byard at yahoo.co.uk> wrote:

SrcDisplay remains NULL so what’s going on?? My video card supports hw
surfaces, 4 bpp and a standard 800x600 res so I can’t see any problems on a
hardware level…

4 bpp? Is that short for 4 bits per pixel (16 colors) or 4 bytes per
pixel? SDL_SetVideoMode()'s color-depth argument is in bits, not bytes.

I don’t know, but I could imagine that your system won’t run with a
4-bit hardware surface.

Rasmus Neckelmann

Hello!

4 bpp? Is that short for 4 bits per pixel (16 colors) or 4 bytes per
pixel? SDL_SetVideoMode()'s color-depth argument is in bits, not bytes.

I don’t know, but I could imagine that your system won’t run with a
4-bit hardware surface.

Even if your system could run with 4-bit surfaces, the problem is
that, if I remember correctly, SDL does not support such unusual
bit depths.

Today’s standard bit depths are 8, 16, 24 and 32 bits.

CU

Got it, thanks very much!

----- Original Message -----

From: wizard@syntheticsw.com (Torsten Giebl)
To: A list for developers using the SDL library. (includes SDL-announce)
Sent: Sunday, 17 December, 2006 8:55:19 PM
Subject: Re: [SDL] SDL_SetVideoMode not working

Hello !

4 bpp? Is that short for 4 bits per pixel (16 colors) or 4 bytes per
pixel? SDL_SetVideoMode()'s color-depth argument is in bits, not bytes.

I don’t know, but I could imagine that your system won’t run with a
4-bit hardware surface.

Even if your system would run with 4 bit Surfaces.
The problem is that, if i remember correctly, SDL
does not support that weird bitdepths.

Today standard bitdepths are 8 Bit, 16 Bit, 24 Bit and
32 Bit.

CU


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl


The easy way to tell if it’s bits or bytes is by context. 4 bpp definitely means ‘bytes’ because 4 bytes x 8 bits/byte = 32 bits, a very common number for bits per pixel, especially if his resolution is a ‘standard 800x600’.

> Date: Sun, 17 Dec 2006 21:49:22 +0100
> From: neckelmann at gmail.com
> To: sdl at libsdl.org
> Subject: Re: [SDL] SDL_SetVideoMode not working
>
> On 12/17/06, Edward Byard <e_byard at yahoo.co.uk> wrote:
> > SrcDisplay remains NULL so what’s going on?? My video card supports hw
> > surfaces, 4 bpp and a standard 800x600 res so I can’t see any problems on a
> > hardware level…
>
> 4 bpp? Is that short for 4 bits per pixel (16 colors) or 4 bytes per
> pixel? SDL_SetVideoMode()'s color-depth argument is in bits, not
> bytes.
>
> I don’t know, but I could imagine that your system won’t run with a
> 4-bit hardware surface.
>
> –
> Rasmus Neckelmann



On 12/18/06, Jonathan Dearborn wrote:

The easy way to tell if it’s bits or bytes is by context. 4 bpp definitely
means ‘bytes’ because 4 bytes x 8 bits/byte = 32 bits, a very common number
for bits per pixel, especially if his resolution is a ‘standard 800x600’.

4 bits per pixel was a very standard bit depth “back in the days” when
CGA, EGA and dinosaurs ruled the planet, but of course isn’t of that
much concern anymore. Maybe Edward, for some strange reason, desired
to use that bit depth anyway.

Rasmus Neckelmann

Nope, I just cut ‘n’ pasted some example code which had this 4 bpp as a define - so that’s what threw me.
Never mind, it all works now :)

Ed

The easy way to tell if it’s bits or bytes is by context. 4 bpp definitely
means ‘bytes’ because 4 bytes x 8 bits/byte = 32 bits, a very common number
for bits per pixel, especially if his resolution is a ‘standard 800x600’.

4 bits per pixel was a very standard bitdepth “back in the days” when
CGA, EGA and dinosaurs ruled the planet, but of course isn’t of that
much concern anymore. Maybe Edward, for some strange reason, desired
to use that bitdepth anyway.

I started with 1 bit… I pwn you!

Anyway, bpp means bits per pixel. I always used it as bits per pixel.
So as I am the reference, I win!

More seriously, check:

And, if I remember correctly, VGA uses 4 bpp. But as my brain has
been shut down for Xmas (yeah I know, 1 week early), I can’t tell.

Regards and best wishes to anybody, even if you think I’m a kook.

On 18 Dec 2006, at 17:48, Rasmus Neckelmann wrote:



Kuon
CEO - Goyman.com SA
http://www.goyman.com/

“Computers should not stop working when the users’ brain does.”
