Displaying 8bpp surfaces in a 16bpp environment

I’m trying to display an 8bpp surface in my 16bpp X environment under
Linux, and what should be grey is coming out purple. I do have 8bpp
settings in my XF86Config file, though. When I display the program
remotely on my HP at work (which runs in 8bpp), it displays grey. Any
ideas why? I’m sure I’m just missing something simple. Thanks!

Ti Leggett
leggett at eecs.tulane.edu

I’m trying to display an 8bpp surface in my 16bpp X environment under
linux and what should be grey is coming out purple.

A 16bpp mode stores fewer bits per colour channel than the 8-bit palette
entries of an 8bpp mode, which can lead to unexpected results. What you
observe could be something like this:

Your grey colour, say #808080, in binary:

   red      green     blue
10000000  10000000  10000000

Your screen only stores 5 bits of red and blue, and 6 bits of green, thus:

10000     100000    10000

If we extend this to 8 bits, rounding correctly, we get

10000100 10000010 10000100

which is grey with a purplish tint. It’s also possible to get greenish
tints, depending on what shade of grey you started with.

To avoid this, either choose your colours so as to minimize the effect,
or dither your image to the target display format.

If you are seeing a more saturated purple, chances are that something
else is wrong (such as your code, or an SDL blit bug).

I’m not sure this is the case. See, I have two programs. One is a simple
starfield demo that uses three shades of grey; all of them appear grey or
white. The other, the one I’m having problems with, loads two raw
greyscale 8bpp images and blends them. On my Linux box it displays as
purple-scale (i.e., all greys are purple); displayed from my Linux box to
my HP, it displays greyscale.

On Tue, 31 Jul 2001, Mattias Engdegard wrote:

First try displaying the same image using another tool: blend your images
and save the result to a file, then view it with xv, showimage (which comes
with SDL_image), GIMP, or whatever you prefer. If the image looks purple,
give us a URL to it. Otherwise, condense your program to 20 lines or so and
show the source code, so we can see if you are doing something wrong
(it must still compile, of course).

I would try this, but these are raw images and I don’t know of a viewer
under Linux that supports them. I’ll go ahead and paste my code.

Sorry for the length, but I condensed as much as I could. If anyone wants
the raw images I’m using, let me know and I’ll send them via personal
email. It’s C++.

BEGIN CODE

#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include "SDL.h"

SDL_Surface *surface;

/* cross-fade image1 into image2; coef ranges from 0 to 255 */
void blendImage( unsigned char *image1, unsigned char *image2, int coef )
{
    Uint8 *p = ( Uint8 * )surface->pixels;

    for ( int i = 0; i < 64000; i++ )
        p[i] = image1[i] + ( ( image2[i] - image1[i] ) * coef >> 8 );
}

int main()
{
    int videoFlags;

    if ( SDL_Init( SDL_INIT_VIDEO ) < 0 )
        return 1;

    videoFlags  = SDL_DOUBLEBUF;
    videoFlags |= SDL_HWPALETTE;
    videoFlags |= SDL_HWSURFACE;
    videoFlags |= SDL_HWACCEL;

    surface = SDL_SetVideoMode( 320, 200, 8, videoFlags );
    if ( !surface )
        return 1;

    /* greyscale ramp; note i >> 2 gives component values 0..63 */
    SDL_Color colors[256];
    for ( int i = 0; i < 256; i++ )
        colors[i].r = colors[i].g = colors[i].b = i >> 2;
    SDL_SetPalette( surface, SDL_LOGPAL | SDL_PHYSPAL, colors, 0, 256 );

    unsigned char *pic1 = new unsigned char[64000],
                  *pic2 = new unsigned char[64000];
    FILE *raw_file = fopen( "picture1.raw", "rb" );
    fread( pic1, 1, 64000, raw_file );
    fclose( raw_file );
    raw_file = fopen( "mtbrison.raw", "rb" );
    fread( pic2, 1, 64000, raw_file );
    fclose( raw_file );

    Uint32 startTime = SDL_GetTicks( );

    bool done = false;
    while ( !done )
    {
        Uint32 testVal = ( ( SDL_GetTicks( ) - startTime ) / 10 ) % 1024;

        if ( SDL_MUSTLOCK( surface ) )
            if ( SDL_LockSurface( surface ) < 0 )
                return 1;

        if ( testVal < 256 )
            memcpy( surface->pixels, pic1, 64000 );
        else if ( testVal < 512 )
            blendImage( pic1, pic2, testVal - 256 );
        else if ( testVal < 768 )
            memcpy( surface->pixels, pic2, 64000 );
        else
            blendImage( pic2, pic1, testVal - 768 );

        if ( SDL_MUSTLOCK( surface ) )
            SDL_UnlockSurface( surface );

        SDL_Flip( surface );

        SDL_Event event;
        while ( SDL_PollEvent( &event ) )
        {
            switch ( event.type )
            {
            case SDL_QUIT:
                done = true;
                break;
            case SDL_KEYDOWN:
                if ( event.key.keysym.sym == SDLK_ESCAPE )
                    done = true;
                break;
            }
        }
    }

    delete [] pic1;
    delete [] pic2;
    /* the surface from SDL_SetVideoMode is freed by SDL_Quit */
    SDL_Quit( );
    return 0;
}
