Fog of war

Hi all,

I’m guessing this is something that has been done before, but I’m
trying to think of the best way to do a ‘fog-of-war’ effect…

I’ve got a screen display at 640x480x16bpp, which is a window onto a
much larger area (3.4k by 2.5k) and I want the “fog” to only reveal what
the character has seen at any time before that. Ideally what I want is a
2-stage fog, such that:

0 -> black pixels
1 -> greyed pixels - not as intense, but we've been here before
2 -> currently visible pixels.

It seems to me that this maps basically to a transparency effect, with
state ‘1’ = alpha 0.5. I already have the screen surface, so if I make
another surface just for the fog, I can use alpha blitting onto the
screen to black out the bits I don’t want. All I have to do to reveal
more area is blit a pre-prepared “visibility” image centred on the new
character position into the fog, and the next screen update will reveal
more area.

The thing is that I’m not sure how to set up an SDL software surface
with one colour (black) and 2 bits of alpha, and doing it at 8 bits per
pixel will significantly increase my RAM use because the actual map is
so large. Is it in fact possible?

Assuming it’s not… I can presumably call something like this (tidied
up: the type is SDL_Surface *, and SDL_MapRGB() returns a Uint32):

int fogW = 27 * 128; // 3456
int fogH = 20 * 128; // 2560
SDL_Surface *fog = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                        fogW, fogH, 16,
                                        0xFF00, 0x0000, 0x0000, 0x00FF); // ~17 MB!
Uint32 black = SDL_MapRGB(fog->format, 0, 0, 0);

… to get myself a 16-bit surface with an 8-bit colour channel and an
8-bit alpha channel. Then I just use 255, 128 and 0 as the alpha values
for the non-visited, seen-before and visible areas (SDL 1.2 treats alpha
255 as opaque, so the unseen areas stay solid), and draw the fog in
black so that the half-transparent ‘seen before’ areas come out grey
rather than tinting the underlying graphic, which shines through
wherever the alpha is low.

Is it possible to have just an alpha surface - no RGB? I guess I could
get away with just those three alpha values if it were possible :-)

Am I barking up the right tree, or is there a better/more efficient way
to do it ? I’ve noticed that at 640x480x16, with all the tiles, and some
large alpha-blitted objects, I only make ~40 fps. That’s sufficient, but
slightly lower than I’d hoped, even after converting surfaces to the
screen format and setting RLEACCEL on the alpha-blits … OTOH, it’s
only a Matrox G400 card on an Athlon 1800+ (1.5GHz), which is pretty
low-end by today’s standards, and it is running in Java :-)

Thanks in advance for any help - much appreciated.


IoDream wrote:

Hi, I found your problem interesting, but I won’t offer a solution
here because I don’t have one. Maybe wait for Sam Lantinga’s answer - he’s
the creator of SDL, and now works at Blizzard, the developers of the
well-known Warcraft series, where the concept of fog-of-war first appeared.

I guess Sam’s the man, then. Well, I gave him an indy, so hopefully he’ll
tell me what to do (and not where to go [grin]).

You said you run it in Java. Is there a Java/SDL binding I’m not aware of?

I took the jsdl project bindings (available at SourceForge) and
converted them to a CNI (rather than JNI) binding; you can then
compile using ‘gcj’ (the Java front-end to gcc).

The advantages are:

o It runs faster :-)
o Everything ends up in a single binary, and there’s no classpath
needed at runtime.

You’ll still need the shared libraries in the LD_LIBRARY_PATH (or cwd on
Win32) for sdl and libgcj, but the actual program is just a binary
without any properties needed in any configuration files. Far easier on
the end-user…

The disadvantages are:

o It’s harder work to get the Makefile sorted out (though it’s not
that bad)
o You have to deal with the classpath at compile-time (part of the
Makefile issue)

I’ve managed to get my 640x480x16 up to 67 fps now, having actually
set the clip rect - I thought I was setting it, but the rect was wrong :-)

If there’s sufficient interest, I’ll package up what I have with gcj and
put it on a website when it’s ready. There’s no sound in there yet (I’ve
been porting the bits I need, and I’ve not needed sound yet…) but most
of the other calls are there. I’ve not ported threads, since Java has
them natively, and the same goes for a few other things (for
SDL_GetTicks() I use System.currentTimeMillis(), for example), so it’s
by no means complete. It currently crashes on exit as well, so
presumably I’m overwriting some memory somewhere, which needs looking
into. I’m going to start replacing the calls to malloc() and free() in
the ‘C++’ code with JvNewByteArray(), which will then be
garbage-collected as needed.

Basically, it’s faster than JIT’d Java, slower than C++ (most of the
time), but has the built-in garbage collection and Java standard
libraries that make programming in Java a joy :-)