As I see it, all this is just a headache.
I have to admit that I don't have any solid theoretical background in
dynamic compilation or SIMD extensions, so everything below is strictly IMHO…
There are several ways to use the full capabilities of the CPU
you're running on:
-
Distribute the sources and let the compiler on the target machine do
all the needed optimizations; e.g. gcc on an Athlon XP would use
CFLAGS="-O3 -march=athlon-xp -fomit-frame-pointer -mfpmath=sse"
Classical examples are MPlayer or Gentoo Linux (I use it, and it's faaaast!)
There are several drawbacks: it can realistically be done only on *NIX
systems, since on Windows very few people have a C/C++ environment
installed, and even fewer have the same one you have. Many people
distribute GNU Makefiles for their projects alongside Visual C++ project
files; I'd say they are just a bit masochistic.
Even on *NIX systems, compiling from source is surely more complex than
doing "rpm -i package" or the like; not all users have gcc and make
installed. This counts double for videogame players.
-
Distribute several binary packages of your application, and let the
user choose which one to install.
The problem is that you'll have to distribute LOTS of files! For
example, you'll have, at least and for every release:
SDL-1.2.6-win32-i386.zip
SDL-1.2.6-win32-i686.zip
SDL-1.2.6-win32-pentium3.zip
SDL-1.2.6-win32-pentium4.zip
SDL-1.2.6-win32-athlon.zip
SDL-1.2.6-win32-athlon-xp.zip
SDL-1.2.6-linux-i386.tgz
SDL-1.2.6-linux-i686.tgz
SDL-1.2.6-linux-pentium3.tgz
SDL-1.2.6-linux-pentium4.tgz
SDL-1.2.6-linux-athlon.tgz
SDL-1.2.6-linux-athlon-xp.tgz
That's decidedly unhandy. Plus, if your package is distributed by
someone else (e.g. a Linux distro), they will almost certainly provide
only the i386 or the i686 version.
-
Distribute a single package containing all the different binaries,
plus a wrapper that launches the correct executable. The main problem is
the package size: big programs will have their size multiplied by (at
least) 6.
-
Dynamic compilation (generating optimized machine code at runtime). I
don't know how it works, so I'll just skip it.
-
Dynamic linking (my choice). The only drawback of this one is that
you become tied to a specific compiler/make environment, and you have to
rewrite a lot of code if you later decide to change it. However, gcc
works on almost any platform, so this isn't really a problem…
The principle is that a few "critical" functions (the ones called
thousands of times per second; use a profiler to find out which ones
they are) are compiled several times, in different modules and with
different optimizations; their names are redefined each time, and then
everything is linked together. A wrapper function calls the correct
variant at runtime.
To have a working example (and also a much better explanation) of how
this works, take a look at the Electric Field Simulator on my site:
http://www.crusaderky.altervista.org/downloads.php
I'm sorry there isn't any documentation on the site yet; however, you
can read the "Multiarch-README" inside the package, as well as look
at the sources.
--
[] Guido Imperiale
[] CRV?ADER//KY
[] CVI.SCIENTIA.IMPERIVM
crusaderky at libero dot it
http://www.crusaderky.altervista.org
“Nam et ipsa scientia potestas est” (Knowledge is Power)
– Sir Francis Bacon (1561-1626)
Meditationes Sacrae, de Haeresibus
“The Net treats censorship as damage and routes around it.”
– John Gilmore
“I worry about my child and the Internet all the time, even though she’s
too young to have logged on yet. Here’s what I worry about. I worry that
10 or 15 years from now, she will come to me and say: ‘Daddy, where
were you when they took freedom of the press away from the Internet?’”
– Mike Godwin, Electronic Frontier Foundation