SDL 2.0 wish-list

Hello,

Selon Gerry :

Xavier Joubert wrote:

Of course, painting programs badly need accurate alpha blending and don’t
care that much about speed.
Wrong. Accurate alpha blending is very important, yes, but speed is
also very important. There are few things more frustrating than getting
line segments every time you try drawing a curve because the CPU can’t
keep up.

That’s why I wrote “don’t care that much about speed.”

I know nothing is more irritating than a pen that doesn’t follow your (my) drawing
speed.

But, as David said, any average PC is fast enough to do software rendering at a
decent speed. And we’re talking here about a pen (even if it’s big) and not
many sprites all over the screen.

So I think software rendering should do the job quite well.

If you don’t agree, I think we should begin by implementing this in software,
anyway. If it’s too slow, then we might consider using hardware acceleration.

Best regards,

Xavier

I made a couple improvements to the code that you wrote that you might
be interested in if you end up writing Altivec stuff in the future. The
big one is that I’m using 15 as the constant for the overflow vector
load, so it doesn’t need to overcompensate for overrun since you’re
guaranteed not to stomp on a bad page. This is what Apple recommends.

Hhm, good catch.

The only other place I did serious Altivec work was my OpenAL
implementation, and I just overallocated all relevant buffers by 16
bytes so that the code could never trigger a segfault this way,
but using 15 with vec_ld is clearly a better solution.
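The address math behind that trick can be sketched in plain C. This is just an illustration (vec_ld_block is a made-up helper, not a real intrinsic): vec_ld(off, p) loads the 16-byte aligned block containing p + off, so an offset of 15 always lands on a block that holds an actual buffer byte, while an offset of 16 can touch the block just past the buffer, which may be on an unmapped page.

```c
#include <stdint.h>

/* Illustrative helper (not a real intrinsic): the address of the 16-byte
   aligned block that vec_ld(off, p) would fetch -- it simply truncates
   (p + off) down to a 16-byte boundary. */
static uintptr_t vec_ld_block(uintptr_t p, int off)
{
    return (p + off) & ~(uintptr_t)15;
}
```

With an aligned p, vec_ld_block(p, 16) is the block immediately after the 16 buffer bytes, while vec_ld_block(p, 15) stays inside them; for unaligned p, the offset-15 block still contains the buffer byte p + 15, so it is guaranteed to be mapped.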

Unrelated, but I still think we should change the dont-use-prefetch test
to not consider the presence of an L3 cache. I’ll be happy to benchmark
on G4s with and without L3 versus the G5, but I suspect we’ll see a
performance boost across the G4 line using prefetch. My guess about
avoiding prefetch on the G5 may turn out to be inaccurate in real world
scenarios, too.

Should I be asking someone for CVS access, or is somebody else going to
maintain this code?

I’ll put it in CVS and take it from there.

As for test cases, the world is our beta tester. :) If a bug shows up,
someone will say something and we’ll fix it, but I think you’ve gone out
of your way to reasonably test this.

–ryan.

Your explanations are very interesting.

I knew something was wrong with alpha blending done the current way, but didn’t
know the rationale behind this. I saw this recently in a small painting program
I’m writing for my daughter (and in some old painting programs in the past).

:^) That’s one of the things Albert is working on, and which I think is
his reason for interest in seeing this change in SDL… he’s helping us
write “Tux Paint”, which is an SDL-based drawing program for young kids.

(Since you’re working on one of your own, I’d be interested in seeing it,
and curious to hear your thoughts on Tux Paint.)

See: http://www.newbreedsoftware.com/tuxpaint/

> I see one way to handle this dilemma: implement a new
> SDL_BlitSurfaceAccurateBlending() function that would use sRGB conversion.
> This one would of course not be hardware accelerated, and so not recommended
> on hardware surfaces.
>
> Maybe this function would better fit an add-on lib than SDL itself...

Not a bad idea…

On Sat, Feb 26, 2005 at 11:04:26PM +0100, Xavier Joubert wrote:


-bill!
bill at newbreedsoftware.com “I’m anticipating an all-out tactical
http://newbreedsoftware.com/ dog-fight, followed by a light dinner.”

Admittedly, a problem I’ve been having on my old PC lately in both
Tux Paint and The Gimp. I think I have too many KDE doo-dads enabled
for my 450MHz (that’s /overclocked/) CPU to handle. :)

Speed is a big consideration. Consider the stamps in Tux Paint.
Right now, when you go to place one, you see an XOR outline of the shape
(in a stipple pattern that Albert, I believe, improved greatly just recently).

If we changed this to, say, a 50% alpha-blended blit, that’d be a large
(sometimes ~400x400 or larger) alpha blit EVERY time the mouse moved. :^o

Nice looking effect, but it could become a total CPU hog.

-bill!
bill at newbreedsoftware.com “I’m anticipating an all-out tactical
http://newbreedsoftware.com/ dog-fight, followed by a light dinner.”

On Sun, Feb 27, 2005 at 03:25:07AM +0100, Gerry wrote:

Wrong. Accurate alpha blending is very important, yes, but speed is
also very important. There are few things more frustrating than getting
line segments every time you try drawing a curve because the CPU can’t
keep up.

Olof Bjarnason wrote:

I am just curious of what we can hope for in SDL2. Is there any
timeline anywhere? Or better yet design draft?

Here is my personal wish list:

  • Keeping the C-level of the library and the general
    "minimalistic/clean" API of SDL
  • Dropping backward source-level SDL<2 compatibility if needed
  • Multiple display surfaces - at least one with OpenGL support
  • ForceFeedback-support
  • Multiple mouse support, or in general no assumptions about the number of
    devices (keyboard, mouse, screens, …)
  • A little more vaguely: Support for newer multimedia hardware like
    webcameras etc. - a grabbing interface

My wishlist is actually to get my patches merged in, but this doesn’t
seem to happen (right now I have ~500Kb of patches; I’m at the point
where it is starting to take a significant time to track SDL cvs).
So I agree on your “timeline” or “design draft” point. Is there an
official stance on what to expect from SDL 2.0, so that I could pursue
(or stop accordingly) development of the features I’m interested in?

Stephane

Your explanations are very interesting.

I knew something was wrong with alpha blending done the current way, but didn’t
know the rationale behind this. I saw this recently in a small painting program
I’m writing for my daughter (and in some old painting programs in the past).

:^) That’s one of the things Albert is working on, and which I think is
his reason for interest in seeing this change in SDL…

Mostly yes, but other apps deserve to look right too.

I see one way to handle this dilemma: implement a new
SDL_BlitSurfaceAccurateBlending() function that would use sRGB conversion.
This one would of course not be hardware accelerated, and so not recommended
on hardware surfaces.

Maybe this function would better fit an add-on lib than SDL itself…

Not a bad idea…

That would appear to:

a. prohibit hardware acceleration
b. leave the main SDL library with wrong alpha blending

Using correct alpha blending should be easier than using
wrong alpha blending. We’ve seen what happened with audio
byte order, and that was way easier to explain. Using the
wrong alpha blending should require one of:

a. link in an extra library
b. define USE_BROKEN_ALPHA before including the headers
c. pass a flag to SDL_Init

Maybe require all 3. :) Actually, I’d rather just not
have the broken version. Why encourage people to screw up?

I could go for a transparency threshold though, so the
user can have SDL treat the alpha as a boolean mask.

On Mon, 2005-02-28 at 11:15 -0800, Bill Kendrick wrote:

On Sat, Feb 26, 2005 at 11:04:26PM +0100, Xavier Joubert wrote:

My wishlist is actually to get my patches merged in, but this doesn’t
seem to happen (right now I have ~500Kb of patches; I’m at the point
where it is starting to take a significant time to track SDL cvs).

Is there a list of what patches you’ve got over CVS? We should probably
strive to at least sort them into “should be in libsdl.org’s CVS” and
“will never be in libsdl.org’s CVS,” and figure out what to do from there.

–ryan.

Calling the sRGB to/from linear conversions “color space conversions”
is a bit misleading. The black point, white point, and primaries all
remain the same. It’s really a gamma conversion, using the special
gamma curve defined by the sRGB standard.

Yes, it is gamma correction, and therefore is a separate operation from
alpha blending. Not even vaguely related. Since the rest of what you are
talking about is all based on your confusing alpha blending and gamma
correction, it is time to drop this subject.

I’m not confused. Perhaps you are? I hope you don’t wish to
simply pretend this problem doesn’t exist. I know it’s ugly
to fix, but it looks ugly on the screen too.

Sorry, but you are seriously confused. You have combined two different
problems and therefore have a rather unique take on how to correct them.

First off, most of the problem with “dark edges” that you are seeing is due
to software alpha blending that is not clamping color components to the
range 0 to 255. The use of 1/256 means that you can get 256 as a result
of alpha blending and that wraps to zero. Which gives you very dark
colors.
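The 1/256 point can be illustrated with a sketch like the following (function names are mine, not SDL’s): the cheap shift-by-8 version divides by 256 instead of 255 and so never quite reaches full brightness, while the well-known exact divide-by-255 trick rounds correctly without a hardware divide.

```c
#include <stdint.h>

/* Approximate blend using 1/256 (a >>8 shift): cheap, but blending
   white over white at any partial alpha yields 254, not 255 -- a
   visible darkening at the edges. */
static uint8_t blend_shift(uint8_t src, uint8_t dst, uint8_t a)
{
    return (uint8_t)((src * a + dst * (255 - a)) >> 8);
}

/* Exact divide-by-255 with rounding, still shift-only. */
static uint8_t blend_exact(uint8_t src, uint8_t dst, uint8_t a)
{
    unsigned t = src * a + dst * (255 - a) + 128;
    return (uint8_t)((t + (t >> 8)) >> 8);
}
```

Blending 255 over 255 at alpha 128 gives 254 with the shift version and 255 with the exact one, which is the kind of off-by-one darkening being described.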

The situation:

a. All alpha blending must be done with linear (gamma==1.0) data.
Any other use of alpha blending is a serious bug.

That is one way of looking at it. Another is that alpha blending is
defined to work over a linear space and therefore doesn’t know or care
about non-linear data.

b. All modern computer displays require non-linear (gamma!=1.0)
data by default; many are not configurable.

Which, of course, has nothing to do with alpha blending because alpha
blending is defined to work over a linear range.

The gamma problem is completely separate from the way alpha blending is
defined to work.

BTW, I have never encountered a display that “required” non-linear data.
Yes, it would not provide the correct color without it, but so what,
they still worked. Also, it has been a long time since I have seen a
video card that did not support gamma correction on the output. It has been
nearly as long since I saw anyone actually configure the gamma
correction for a display.

I once worked with a man who could see Mach banding in a 24 bit color
display. I, personally, cannot.

c. The vast majority of image files contain non-linear data.

That is an assertion on your part. I personally doubt that it is true.
But, there are certainly image files that have non-linear data in them.
If you want to use do alpha blending with them you are responsible for
converting them to linear form. That is, you need to apply the source
gamma to the image so that alpha blending will work.

Suppose I wish to do an alpha-blending blit to the screen.
In general, SDL is unable to correctly blit the image!

No, SDL correctly blits the image. If it isn’t correct, it is because
you have not correctly adjusted the image data before you did the
blit. If you have linearized the images and the result looks wrong
it is most likely because you have not correctly set the output gamma.

Various ways to deal with this:

a. Only use alpha==100% and alpha==0%. :slight_smile:

b. sRGB->linear on file load, and linear->sRGB to the screen
(so all in-memory buffers are linear)

That is pretty much the only way to do it.

c. alpha blend operation gets built-in gamma conversion

The problem with that approach is that even if you did mangle the
software blitters in SDL to do the odd operations that you want done,
you would still not get the results you are asking for on systems that
are doing the blits using hardware blitters. Hardware blitters work
pretty much the way SDL’s software blitters work. Hardware blitters
usually get the 1/255 right and usually clamp the result properly (I say
usually because I have worked with some that did neither). But, they do
not try to do linearization while doing alpha blending.

Good grief, you would have to provide a gamma table with each image to
get the results you want.

You can verify what I have said by reading any introductory text on
computer graphics.

Enough

	Bob Pendleton

On Sat, 2005-02-26 at 21:58 -0500, Albert Cahalan wrote:

On Sat, 2005-02-26 at 18:48 -0600, Bob Pendleton wrote:

On Sat, 2005-02-26 at 14:50 -0500, Albert Cahalan wrote:


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl

This isn’t specifically a cry for help (yet) just a feeler to see if
anyone else has had this problem.

I’m currently wrapping up SDL for use under the Compact Framework. For
the most part everything is going well, except for the eventing system.
Unfortunately, the CF doesn’t support union structures, so I’m having to
do a little jiggery-pokery to get it working properly.

I seem to have it working, but when I use the SDL_PollEvent method and
then exit my app, some kind of error occurs. (Other handheld developers
will recognize the “Connection to the remote device has been lost” error.)

If I don’t call SDL_PollEvent, I get no error on exit.

Does this ring a bell for anyone? I’m not tremendously concerned about it at
this point, since the application works fine until you exit.

My SDL.dll is compiled using eVC3 for ARM, btw.

Ryan C. Gordon wrote:

My wishlist is actually to get my patches merged in, but this doesn’t
seem to happen (right now I have ~500Kb of patches; I’m at the point
where it is starting to take a significant time to track SDL cvs).

Is there a list of what patches you’ve got over CVS? We should
probably strive to at least sort them into “should be in libsdl.org’s
CVS” and “will never be in libsdl.org’s CVS,” and figure out what to
do from there.

Ok, I will sort what I have when I find the time (while I’m at it, I’ll
look at what I have lying on my hd). But my comment for what features
are expected in the future still stands :)

Stephane

Haven’t read all the posts here - so maybe that has already been
mentioned:

It would be nice to have sdl net and sdl image (perhaps mixer) included
in sdl 2.0.

BR Arne

It would be nice to have sdl net and sdl image (perhaps mixer) included
in sdl 2.0.

No, and let’s not start this discussion again.

–ryan.

It would be nice to have sdl net and sdl image (perhaps mixer)
included in sdl 2.0.

No, and let’s not start this discussion again.

Okok :)
Could you name the thread where that has been discussed?

BR Arne

On 18.03.2005 at 16:31, Ryan C. Gordon wrote:

It would be nice to have a parameters parser in SDL, something like GTK does

mysdlapp --sdl-vo=gl --sdl-ao=esd --my --app --parms

and, in code:

SDL_Init(SDL_INIT_VIDEO…, Args);

I don’t know how to get the parameters in C, I use Pascal…, but you
get the idea…

So, SDL will use ENV VARS or the parameters… and we will get a
default way to do this, that works on every SDL app…
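A minimal sketch of how such a filter could work in C, mapping the options onto environment variables SDL already honors (SDL_VIDEODRIVER, SDL_AUDIODRIVER). The option names and the function itself are illustrative, not a real SDL API; setenv() is POSIX.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch: strip --sdl-vo=/--sdl-ao= style options out of
   argv, turning them into the environment variables SDL already reads,
   and compact the remaining (application-specific) arguments. */
static int sdl_filter_args(int argc, char *argv[])
{
    int out = 1;
    for (int i = 1; i < argc; i++) {
        if (strncmp(argv[i], "--sdl-vo=", 9) == 0)
            setenv("SDL_VIDEODRIVER", argv[i] + 9, 1);
        else if (strncmp(argv[i], "--sdl-ao=", 9) == 0)
            setenv("SDL_AUDIODRIVER", argv[i] + 9, 1);
        else
            argv[out++] = argv[i];  /* keep the app's own options */
    }
    argv[out] = NULL;
    return out;  /* the new argc */
}
```

An app would call this first and then hand the filtered argc/argv to its own parser, so SDL options work the same way in every SDL program.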

Excuse my poor English…
TIA,
Judison

Hey, that’s not a bad idea!

-bill!

On Sat, Mar 19, 2005 at 03:47:33PM -0300, Judison wrote:

It would be nice to have a parameters parser in SDL, something like GTK does

mysdlapp --sdl-vo=gl --sdl-ao=esd --my --app --parms

and, in code:

SDL_Init(SDL_INIT_VIDEO…, Args);

I agree, I would like very much to see this feature too.
Either pass the Args in SDL_Init, or better yet pass them in another
function, something like:
SDL_Init_Args(int argc, char* argv[]);

In fact SDL_Init_Args() = SDL_main()

Just fix it so that (at least in C) we get done with that ugly,
ugly hack of redefining main().

… It is an ugly hack, isn’t it?

On Sat, 19 Mar 2005 15:47:33 -0300, Judison wrote:

It would be nice to have a parameters parser in SDL, something like GTK does

mysdlapp --sdl-vo=gl --sdl-ao=esd --my --app --parms

and, in code:

SDL_Init(SDL_INIT_VIDEO…, Args);


Bruno Medeiros.
“Knowledge is Power”