OpenGL, Linux, Windows

Loren Osborn wrote:

— David Olofson <david.olofson at reologica.se> wrote:

If you render a grid of totally independent quads,
will the edges be properly antialiased, without the
background leaking through around the edges?

//David

Hmm… I think you could alleviate the problem by having each of the tiles
overlap by 2 pixels: the bottom tile’s edge would be opaque, but its
edge-most pixel would be identical to the pixel beside it (so that the
2-pixel seam is two identical rows of pixels), and the top tile’s
outermost pixels would be transparent, while the inner pixels (one pixel
in from the “true” edge) would be opaque. Any stretched pixel on the
border should then show through to the 2-pixel seam behind it and be
interpolated correctly. (I think)…

Comments?

-Loren

If we’re in a 2D world where all the edges line up exactly, I don’t think
we need anti-aliasing.

In my own tile-based image viewer the seams are hidden by this method,
on all but one apparently buggy Linux driver. Even then they are
usually hidden. I don’t stretch my textures, but I don’t see how that
would affect this.

                                                -ray

Michael Schnell wrote:

== Original message from sdl at libsdl.org on 18.06.02 05:08

If you are interested in a dual-mode renderer, the OpenGL and SDL combo
works great. I have been doing this at work. My program tries to run in
OpenGL first, but it times its frame-drawing speed, and if it takes too
long to draw in OpenGL it falls back to software. It has worked perfectly
on every computer and OS I have tried (various Windows, OSX, Linux, OS 9).
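
The fallback logic itself can be quite small; something along these lines
(just a sketch: SDL_GetTicks comes from JEDI-SDL’s sdl unit, while
RenderFrameGL, RenderFrameSoft and the 100 ms threshold are made-up names
and values):

    var
      UseOpenGL: Boolean = True;   { once set to False we stay in software mode }

    procedure DrawFrame;
    var
      StartTicks: UInt32;
    begin
      StartTicks := SDL_GetTicks;
      if UseOpenGL then
        RenderFrameGL              { hypothetical OpenGL drawing routine }
      else
        RenderFrameSoft;           { hypothetical SDL software drawing routine }

      { If the OpenGL frame took too long (e.g. no hardware acceleration,
        so we are really running on a software GL), fall back for good. }
      if UseOpenGL and (SDL_GetTicks - StartTicks > 100) then
        UseOpenGL := False;
    end;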

It would be great to see a little example.

I’ll see about making a version without the 80 megs of data sets for you
to download. :)

Of course the drawback of a dual-mode renderer is that when in SDL mode,
you would need to implement your own scaling functions (unless some SDL
lib covers this).
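
In case it helps, such a scaling function could look roughly like this; it
is only a sketch of a nearest-neighbour scaler for 32-bit surfaces
(assuming JEDI-SDL’s PSDL_Surface/UInt32 types, both surfaces being 4 bytes
per pixel, and locking handled elsewhere):

    type
      PPixel32 = ^UInt32;

    { Nearest-neighbour stretch of Src onto the whole of Dst (both 32 bpp). }
    procedure StretchBlit32(Src, Dst: PSDL_Surface);
    var
      x, y, sx, sy: Integer;
    begin
      for y := 0 to Dst^.h - 1 do
      begin
        sy := (y * Src^.h) div Dst^.h;
        for x := 0 to Dst^.w - 1 do
        begin
          sx := (x * Src^.w) div Dst^.w;
          PPixel32(Integer(Dst^.pixels) + y * Dst^.pitch + x * 4)^ :=
            PPixel32(Integer(Src^.pixels) + sy * Src^.pitch + sx * 4)^;
        end;
      end;
    end;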

Earlier in this thread I showed an example that uses
"SDL_BlitSurface(zoom, @src, screen_, @upd)" (via JediSDL), and this works
fine in pure GUI mode (Windows and Linux).

So I’d need to create an OpenGL work-alike by “mapping textures onto
polygons” and use it when appropriate. I need to find out how to
initialize OpenGL so that it can live in a rectangle in the GUI, how to
convert the coordinates of that rectangle to OpenGL coordinates, how to
fill the large texture “polygon” (the source rectangle), and how to have
OpenGL show a certain part of that texture rectangle in my viewing
rectangle, appropriately stretched.
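
For what it’s worth, a minimal sketch of that OpenGL side (it assumes a GL
context already lives in the widget rectangle and that the image has been
uploaded as a texture; every name and parameter below is illustrative):

    { Draw the sub-rectangle (SrcX,SrcY,SrcW,SrcH) of a TexW x TexH texture,
      stretched over the whole ViewW x ViewH widget rectangle. }
    procedure DrawStretched(ViewW, ViewH: Integer;
                            Tex: GLuint; TexW, TexH: Integer;
                            SrcX, SrcY, SrcW, SrcH: Integer);
    var
      u0, v0, u1, v1: Single;
    begin
      { Make GL coordinates identical to pixel coordinates of the rectangle. }
      glViewport(0, 0, ViewW, ViewH);
      glMatrixMode(GL_PROJECTION);
      glLoadIdentity;
      glOrtho(0, ViewW, ViewH, 0, -1, 1);   { top-left origin, like a 2D blit }
      glMatrixMode(GL_MODELVIEW);
      glLoadIdentity;

      { Texture coordinates of the source sub-rectangle. }
      u0 := SrcX / TexW;            v0 := SrcY / TexH;
      u1 := (SrcX + SrcW) / TexW;   v1 := (SrcY + SrcH) / TexH;

      glEnable(GL_TEXTURE_2D);
      glBindTexture(GL_TEXTURE_2D, Tex);
      glBegin(GL_QUADS);
        glTexCoord2f(u0, v0); glVertex2i(0, 0);
        glTexCoord2f(u1, v0); glVertex2i(ViewW, 0);
        glTexCoord2f(u1, v1); glVertex2i(ViewW, ViewH);
        glTexCoord2f(u0, v1); glVertex2i(0, ViewH);
      glEnd;
    end;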

I’m not sure why you would need to make your OpenGL work-alike map
textures onto polygons (beyond the usual SDL_rect), perhaps for maximum
code-reuse between SDL and OpenGL modes? Anyway, I’d say that your plan
is sounding pretty good overall.

Not sure about how you will initialize SDL "in a rectangle in the GUI"
but hopefully it works. I’ll try to remember to package a
small-download version of my viewer tonight. Although not perfect, most
of it works on all machines I have tested.

-ray skoog

[…]

If we’re in a 2D world where all the edges line up exactly, I don’t think
we need anti-aliasing.

In my own tile-based image viewer the seams are hidden by this method,
on all but one apparently buggy Linux driver. Even then they are
usually hidden. I don’t stretch my textures, but I don’t see how that
would affect this.

That’s exactly why you’ve never seen this problem. As soon as you enable texture filtering and then stretch and/or use sub-pixel accurate positioning, edges will look terrible. They’re still moving with pixel granularity, and interpolation makes whatever is around each tile in the texture “leak” into the edges.
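
One common way around that leakage (just a sketch of one scheme along the
lines of what Loren describes, not necessarily either of your exact
setups) is to store each tile with a 1-texel border copied from its
neighbours and inset the texture coordinates by half a texel, so that
GL_LINEAR never samples outside the tile:

    { Draw one tile from an atlas whose tiles are stored with a 1-texel
      border duplicated from their neighbours; AtlasSize and TileSize are
      in texels. All names here are illustrative. }
    procedure DrawTile(AtlasTex: GLuint; AtlasSize, TileSize: Integer;
                       Col, Row: Integer; x, y, w, h: Single);
    var
      CellSize, u0, v0, u1, v1: Single;
    begin
      CellSize := (TileSize + 2) / AtlasSize;   { tile plus its 2-texel padding }
      u0 := Col * CellSize + 1.5 / AtlasSize;   { skip the border, plus half a texel }
      v0 := Row * CellSize + 1.5 / AtlasSize;
      u1 := u0 + (TileSize - 1) / AtlasSize;
      v1 := v0 + (TileSize - 1) / AtlasSize;

      glBindTexture(GL_TEXTURE_2D, AtlasTex);
      glBegin(GL_QUADS);
        glTexCoord2f(u0, v0); glVertex2f(x, y);
        glTexCoord2f(u1, v0); glVertex2f(x + w, y);
        glTexCoord2f(u1, v1); glVertex2f(x + w, y + h);
        glTexCoord2f(u0, v1); glVertex2f(x, y + h);
      glEnd;
    end;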

//David

.---------------------------------------
| David Olofson
| Programmer

david.olofson at reologica.se
Address:
REOLOGICA Instruments AB
Scheelevägen 30
223 63 LUND
Sweden
---------------------------------------
Phone: 046-12 77 60
Fax: 046-12 50 57
Mobil:
E-mail: david.olofson at reologica.se
WWW: http://www.reologica.se

`-----> We Make Rheology Real

On Tue, 18/06/2002 07:43:14, Ray Skoog wrote:

Not sure about how you will initialize SDL "in a rectangle in the GUI"
but hopefully it works.

Using Delphi or Kylix with Jedi_SDL:

   {$IFDEF WIN32}
   { Use the windib driver; the directx driver does not cooperate with
     embedding (see below). }
   SDL_putenv('SDL_VIDEODRIVER=windib');
   {$ENDIF}

   { Tell SDL to render into an existing widget: export SDL_WINDOWID with the
     widget's native window id *before* SDL_SetVideoMode is called. }
   EnvVal := 'SDL_WINDOWID=' + inttostr(QWidget_WinId(Panel1.Handle));
   putenv( PChar( EnvVal ) );

It does work, unless you try to use DirectX via

   SDL_putenv('SDL_VIDEODRIVER=directx');
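
Once SDL_WINDOWID has been exported, the rest is the usual initialisation;
a sketch only (the surface flags and using the panel’s size are assumptions,
and screen_ is a PSDL_Surface as used earlier in the thread):

   { With SDL_WINDOWID set, SDL renders into the panel instead of opening
     its own window. }
   SDL_Init(SDL_INIT_VIDEO);
   screen_ := SDL_SetVideoMode(Panel1.Width, Panel1.Height, 0, SDL_SWSURFACE);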

I’ll try to remember to package a small-download version of my viewer
tonight. Although not perfect, most of it works on all machines I have
tested.

G R E A T ! ! !

Thanks,

-Michael Schnell, Krefeld, Germany,
mailto:@Michael_Schnell

You can buy a new GeForce4 64MB DDR TV-out for $180, and a GeForce2 64MB
SDR TV-out for $120. I’d suggest the GeForce4. If you have money to burn,
the best current video card is the Radeon 8500 ($240+).

The Radeon takes shortcuts in rendering that I don't like (per-poly
mipmapping, for example). Despite all indications that it should be
faster, it's actually SLOWER by a good margin. Why? If you want my
opinion, it's simply that ATI sucks it hard. ;)

Not all GF4 cards are created equal. GF4 MX cards are basically just GF2
cards with a few added features here and there. If you're looking at a
GF4 MX, split the difference and get yourself a GF3; you'll be happier
with it in the long run. The GF4 Ti4200 sells for about the same as the
high-end GF4 MX, isn't much slower than its big brothers the Ti4400 and
Ti4600, and has all their bells and whistles.

Actually I meant a GeForce4 MX440.

17/06/02 17.56.37, Jacek Wojdel wrote:

(…) I need to have a window showing a hw surface, then resize and
antialias it in realtime using hardware acceleration while I grab one of
its borders with the mouse. DirectX does it (I wrote an example in VB6).

You mean something like this ?
http://www.kbs.twi.tudelft.nl/People/Staff/J.C.Wojdel/stretchblitgl/
(OK, it’s not the window that is resized and you don’t really grab the
rectangle but it follows the mouse, but the drawing should be exactly what
you need)

That program generates a segfault in the kernel. Sorry, I had no time (nor
competence; I know practically nothing about OpenGL) to debug it.

On Mon, Jun 17, 2002 at 04:17:27PM +0200, CRV?ADER/KY wrote:

Yes, truly there was a stupid mistake in the color-filling code for the
texture data; I wonder why it didn’t segfault on my machine…
It’s corrected now.
Jacek

On Fri, Jun 21, 2002 at 12:26:00AM +0200, CRV?ADER/KY wrote:

17/06/02 17.56.37, Jacek Wojdel <@Jacek_Wojdel> wrote:
That program generates a segfault in the kernel. Sorry, I had no time (nor
competence; I know practically nothing about OpenGL) to debug it.

+-------------------------------------+
| from: J.C.Wojdel                    |
|       J.C.Wojdel at cs.tudelft.nl   |
+-------------------------------------+

== Original message from sdl at libsdl.org on 18.06.02 23:30

I’ll try to remember to package a small-download version of my viewer
tonight. Although not perfect, most of it works on all machines I have
tested.

=== Comment from MSCHNELL at MARIANNE (Michael Schnell) on 23.06.02 11:44

I downloaded it and tested the Windows version in a non-OpenGL environment.

  1. It does work, so this seems to be a good starting point for my work.

  2. The higher zoom factors only produce black images, so there is more to
     be investigated. Does it work with OpenGL and/or on Linux?

Thanks a lot !

-Michael Schnell, Krefeld, Germany,
mailto:@Michael_Schnell

That program generates a segfault in the kernel. Sorry, I had no time (nor
competence; I know practically nothing about OpenGL) to debug it.

Yes, truly there was a stupid mistake in the color-filling code for the
texture data; I wonder why it didn’t segfault on my machine…
It’s corrected now.
Jacek

Great! I’ll try to run this as soon as I get an OpenGL-enabled video card.

Does this work in Windows, too?

Thanks,

-Michael Schnell, Krefeld, Germany,
mailto:@Michael_Schnell