Problem with glRenderMode(GL_SELECT) in a multi-monitor environment

Good Day!

I have been debugging this problem for more than a month now.

In multi-screen environments with NVIDIA cards, the call to
glRenderMode(GL_SELECT);
corrupts the memory of my application. When I disable one monitor,
everything is fine!

I tried different versions of the driver, including the beta and the
development drivers. All of them show the same behaviour. GL-Expert and some
self-made checks can't find any problems within the GL statements.

The only hint I have:
maybe the call to glRenderMode(GL_SELECT) switches GL to software rendering
while in this state?

How do I know that this call corrupts my memory?
I call a memory validation function between every single command. Before
this statement everything is fine; right after it, it isn't.
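
Roughly, the check pattern looks like this (only a sketch; the surrounding
function is illustrative, and _CrtCheckMemory from the MSVC debug CRT stands
in here for our own validation routine):

    #include <cassert>
    #include <windows.h>
    #include <crtdbg.h>     // MSVC debug CRT heap checker
    #include <GL/gl.h>

    #define CHECK_MEMORY() assert(_CrtCheckMemory())

    void pick_pass()
    {
        CHECK_MEMORY();              // everything is still fine here
        glRenderMode(GL_SELECT);     // the suspected call
        CHECK_MEMORY();              // fails here on the multi-monitor setup
    }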

Is this a known problem?

Best regards, and thanks for your time,

Thomas

Hello,

First, this is completely off topic here. SDL doesn’t touch the OpenGL
calls at all.

Now, about your issue, I suggest you read the glRenderMode documentation:
http://www.mevis.de/opengl/glRenderMode.html

In particular:

GL_SELECT
Selection mode. No pixel fragments are produced, and no change to the
frame buffer contents is made. Instead, a record of the names of
primitives that would have been drawn if the render mode was GL_RENDER
is returned in a select buffer, which must be created (see
glSelectBuffer http://www.mevis.de/opengl/glSelectBuffer.html)
before selection mode is entered.
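
In code, the sequence the documentation describes looks roughly like this
(just a minimal sketch; the buffer size and the name value are arbitrary):

    #include <GL/gl.h>

    GLuint selectBuf[512];

    GLint pick()
    {
        glSelectBuffer(512, selectBuf);   // must exist before entering selection mode
        glRenderMode(GL_SELECT);          // enter selection mode

        glInitNames();
        glPushName(0);
        // ... draw the pickable geometry, loading names with glLoadName() ...

        return glRenderMode(GL_RENDER);   // leave selection mode; returns the hit count
    }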

Stephane

Hello Stephane

First, this is completely off topic here. SDL doesn’t touch the OpenGL
calls at all.

I don't think I'm off topic. I think it is a bug in SDL's handling of the
OpenGL window. I consulted several GL forums before posting here.

As I wrote, GL doesn't report any errors, and all statements work fine on
ATI cards or on an NVIDIA single-monitor configuration.

Thanks for the link, but there is no problem with the selection itself.
Selecting objects works fine!

But something is writing where it shouldn't during the call to
glRenderMode(GL_SELECT);

Thomas

Good Day!

Finally I found the bug. Maybe it is interesting to someone.

The call to
glRenderMode(GL_SELECT);
causes a stack overflow under a special condition:

you need to set up a really large screen (multi-screen with 2x 1680
widescreen) and a small stack.

Within our application the stack is only 100 KB, because we switch between
multiple “self-made” stacks.
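
For anyone who hits the same thing: giving those stacks more headroom fixes
it. A minimal sketch, assuming the self-made stacks are created as Win32
fibers; the helper name and the sizes here are only illustrative:

    #include <windows.h>

    static const SIZE_T kFiberStackSize = 1024 * 1024;   // e.g. 1 MB instead of 100 KB

    void start_worker(LPFIBER_START_ROUTINE entry, void* arg)
    {
        // CreateFiber's first parameter is the stack size for that fiber.
        // glRenderMode(GL_SELECT) runs on whatever stack is current when it
        // is called, so that stack needs enough room for the driver as well.
        LPVOID fiber = CreateFiber(kFiberStackSize, entry, arg);
        // ... SwitchToFiber(fiber) from a thread converted with ConvertThreadToFiber() ...
        (void)fiber;
    }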

Stephane was right, it isn't a problem in SDL, but this behaviour isn't
documented anywhere. Sorry for that :).

Thomas