I’ve added a hint to specify the sampling algorithm that’s used when scaling
textures.
Based on feedback I’ve received, this defaults to GL_NEAREST/D3DTEXF_POINT,
which is consistent with the software renderer. You can also specify
"linear" and “best”.
See SDL_hints.h for the full details.
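
For example, to opt in to linear filtering before creating your
textures (SDL_HINT_RENDER_SCALE_QUALITY is the constant declared in
SDL_hints.h; see that header for the exact accepted values):

    #include "SDL.h"

    /* set the hint before creating the textures it should affect */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");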
Cheers!

-Sam Lantinga, Founder and CEO, Galaxy Gameworks

On Sun, Mar 13, 2011 at 7:26 PM, Sam Lantinga <@slouken> wrote:
> I’ve added a hint to specify the sampling algorithm that’s used when
> scaling textures. […]

Thanks, Sam! Now, if only I knew why it’s not working :-(

I expect it might have something to do with your caching. If I change
GL_RenderCopy and comment out the check for the cached setting, like
so:
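
Roughly, that means setting the texture parameters unconditionally
instead of only on a cache miss. A sketch of the idea (the field and
type names are assumed from SDL 1.3’s GL renderer, not the exact diff):

    /* in GL_RenderCopy(), with the cached-setting check disabled: */
    /* if (texturedata->scaleMode != data->current.scaleMode) { */
    data->glTexParameteri(texturedata->type, GL_TEXTURE_MIN_FILTER,
                          texturedata->scaleMode);
    data->glTexParameteri(texturedata->type, GL_TEXTURE_MAG_FILTER,
                          texturedata->scaleMode);
    data->current.scaleMode = texturedata->scaleMode;
    /* } */
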
everything is fine. My initial thought was that
data->current.scaleMode defaults to 0 and GL_NEAREST is also defined
as 0, but that’s not the case. Whatever it is, here on Ubuntu 10.10
with the 260.19.21 nvidia drivers it seems those parameters need to
be set every time to remain effective. Not sure what causes this, but
I assume something further down is resetting them to their default
(GL_LINEAR) again. Maybe the call to glDisable, but that’s only a
shot in the dark.
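
For reference, the GL headers define these as nonzero values, so a
zero-initialized scaleMode field really means “nothing set yet” rather
than GL_NEAREST:

    /* from the OpenGL headers */
    #define GL_NEAREST 0x2600
    #define GL_LINEAR  0x2601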
Kai

It looks fine here. Can you post a link to a small test program?
I’m attaching my test…

On Mon, Mar 14, 2011 at 2:42 PM, Kai Sterker <kai.sterker at gmail.com> wrote:
> Thanks, Sam! Now, if only I knew why it’s not working :-( […]

On Wed, Mar 16, 2011 at 3:59 AM, Sam Lantinga <@slouken> wrote:
> It looks fine here. Can you post a link to a small test program?

Your test worked fine for me as well, but with a slight modification
it shows the issue I’m experiencing.
It seems that the hint only applies to the first texture created
afterwards. Any subsequent textures, even though they do get
GL_NEAREST set for their scaleMode, are rendered with GL_LINEAR
regardless. I’m still under the impression that the caching is at
fault.
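
That would fit with how GL handles filter state: GL_TEXTURE_MIN_FILTER
and GL_TEXTURE_MAG_FILTER are properties of each texture object, not of
the context, so a single renderer-wide cache of the last value set can
only ever be right for one texture at a time. A minimal illustration
(texA/texB are hypothetical texture ids):

    glBindTexture(GL_TEXTURE_2D, texA);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* a renderer-level cache now records that GL_NEAREST is current */

    glBindTexture(GL_TEXTURE_2D, texB);
    /* the cache check sees GL_NEAREST == GL_NEAREST and skips the call,
       but texB still carries its per-texture default, GL_LINEAR */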

Kai

Thanks!

On Wed, Mar 16, 2011 at 3:22 PM, Kai Sterker <kai.sterker at gmail.com> wrote:
> Your test worked fine for me as well, but with a slight modification
> it shows the issue I’m experiencing. […]

-Sam Lantinga, Founder and CEO, Galaxy Gameworks