SDL 2.0 Surface to OpenGL Texture

I know this has been done over and over on the mailing list, and if this is
a more appropriate question for OpenGL, I’ll ask over at gamedev.

So, here is my relevant code:

SDL_Surface* newSurface = SDL_CreateRGBSurface(
    0, w, w, 24,
    0xff000000, 0x00ff0000, 0x0000ff00, 0
);
SDL_BlitSurface( this->m_surface, 0, newSurface, 0 );
glGenTextures( 1, &this->m_texture );
glBindTexture( GL_TEXTURE_2D, this->m_texture );
glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24 /*3*/, w, w, 0,
              GL_RGB, GL_UNSIGNED_BYTE, newSurface->pixels );
SDL_FreeSurface( newSurface );

And here is how I’m displaying it:

glBegin( GL_QUADS );
glTexCoord2i( 0, 0 );
glVertex3f( this->m_bounds.x, this->m_bounds.y - this->m_bounds.h, 0.0f );
glTexCoord2i( 1, 0 );
glVertex3f( this->m_bounds.x + this->m_bounds.w,
            this->m_bounds.y - this->m_bounds.h, 0.0f );
glTexCoord2i( 1, 1 );
glVertex3f( this->m_bounds.x + this->m_bounds.w, this->m_bounds.y, 0.0f );
glTexCoord2i( 0, 1 );
glVertex3f( this->m_bounds.x, this->m_bounds.y, 0.0f );
glEnd();

All I’m getting is a white box. The surfaces I’m converting to OpenGL
textures are text strings rendered with SDL_ttf, and they are valid
surfaces.

Also, here is the relevant OpenGL setup code:

glClearColor( 0, 0, 0, 0 );

glViewport( 0, 0, 800, 600 );
glMatrixMode( GL_PROJECTION );
glLoadIdentity( );
glOrtho(0, 800, 0, 600, 0, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glScissor( 0, 0, 800, 600 );
glEnable(GL_SCISSOR_TEST);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glLineWidth( 2.0f );
glEnable(GL_LINE_SMOOTH);
glEnable(GL_BLEND);

Any help would be appreciated,
-Alex

SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

Try changing format GL_RGB to GL_RGBA in this line:

glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24 /*3*/, w, w, 0, GL_RGB,
              GL_UNSIGNED_BYTE, newSurface->pixels );

On Tue, May 15, 2012 at 12:29 AM, Alex Barry <alex.barry at gmail.com> wrote:

glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24 /*3*/, w, w, 0,
              GL_RGB, GL_UNSIGNED_BYTE, newSurface->pixels );

The internal format (third) parameter to glTexImage2D is wrong if you’re
trying to display color; a depth format will give you grayscale/alpha. Try
GL_RGB8 or GL_RGB.
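A related pitfall with tightly packed 24-bit GL_RGB data (an aside, not raised in the thread): GL_UNPACK_ALIGNMENT defaults to 4, so OpenGL assumes each pixel row starts on a 4-byte boundary. A sketch of the row pitch OpenGL expects (padded_pitch is a hypothetical helper, for illustration only):

```c
/*
 * Bytes per row as OpenGL reads them under a given GL_UNPACK_ALIGNMENT.
 * Hypothetical helper, not part of the code in this thread.
 */
static int padded_pitch(int width, int bytes_per_pixel, int alignment)
{
    int pitch = width * bytes_per_pixel;
    /* Round up to the next multiple of the alignment. */
    return (pitch + alignment - 1) / alignment * alignment;
}
```

If the surface’s actual pitch differs from this, either call glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before uploading, or upload RGBA data, whose rows are always 4-byte aligned.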

Second, you didn’t specify any mipmaps, but the texture filter mode defaults
to GL_NEAREST_MIPMAP_LINEAR. When a texture has no mipmaps but its
filtering mode requires them, OpenGL calls the texture “incomplete” and
treats it as if you had no texture at all. Try adding this after you bind
the texture to use bilinear filtering with no mipmaps:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

Unlike in D3D, textures, not the texture units, carry the parameters like
filtering/repeat mode, so you’ll need to set this for every texture. If you
want mipmaps and have OpenGL v1.4 or later, try:

glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
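For context on why the texture counts as incomplete: GL_NEAREST_MIPMAP_LINEAR expects the full mipmap chain, one level per halving of the larger dimension down to 1x1. A rough sketch of the level count (my own illustration, not code from the thread):

```c
/*
 * Number of mipmap levels in a full chain for a w x h texture:
 * the base level plus one level per halving of the larger
 * dimension until it reaches 1. Illustration only.
 */
static int mip_levels(int w, int h)
{
    int maxdim = w > h ? w : h;
    int levels = 1;
    while (maxdim > 1) {
        maxdim /= 2;
        levels++;
    }
    return levels;
}
```

So a 256x256 texture needs 9 levels; unless every one of them is supplied (or generated), a mipmapped filter mode samples nothing.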

And here is hope I’m displaying it:

glBegin( GL_QUADS );

glTexCoord2i( 0, 0 );
glVertex3f( this->m_bounds.x, this->m_bounds.y - this->m_bounds.h, 0.0f );

glTexCoord2i( 1, 0 );
glVertex3f( this->m_bounds.x + this->m_bounds.w, this->m_bounds.y -
this->m_bounds.h, 0.f );

glTexCoord2i( 1, 1 );
glVertex3f( this->m_bounds.x + this->m_bounds.w, this->m_bounds.y, 0.f );

glTexCoord2i( 0, 1 );
glVertex3f( this->m_bounds.x, this->m_bounds.y, 0.f );
glEnd();

All I’m getting is a white box, meanwhile the surfaces I’ve converting to
opengl textures are text strings using SDL_ttf, and they are giving valid
surfaces.

Also, here is the relevant OpenGL setup code:

glClearColor( 0, 0, 0, 0 );

glViewport( 0, 0, 800, 600 );
glMatrixMode( GL_PROJECTION );
glLoadIdentity( );
glOrtho(0, 800, 0, 600, 0, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glScissor( 0, 0, 800, 600 );
glEnable(GL_SCISSOR_TEST);

Just a quick note: scissor tests aren’t free. If your scissor rectangle
equals your viewport, you’re effectively doing a no-op but still paying a
price. The scissor area should always be a proper subset of the viewport’s
area.

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glLineWidth( 2.0f );
glEnable(GL_LINE_SMOOTH);
glEnable(GL_BLEND);

I don’t see glEnable(GL_TEXTURE_2D) here – that’s also required.

Let me know if that fixes the issue. Good luck.

Patrick


Here’s the code that I use with SDL2 to convert a surface into a texture.
Note the lines that disable blending for the source surface, which seem to
be necessary when the original image surface has an alpha channel.

Code:

static void
surface_to_texture(SDL_Surface *img, unsigned *w, unsigned *h)
{
    /* OpenGL pixel format for destination surface. */
    int bpp;
    Uint32 Rmask, Gmask, Bmask, Amask;
    SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_ABGR8888, &bpp, &Rmask,
                               &Gmask, &Bmask, &Amask);

    /* Create surface that will hold pixels passed into OpenGL. */
    SDL_Surface *img_rgba8888 = SDL_CreateRGBSurface(0, img->w, img->h, bpp,
                                                     Rmask, Gmask, Bmask, Amask);

    /*
     * Disable blending for source surface. If this is not done, all
     * destination surface pixels end up with crazy alpha values.
     */
    SDL_SetSurfaceAlphaMod(img, 0xFF);
    SDL_SetSurfaceBlendMode(img, SDL_BLENDMODE_NONE);

    /* Blit to this surface, effectively converting the format. */
    SDL_BlitSurface(img, NULL, img_rgba8888, NULL);

    /* Store width and height as return values. */
    assert(w && h);
    *w = img->w;
    *h = img->h;
    unsigned pow_w = nearest_pow2(img->w);
    unsigned pow_h = nearest_pow2(img->h);

    /*
     * Create a blank texture with power-of-two dimensions. Then load
     * converted image data into its lower left.
     */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pow_w, pow_h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, img->w, img->h, GL_RGBA,
                    GL_UNSIGNED_BYTE, img_rgba8888->pixels);

    SDL_FreeSurface(img_rgba8888);
}
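The nearest_pow2 helper isn’t shown in the post. A plausible definition, assuming it rounds up to the next power of two:

```c
/*
 * Smallest power of two >= n. A guess at the nearest_pow2 helper
 * referenced above, which the post does not include.
 */
static unsigned nearest_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}
```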

Still no luck.
I added the glEnable for the texture (as Patrick suggested), which I
thought was possibly the biggest issue - it didn’t fix the immediate
problem, but I’m sure it was contributing.

Here is the new code for binding my texture:

void Surface::openglBindTexture( void )
{
    assert( this->m_surface != NULL );
    int glerror = 0;
    if( !this->m_hastexture ) {
        int w = (int)std::pow(2, std::ceil( std::log((float)this->m_surface->w) /
                                            std::log(2.0f) ) );
        int bpp;
        Uint32 Rmask, Gmask, Bmask, Amask;
        SDL_PixelFormatEnumToMasks(
            /*SDL_PIXELFORMAT_ABGR8888*/ SDL_PIXELFORMAT_RGBA8888, &bpp,
            &Rmask, &Gmask, &Bmask, &Amask
        );

        /* Create surface that will hold pixels passed into OpenGL. */
        SDL_Surface *img_rgba8888 = SDL_CreateRGBSurface( 0,
            this->m_surface->w, this->m_surface->h, bpp,
            Rmask, Gmask, Bmask, Amask
        );
        SDL_SetSurfaceAlphaMod( this->m_surface, 0xFF );
        SDL_SetSurfaceBlendMode( this->m_surface, SDL_BLENDMODE_NONE );
        SDL_BlitSurface( this->m_surface, NULL, img_rgba8888, NULL );
        glGenTextures( 1, &this->m_texture );
        glBindTexture( GL_TEXTURE_2D, this->m_texture );
        //glTexImage2D( GL_TEXTURE_2D, 0, /*GL_DEPTH_COMPONENT24*/3, w, w, 0,
        //              GL_RGBA, GL_UNSIGNED_BYTE, this->m_surface->pixels );
        glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, w, w, 0, GL_RGBA,
                      GL_UNSIGNED_BYTE, NULL );
        glerror = GLException::glError();
        if( glerror != 0 ) throw new GLException(
            "Surface::openglBindTexture::glTexImage2D", glerror, __FILE__, __LINE__ );
        glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, this->m_surface->w,
                         this->m_surface->h, GL_RGBA, GL_UNSIGNED_BYTE,
                         img_rgba8888->pixels ); // !!! Throws an Exception !!!
        glerror = GLException::glError();
        if( glerror != 0 ) throw new GLException(
            "Surface::openglBindTexture::glTexSubImage2D", glerror, __FILE__, __LINE__ );

        SDL_FreeSurface( img_rgba8888 );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR ); // Linear Filtering
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR ); // Linear Filtering
        glerror = GLException::glError();
        if( glerror != 0 ) throw new GLException(
            "Surface::openglBindTexture::Importing raw texture", glerror, __FILE__, __LINE__ );
        this->m_hastexture = true;
    }
    glBindTexture( GL_TEXTURE_2D, this->m_texture );
}

Also, here is my updated opengl setup code:

void Application::glSetup( void )

{
glClearColor( 0, 0, 0, 0 );
glViewport( 0, 0, 800, 600 );
glMatrixMode( GL_PROJECTION );
glLoadIdentity( );
glOrtho(0, 800, 0, 600, 0, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glLineWidth( 2.0f );
glEnable(GL_LINE_SMOOTH);
glEnable(GL_BLEND);
glEnable(GL_TEXTURE_2D);
}

I decided to open a new window to test my surface just for a sanity
check - everything looks perfect in the surface, so the issue is definitely
in converting the surface to an OpenGL texture.

I wrote my own exception stuff, and I’m getting an opengl error on
my glTexSubImage2D call, and glGetError() gives me GL_INVALID_VALUE

I’ve tried going over all my stuff and tried a few different values (see
my SDL_PixelFormatEnumToMasks call), and I either get a white or a
blue-green box, but not my texture :(

Sorry to keep bothering you gents with this, again, I can move it to
gamedev if it’s more appropriate there.

Also, thanks Atis for that code, I feel like it was a step or two in the
right direction. And to Patrick and Chris for advice and ideas.

-Alex


Is any of this necessarily specific to SDL2? The gpwiki page on loading an
SDL_Surface to an OpenGL texture works for SDL1.2:
http://content.gpwiki.org/index.php/SDL:Tutorials:Using_SDL_with_OpenGL

Jonny D

The glTexImage2D( …, NULL ); glTexSubImage2D( … ); pattern isn’t
required - you can do it all in one step in your case. Seeing as how
you’re getting GL_INVALID_VALUE, that tells me your width and height are
likely wrong. They need to be a power of 2*.

*GL_ARB_texture_non_power_of_two relaxes this.
http://www.opengl.org/registry/specs/ARB/texture_non_power_of_two.txt
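The power-of-two condition described above is cheap to check; a small sketch (my own illustration, not code from the thread):

```c
/*
 * True if n is a power of two (n > 0). The bit trick: a power of two
 * has exactly one bit set, so n & (n - 1) clears that bit to zero.
 */
static int is_pow2(unsigned n)
{
    return n != 0 && (n & (n - 1)) == 0;
}
```

Asserting is_pow2 on the width and height passed to glTexImage2D (absent NPOT support) catches this class of GL_INVALID_VALUE early.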



That’s actually the code I started with, and I didn’t have success. =/

On Tue, May 15, 2012 at 11:31 AM, Jonathan Dearborn wrote:


Well, I adapted that for SDL_gpu (http://code.google.com/p/sdl-gpu/) and
got it to work. Maybe you can look through the SDL_gpu source? Check
out OpenGL/SDL_gpu_OpenGL.c, specifically Init(), CopyImageFromSurface(),
and Blit(), though Blit() is a little obfuscated by features. I’m also
assuming NPOT texture support.

Jonny D

Thanks gents, I got this working, and it turns out most of my issue was
that I wasn’t using the proper texture coordinates on my GL quad vertices.
Anyway, I fixed that up, and it’s pretty awesome.

Anyone interested in the code can peek at it here:

SDL::Surface is the class you’ll want to prod at, and the openglBindTexture
method would be the exact place to look.

Thanks for all the help gents, you were super.

-Alex
