Hi
How do I create textures for OpenGL with the help of SDL?
The following code does not work,
but the odd thing is: if I replace glTexImage2D with gluBuild2DMipmaps
it works, but then only the last loaded texture appears on everything; it is always
bound, even if glBindTexture is not called!
Please help me!
gload_texture(&texture[0], "texture.tga");

void gload_texture(GLuint *texture, const char *filename)
{
    string ext = strrchr(filename, '.');   // assumes filename contains a '.'
    SDL_Surface *image = IMG_Load(filename);
    if(!image)
    {
        cout << "failed to load texture \"" << filename << "\"" << endl;
        return;
    }
    int width  = image->w;
    int height = image->h;
    unsigned char *data = (unsigned char *)(image->pixels);
    int BytesPerPixel = image->format->BytesPerPixel;
    // flip the pixels vertically: SDL stores the image top-down,
    // OpenGL expects the first row at the bottom
    for(int i = 0; i < (height / 2); ++i)
        for(int j = 0; j < width * BytesPerPixel; j += BytesPerPixel)
            for(int k = 0; k < BytesPerPixel; ++k)
                swap(data[(i * width * BytesPerPixel) + j + k],
                     data[((height - i - 1) * width * BytesPerPixel) + j + k]);
    glGenTextures(1, texture);
    glBindTexture(GL_TEXTURE_2D, *texture);
    // formats that usually carry alpha get GL_RGBA, everything else GL_RGB
    if(ext == ".tga" || ext == ".png" || ext == ".xcf" || ext == ".tif")
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);
    else
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, image->w, image->h, 0, GL_RGB, GL_UNSIGNED_BYTE, image->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    SDL_FreeSurface(image);
}
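
(For reference, the gluBuild2DMipmaps variant is just that one call swapped in for glTexImage2D; a rough sketch of the RGB branch, with an optional mipmap min filter my code above does not set:)

    // needs <GL/glu.h>; gluBuild2DMipmaps rescales the image to power-of-two
    // sizes if necessary and builds the whole mipmap chain from image->pixels
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, image->w, image->h,
                      GL_RGB, GL_UNSIGNED_BYTE, image->pixels);
    // with mipmaps present, a mipmapping minification filter can be used
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);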
btw, how do I test whether an image has an alpha channel?
Thanks in advance
--
funthing
Of course it shouldn't work: glTexImage2D will not accept a width or height
that is not a power of two (2^n, or 2^n plus twice the border width if you use
borders). Read the OpenGL tutorials or the Red Book.
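
A quick way to check that before uploading, as a minimal sketch (is_pow2 and
texture_size_ok are made-up helper names, not SDL or OpenGL calls):

    #include <SDL/SDL.h>

    // a dimension is usable by legacy glTexImage2D (without borders)
    // only if it is a power of two: 1, 2, 4, ..., 256, 512, ...
    static bool is_pow2(int n)
    {
        return n > 0 && (n & (n - 1)) == 0;
    }

    static bool texture_size_ok(const SDL_Surface *image)
    {
        return is_pow2(image->w) && is_pow2(image->h);
    }

Call it right after IMG_Load() and either skip the texture or rescale it first;
gluBuild2DMipmaps accepts non-power-of-two sizes only because it rescales the
image internally, which is also why that variant of your loader "works".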
How to determine whether a surface has an alpha channel? I check
SDL_Surface->format->Amask (maybe that is not the proper way, but it works for
my purposes).
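
In code that might look like this; just a sketch, gl_format_for is a made-up
helper name, and it still assumes the surface's byte order really matches
GL_RGB/GL_RGBA (needs the SDL and <GL/gl.h> headers):

    // pick the OpenGL format from the surface itself instead of the file extension
    GLenum gl_format_for(const SDL_Surface *image)
    {
        return (image->format->Amask != 0) ? GL_RGBA : GL_RGB;
    }

    // usage inside the loader:
    //   GLenum fmt = gl_format_for(image);
    //   glTexImage2D(GL_TEXTURE_2D, 0, fmt, image->w, image->h, 0,
    //                fmt, GL_UNSIGNED_BYTE, image->pixels);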
Krata
Hi folks!
SDL works great, but I want to know whether the following is normal:
SDL_GL_SwapBuffers() pulls the framerate down by a remarkable amount!
idle (only the fps counter): ~34 000 FPS (I know these aren't real frames, just loops per second)
with SDL_GL_SwapBuffers(): ~1 700 FPS
Is this right?
If not, here are some specs:
screen = SDL_SetVideoMode(1024, 768, 32, SDL_OPENGL|SDL_RESIZABLE );
Linux, NVIDIA_glx, 1.4 GHz, GF4Ti
Are there any options to add, like SDL_SetVideoMode(x,x,x,…?);
or SDL_GL_SetAttribute(…?);
thx in advance!
--
funthing
When in a window, GL_SwapBuffers usually has to do actual blits, which
takes some time. Try it in fullscreen, with vsync disabled. (I don't
think SDL gives any way to set vsync, though, and I don't know how to
set it with glx.)
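
At 1 700 FPS each swap costs roughly 1/1700 s - 1/34000 s ≈ 0.56 ms, which
looks like copy overhead rather than a vsync cap, so the windowed-blit
explanation fits. For the fullscreen test, a minimal sketch of the changed
mode request (resolution and the other flags taken from your mail; assumes
SDL.h and <iostream> are already included):

    // same request as before, but fullscreen, so the swap can be a page
    // flip instead of a copy into the window
    SDL_Surface *screen = SDL_SetVideoMode(1024, 768, 32, SDL_OPENGL | SDL_FULLSCREEN);
    if(!screen)
        std::cerr << "SDL_SetVideoMode failed: " << SDL_GetError() << std::endl;

    // for vsync: the NVIDIA Linux driver is said to honour the environment
    // variable __GL_SYNC_TO_VBLANK (set it to 0 before starting the program);
    // that is a driver setting, not something SDL itself exposes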
--
Glenn Maynard