Dear SDL aficionados,
I have the following function, LoadImage, which is called whenever I load an
image. It uses SDL_image to load the file and returns a pointer to the
resulting surface. I then pass that surface pointer to OptimizeImage, which
converts the image to the display's pixel depth. However, once I do this,
drawing is often slower by a considerable margin. This is with the latest
.10 release of SDL. Can someone tell me why this might occur?
Function LoadImage:Byte Ptr(file:String, blend:Int)
	Local img:Byte Ptr = IMG_Load(file)
	If img = Null Then Return Null ' IMG_Load failed (bad path or format)
	Return OptimizeImage(img, blend)
End Function
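For context, a typical call site looks like this (the filename is made up, and
ALPHABLEND is one of my wrapper's blend-mode constants):

```
' Hypothetical usage; "player.png" and ALPHABLEND are just examples.
Local sprite:Byte Ptr = LoadImage("player.png", ALPHABLEND)
If sprite = Null Then Print "Could not load player.png"
```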
Function OptimizeImage:Byte Ptr(image:Byte Ptr, mode:Int)
	Select mode
		Case ALPHABLEND ' Convert, keeping the per-pixel alpha channel
			Local tempImage:Byte Ptr = SDL_DisplayFormatAlpha(image)
			SDL_FreeSurface(image)
			sw.SetBlend(tempImage, ALPHABLEND)
			Return tempImage
		Case MASKBLEND ' Convert with colour-key masking
			Local tempImage:Byte Ptr = SDL_DisplayFormat(image)
			SDL_FreeSurface(image)
			sw.SetBlend(tempImage, MASKBLEND)
			Return tempImage
		Case SOLIDBLEND ' Convert with solid (opaque) blitting
			Local tempImage:Byte Ptr = SDL_DisplayFormat(image)
			SDL_FreeSurface(image)
			sw.SetBlend(tempImage, SOLIDBLEND)
			Return tempImage
	End Select
	Return image ' Unknown mode: hand the surface back unconverted
End Function
(Note: SetBlend, below, often slows things down too. I took Gabriel Gambetta's
advice and tried calling SDL_SetAlpha just before drawing instead, but that
slows things down as well!)
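To pin down where the time goes, I have been measuring with a rough sketch
along these lines (it assumes the wrapper exposes SDL_BlitSurface and the
display surface pointer, here called screen; "test.png" is a stand-in name):

```
' Rough timing sketch: blit the raw and the converted surface 1000
' times each and compare. 'screen' is assumed to be the display surface.
Local raw:Byte Ptr = IMG_Load("test.png")
Local opt:Byte Ptr = SDL_DisplayFormatAlpha(raw)

Local t:Int = MilliSecs()
For Local i:Int = 0 Until 1000
	SDL_BlitSurface(raw, Null, screen, Null)
Next
Print "unconverted: " + (MilliSecs() - t) + "ms"

t = MilliSecs()
For Local i:Int = 0 Until 1000
	SDL_BlitSurface(opt, Null, screen, Null)
Next
Print "converted:   " + (MilliSecs() - t) + "ms"
```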
Function SetBlend(image:Byte Ptr, mode:Int)
	If mode = ALPHABLEND
		' Alpha blending with RLE acceleration
		SDL_SetAlpha(image, SDL_SRCALPHA | SDL_RLEACCEL, 255)
	Else
		' No alpha blending; keep RLE acceleration
		SDL_SetAlpha(image, SDL_RLEACCEL, 255)
	EndIf
End Function
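For reference, this is roughly what I tried for the "set alpha just before
drawing" suggestion (DrawBlended is a hypothetical helper name, and screen is
again the display surface from the wrapper):

```
' Hypothetical helper: set the alpha flags right before the blit
' instead of once at load time. Note there is no SDL_RLEACCEL here;
' as I understand it, changing the flags on an RLE-encoded surface
' can force a costly re-encode on every call.
Function DrawBlended(image:Byte Ptr, screen:Byte Ptr, alpha:Int)
	SDL_SetAlpha(image, SDL_SRCALPHA, alpha)
	SDL_BlitSurface(image, Null, screen, Null)
End Function
```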