Problem with SDL_BlitSurface

Hello. I’m new to SDL and I’m trying to load a font from a BMP and split
it into individual characters to be used as textures in OpenGL. The BMP
has one monospace character after another and loads fine, but when I try
to split it up by creating a surface for each character then blitting
from the big bitmap onto each small bitmap, SDL_BlitSurface returns -1.
Below is the code I’m using. If anyone can help me out it would be much
appreciated. The “cfg” struct holds basic configuration information, and
its members should be self-explanatory. Thanks.

SDL_Surface** LoadFont(char* filename, int char_w, int char_h)
	throw (std::bad_alloc)
{
	SDL_Surface* all = SDL_LoadBMP(filename);
	if (!all)
		return 0;

	SDL_Surface** font = new SDL_Surface*[NUM_FONT_CHARS];

	Uint32 flags = cfg.has_hw_surfaces ? SDL_HWSURFACE : SDL_SWSURFACE;
	Uint32 rmask;
	Uint32 gmask;
	Uint32 bmask;
	Uint32 amask;

	/* SDL interprets the masks in machine byte order. */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
	rmask = 0xff000000;
	gmask = 0x00ff0000;
	bmask = 0x0000ff00;
	amask = 0x000000ff;
#else
	rmask = 0x000000ff;
	gmask = 0x0000ff00;
	bmask = 0x00ff0000;
	amask = 0xff000000;
#endif

	for (int i = 0; i < NUM_FONT_CHARS; i++) {
		font[i] = SDL_CreateRGBSurface(flags, char_w, char_h,
				cfg.bpp, rmask, gmask, bmask, amask);
		if (!font[i]) {
			/* clean up everything allocated so far */
			for (int j = 0; j < i; j++)
				SDL_FreeSurface(font[j]);
			delete[] font;
			SDL_FreeSurface(all);
			return 0;
		}
	}

	SDL_Rect dstrect = { 0, 0, char_w, char_h };
	SDL_Rect srcrect = { 0, 0, char_w, char_h };

	for (int i = 0; i < NUM_FONT_CHARS; i++) {
		int ret;
		if ((ret = SDL_BlitSurface(all, &srcrect, font[i], &dstrect))
				< 0)
			printf("blit failed: ret=%d i=%d\n", ret, i);
		srcrect.x += char_w;
	}

	SDL_FreeSurface(all);
	return font;
}


Sorry folks, I have found the problem. It was a stupid mistake on my
part. Sorry to bother you.