I’ve been trying for hours to figure out why this piece of code is giving me a segfault, with no luck.
void pixel::Texture::LoadTexture(const char* filepath)
{
    SDL_Surface* image = IMG_Load(filepath);
    SDL_Surface* formattedImage = SDL_ConvertSurfaceFormat(image, SDL_PIXELFORMAT_ARGB8888, 0);

    textureBuffer = new Uint32[formattedImage->pitch * formattedImage->h];
    width = formattedImage->w;
    height = formattedImage->h;
    pitch = formattedImage->pitch;

    SDL_LockSurface(formattedImage);
    Uint32* pixels = (Uint32*)formattedImage->pixels;
    int bpp = formattedImage->format->BytesPerPixel;

    for(int y = 0; y < height; y++)
        for(int x = 0; x < width; x++)
            textureBuffer[(formattedImage->pitch * y) + (x * bpp)] = pixels[(formattedImage->pitch * y) + (x * bpp)];

    SDL_UnlockSurface(formattedImage);
    SDL_FreeSurface(image);
    SDL_FreeSurface(formattedImage);
}
Essentially, I have an array of unsigned 32-bit integers called the texture buffer. All I do is load an image into a surface, convert its pixel format to ARGB, and then read each pixel and store it in my texture buffer array. But for some reason the pixels array goes out of bounds at some point at runtime, and I can’t figure out exactly why.