SDL: opengl: use GL_UNSIGNED_BYTE instead of GL_UNSIGNED_INT_8_8_8_8_REV.

From 6934c910b3c32273857908988352c0481da7a0b9 Mon Sep 17 00:00:00 2001
From: "Ryan C. Gordon" <[EMAIL REDACTED]>
Date: Tue, 7 Jan 2025 16:08:56 -0500
Subject: [PATCH] opengl: use GL_UNSIGNED_BYTE instead of
 GL_UNSIGNED_INT_8_8_8_8_REV.

This seems to be significantly more efficient on some modern platforms, but if
this turns out to be a widespread disaster, we can revert it.
---
 src/render/opengl/SDL_render_gl.c | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/render/opengl/SDL_render_gl.c b/src/render/opengl/SDL_render_gl.c
index fcd7aead219e7..64164a6d8bc36 100644
--- a/src/render/opengl/SDL_render_gl.c
+++ b/src/render/opengl/SDL_render_gl.c
@@ -410,13 +410,13 @@ static bool convert_format(Uint32 pixel_format, GLint *internalFormat, GLenum *f
     case SDL_PIXELFORMAT_XRGB8888:
         *internalFormat = GL_RGBA8;
         *format = GL_BGRA;
-        *type = GL_UNSIGNED_INT_8_8_8_8_REV;
+        *type = GL_UNSIGNED_BYTE; // previously GL_UNSIGNED_INT_8_8_8_8_REV, seeing if this is better in modern times.
         break;
     case SDL_PIXELFORMAT_ABGR8888:
     case SDL_PIXELFORMAT_XBGR8888:
         *internalFormat = GL_RGBA8;
         *format = GL_RGBA;
-        *type = GL_UNSIGNED_INT_8_8_8_8_REV;
+        *type = GL_UNSIGNED_BYTE; // previously GL_UNSIGNED_INT_8_8_8_8_REV, seeing if this is better in modern times.
         break;
     case SDL_PIXELFORMAT_YV12:
     case SDL_PIXELFORMAT_IYUV: