SDL_opengl Shows Linker Error (Clang++ / M1)

I just got my M1 MacBook yesterday, and I'm trying to port the projects I made for Windows to Mac.
After many trials and a lot of research, I was able to build projects using SDL and SDL_image on the M1 MacBook.
However, I wasn't able to build projects that use SDL_opengl.h and SDL_opengl_glext.h.
By the way, this is my first time using Clang and VSCode, so I could be doing something wrong in the setup, so I'll attach my commands.


Here is the error I got, and this is my clang++ command from the Code Runner extension in Visual Studio Code:

	cd $dir && clang++ -std=c++17 *.cpp -arch x86_64 -I/Library/Frameworks/SDL2.framework/Headers -I/Library/Frameworks/SDL2_image.framework/Headers -F/Library/Frameworks -framework SDL2 -framework SDL2_image -o $fileNameWithoutExt && $dir$fileNameWithoutExt

Try adding -framework Foundation and/or -framework OpenGL
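
For reference, the link line would then look something like this — just your original command with the OpenGL framework appended (add Foundation as well if the linker still complains):

	cd $dir && clang++ -std=c++17 *.cpp -arch x86_64 -I/Library/Frameworks/SDL2.framework/Headers -I/Library/Frameworks/SDL2_image.framework/Headers -F/Library/Frameworks -framework SDL2 -framework SDL2_image -framework OpenGL -o $fileNameWithoutExt && $dir$fileNameWithoutExt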

Thank you! It works!!
But I got a new error. I tried to find a solution for this one, but I wasn’t able to.
Do you know how to fix this one? I guess Mac does not support GLSL version 330…

Personally no, but Google found this which looks relevant. It suggests you need:

	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

How are you setting up your OpenGL context?

macOS Monterey absolutely supports OpenGL 3.3; it supports everything up to OpenGL 4.1, AFAIK. Make sure you’re actually requesting (and getting!) an OpenGL 3.3 graphics context.

Here’s how my test program did it:

	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
	// This is important on macOS
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
	// specify depth buffer, etc
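
If you want to double-check what you were actually given, you can read the attributes back once the context exists. A minimal sketch (call it after SDL_GL_CreateContext() has succeeded):

	// Query the version of the context that was actually created
	int major = 0, minor = 0;
	SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &major);
	SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &minor);
	std::cout << "Got GL context " << major << "." << minor << std::endl;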

Assuming you aren’t hitting a driver bug specific to M1 macs…

Still, I see this error…
What could be the problem?


This is how I set OpenGL up.

Where are you actually getting an OpenGL context from? I see you’re setting some OpenGL attributes, but not creating the context.

Also, why are you setting the SDL_Renderer driver hint?

Somewhere after you’ve initialized OpenGL but before you load your shaders, print the GL_VERSION string, something like:

	// glGetString() returns const GLubyte*, so cast it for printing
	const char *glVersion = (const char *)glGetString(GL_VERSION);
	std::cout << "GL_VERSION: " << glVersion << std::endl;

Does it say you have an OpenGL 3.3 (or later) context?
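
It can also be worth printing the GLSL version the driver advertises, since that’s what your shader’s #version directive is checked against. A small sketch along the same lines:

	const char *glslVersion = (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION);
	std::cout << "GLSL_VERSION: " << glslVersion << std::endl;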

Uhh…


2.1 Metal… Oh God. What’s going on.

You have an OpenGL 2.1 context (the part about Metal just means that Apple’s OpenGL driver is implemented with Metal), not an OpenGL 3.3 one.

So, how are you creating that OpenGL context? In the code you posted earlier you set some GL attributes, but there’s nothing about how you’re actually creating the context.

Taking another look at the error messages in your first post, it looks like you’re trying to use SDL_Renderer but with custom shaders.

SDL_Renderer's OpenGL backend is written for OpenGL 2.1, so it creates an OpenGL 2.1 context, and that’s why your shaders aren’t working. The #version 330 line says the shader is GLSL 3.3, which is for OpenGL 3.3. If it worked on Windows, it’s probably because of an overly permissive driver, whereas Apple’s OpenGL driver is more strict.

#include "Shader.h"

void Shader::init(SDL_Renderer* renderer) {
	glCreateShader = (PFNGLCREATESHADERPROC)SDL_GL_GetProcAddress("glCreateShader");
	glShaderSource = (PFNGLSHADERSOURCEPROC)SDL_GL_GetProcAddress("glShaderSource");
	glCompileShader = (PFNGLCOMPILESHADERPROC)SDL_GL_GetProcAddress("glCompileShader");
	glGetShaderiv = (PFNGLGETSHADERIVPROC)SDL_GL_GetProcAddress("glGetShaderiv");
	glGetShaderInfoLog = (PFNGLGETSHADERINFOLOGPROC)SDL_GL_GetProcAddress("glGetShaderInfoLog");
	glDeleteShader = (PFNGLDELETESHADERPROC)SDL_GL_GetProcAddress("glDeleteShader");
	glAttachShader = (PFNGLATTACHSHADERPROC)SDL_GL_GetProcAddress("glAttachShader");
	glCreateProgram = (PFNGLCREATEPROGRAMPROC)SDL_GL_GetProcAddress("glCreateProgram");
	glLinkProgram = (PFNGLLINKPROGRAMPROC)SDL_GL_GetProcAddress("glLinkProgram");
	glValidateProgram = (PFNGLVALIDATEPROGRAMPROC)SDL_GL_GetProcAddress("glValidateProgram");
	glGetProgramiv = (PFNGLGETPROGRAMIVPROC)SDL_GL_GetProcAddress("glGetProgramiv");
	glGetProgramInfoLog = (PFNGLGETPROGRAMINFOLOGPROC)SDL_GL_GetProcAddress("glGetProgramInfoLog");
	glUseProgram = (PFNGLUSEPROGRAMPROC)SDL_GL_GetProcAddress("glUseProgram");
	glUniform1i = (PFNGLUNIFORM1IPROC)SDL_GL_GetProcAddress("glUniform1i");
	glUniform1f = (PFNGLUNIFORM1FPROC)SDL_GL_GetProcAddress("glUniform1f");
	glGetUniformLocation = (PFNGLGETUNIFORMLOCATIONPROC)SDL_GL_GetProcAddress("glGetUniformLocation");
	glActiveTexture = (PFNGLACTIVETEXTUREPROC)SDL_GL_GetProcAddress("glActiveTexture");
	glTexStorage3D = (PFNGLTEXSTORAGE3DPROC)SDL_GL_GetProcAddress("glTexStorage3D");

	lut = IMG_LoadTexture(renderer, "data/shader/image/lut.png");
	light = IMG_LoadTexture(renderer, "data/shader/image/light.png");
	border = IMG_LoadTexture(renderer, "data/shader/image/border.png");
	sunlight = IMG_LoadTexture(renderer, "data/shader/image/sunlight.png");

	loadShader("data/shader/shader.vs", "data/shader/1.fs");

	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
	SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);

	std::cout << "GL Version : " << (const char*)glGetString(GL_VERSION) << std::endl;
	shaderBuffer = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 1920, 1080);
}

void Shader::loadShader(std::string vtxFile, std::string fragFile) {
	GLuint vtxShaderId, fragShaderId;

	programId = glCreateProgram();

	std::ifstream f(vtxFile);
	std::string source((std::istreambuf_iterator<char>(f)),
		std::istreambuf_iterator<char>());
	vtxShaderId = compileShader(source.c_str(), GL_VERTEX_SHADER);

	f = std::ifstream(fragFile);
	source = std::string((std::istreambuf_iterator<char>(f)),
		std::istreambuf_iterator<char>());
	fragShaderId = compileShader(source.c_str(), GL_FRAGMENT_SHADER);

	if (vtxShaderId && fragShaderId) {
		// Associate shader with program
		glAttachShader(programId, vtxShaderId);
		glAttachShader(programId, fragShaderId);
		glLinkProgram(programId);
		glValidateProgram(programId);

		// Check the status of the compile/link
		GLint logLen;
		glGetProgramiv(programId, GL_INFO_LOG_LENGTH, &logLen);
		if (logLen > 0) {
			char* log = (char*)malloc(logLen * sizeof(char));
			// Show any errors as appropriate
			glGetProgramInfoLog(programId, logLen, &logLen, log);
			std::cout << "Prog Info Log: " << std::endl << log << std::endl;
			free(log);
		}
	}
	if (vtxShaderId) {
		glDeleteShader(vtxShaderId);
	}
	if (fragShaderId) {
		glDeleteShader(fragShaderId);
	}
}

void Shader::render(SDL_Window* window, SDL_Renderer* renderer, SDL_Texture* buffer, float cameraY, float greyScaleAmount, float transition) {
	GLint oldProgramId;
	SDL_SetRenderTarget(renderer, shaderBuffer);
	//SDL_SetRenderTarget(renderer, NULL);

	if (programId != 0) {
		glGetIntegerv(GL_CURRENT_PROGRAM, &oldProgramId);
		glUseProgram(programId);
	}

	glEnable(GL_TEXTURE_2D);
	glActiveTexture(GL_TEXTURE0);

	SDL_GL_BindTexture(buffer, NULL, NULL);
	glUniform1i(glGetUniformLocation(programId, "texture0"), 0);

	glActiveTexture(GL_TEXTURE1);
	SDL_GL_BindTexture(lut, NULL, NULL);
	glUniform1i(glGetUniformLocation(programId, "lut"), 1);

	glActiveTexture(GL_TEXTURE2);
	SDL_GL_BindTexture(light, NULL, NULL);
	glUniform1i(glGetUniformLocation(programId, "light"), 2);

	glActiveTexture(GL_TEXTURE3);
	SDL_GL_BindTexture(border, NULL, NULL);
	glUniform1i(glGetUniformLocation(programId, "border"), 3);

	glActiveTexture(GL_TEXTURE4);
	SDL_GL_BindTexture(sunlight, NULL, NULL);
	glUniform1i(glGetUniformLocation(programId, "sunlight"), 4);
	
	glUniform1f(glGetUniformLocation(programId, "cameraY"), cameraY);

	glUniform1f(glGetUniformLocation(programId, "greyScaleAmount"), greyScaleAmount);

	glUniform1f(glGetUniformLocation(programId, "progress"), transition);

	GLfloat minx, miny, maxx, maxy;
	GLfloat minu, maxu, minv, maxv;

	// Window coordinates to draw into.
	minx = 0.0f;
	miny = 0.0f;
	maxx = 1920;
	maxy = 1080;

	minu = 0.0f;
	maxu = 1.0f;
	minv = 0.0f;
	maxv = 1.0f;

	glBegin(GL_TRIANGLE_STRIP);
	glTexCoord2f(minu, minv);
	glVertex2f(minx, miny);
	glTexCoord2f(maxu, minv);
	glVertex2f(maxx, miny);
	glTexCoord2f(minu, maxv);
	glVertex2f(minx, maxy);
	glTexCoord2f(maxu, maxv);
	glVertex2f(maxx, maxy);
	glEnd();
	SDL_GL_SwapWindow(window);

	SDL_GL_UnbindTexture(buffer);
	SDL_GL_UnbindTexture(lut);

	if (programId != 0) {
		glUseProgram(oldProgramId);
	}

	SDL_SetRenderTarget(renderer, NULL);

	drect = { 0, 0, 1920, 1080 };
	SDL_RenderCopy(renderer, shaderBuffer, NULL, &drect);
}

GLuint Shader::compileShader(const char* source, GLuint shaderType) {
	// Create ID for shader
	GLuint result = glCreateShader(shaderType);
	// Define shader text
	glShaderSource(result, 1, &source, NULL);
	// Compile shader
	glCompileShader(result);

	//Check vertex shader for errors
	GLint shaderCompiled = GL_FALSE;
	glGetShaderiv(result, GL_COMPILE_STATUS, &shaderCompiled);
	if (shaderCompiled != GL_TRUE) {
		GLint logLength;
		glGetShaderiv(result, GL_INFO_LOG_LENGTH, &logLength);
		if (logLength > 0)
		{
			GLchar* log = (GLchar*)malloc(logLength);
			glGetShaderInfoLog(result, logLength, &logLength, log);
			std::cout << "Shader compile log:" << log << std::endl;
			free(log);
		}
		glDeleteShader(result);
		result = 0;
	}
	return result;
}

I know it’s a lot to post my whole code here, but…
I don’t think I have any other choice. If you have some time, could you check my code? I have those SDL_GL_SetAttribute calls in my main code as well.

This doesn’t show where you’re actually creating an OpenGL context, but it turns out that’s because you aren’t. Using SDL_Renderer to create an OpenGL context like you’re doing is the problem.

It isn’t going to matter what attributes you set via SDL_GL_SetAttribute(): SDL_Renderer sets its own attributes right before it creates its OpenGL 2.1 context, and once the context exists, changing the attributes has no effect.

The problem, as stated by the shader compiler, is that you can’t use GLSL 3.3 (meant for OpenGL 3.3) shaders with an OpenGL 2.1 context. You’ll need to make whatever alterations are necessary for the shader to work with OpenGL 2.1, including changing the version number specified at the top of the shader to #version 120 (GLSL for OpenGL 2.1 is 1.20; GLSL version numbers didn’t start matching the OpenGL version number until OpenGL 3.3).

Might be better in the long run to create the OpenGL context yourself so you’ll have control over it. Like (see the sketch after this list):

Init SDL
Set OpenGL attributes with SDL_GL_SetAttribute()
Create window
Create OpenGL context with SDL_GL_CreateContext()
Load function pointers
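
A minimal sketch of that flow, assuming SDL2 (the window title, size, and error handling are just placeholders):

	#include <SDL.h>   // or <SDL2/SDL.h>, depending on your include paths
	#include <iostream>

	int main(int, char **) {
		if (SDL_Init(SDL_INIT_VIDEO) != 0) {
			std::cerr << "SDL_Init failed: " << SDL_GetError() << std::endl;
			return 1;
		}

		// Ask for a 3.3 core profile *before* creating the window and context
		SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
		SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
		SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
		SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);

		SDL_Window *window = SDL_CreateWindow("GL test",
			SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
			1280, 720, SDL_WINDOW_OPENGL);
		if (!window) {
			std::cerr << "SDL_CreateWindow failed: " << SDL_GetError() << std::endl;
			return 1;
		}

		SDL_GLContext context = SDL_GL_CreateContext(window);
		if (!context) {
			std::cerr << "SDL_GL_CreateContext failed: " << SDL_GetError() << std::endl;
			return 1;
		}

		// Load function pointers here, either by hand with SDL_GL_GetProcAddress()
		// or with a loader like GLAD (see below), then compile your shaders.

		SDL_GL_DeleteContext(context);
		SDL_DestroyWindow(window);
		SDL_Quit();
		return 0;
	}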

And I’d also highly recommend something like GLAD to load the function pointers instead of doing it yourself.
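
With GLAD, the whole “load function pointers” step collapses to a single call after the context is created. A sketch, assuming a GLAD 1.x loader generated for a 3.3 core profile (GLAD 2 names the function gladLoadGL instead):

	#include <glad/glad.h>  // generated loader; include before any other GL headers

	// ... after SDL_GL_CreateContext() has succeeded:
	if (!gladLoadGLLoader((GLADloadproc)SDL_GL_GetProcAddress)) {
		SDL_Log("Failed to load OpenGL function pointers");
	}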

And I think I mentioned this in an earlier reply, but if this same program with the same shader worked on Windows, it was probably due to an overly permissive OpenGL driver (Nvidia is known for an OpenGL driver that lets you get away with stuff that isn’t supposed to work), whereas Apple’s OpenGL implementation is more strict.