How to use GLESv2 functions with SDL2?

Hello,

I tried to use GLESv2 with SDL2. It works (a little) with my integrated
Intel chipset, but not at all (segfault) with my Nvidia GeForce 460.

I’m using Arch Linux as my OS, with the proprietary Nvidia drivers (the
latest, or perhaps two months old).

So I tried to use GLESv2 the “SDL2 way” and looked in the SDL2 test
directory to see how it is done.

I’ve understood that every OpenGL function needs to be re-declared.

So I tried, and now it segfaults everywhere!

#include <SDL2/SDL.h>
#include <SDL2/SDL_opengles2.h>

typedef struct GLES2_Context
{
#define SDL_PROC(ret,func,params) ret (APIENTRY *func) params;
// SDL_gles2funcs.h is taken from the SDL2 sources;
// it is NOT installed in /usr/include/SDL2 by default.
#include "SDL_gles2funcs.h"
#undef SDL_PROC
} GLES2_Context;

GLES2_Context ctx; // not to be confused with SDL_GLContext

If I look at ctx with gdb, the core dump is easy to understand:

(gdb) print ctx
$1 = {glActiveTexture = 0x0, glAttachShader = 0x0, glBindAttribLocation
= 0x0, glBindTexture = 0x0, glBlendFuncSeparate = 0x0, glClear = 0x0,
glClearColor = 0x0, glCompileShader = 0x0, glCreateProgram = 0x0,
glCreateShader = 0x0, ... (and so on to the end of the struct)

Can you help me, please?

Thank you.

You need to declare the GLESv2 entry points as function pointers and load
each one using SDL_GL_GetProcAddress().



SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

2014-04-30 7:08 GMT-03:00 jseb:

> Hello,
>
> I tried to use GLESv2 with SDL2. It works (a little) with my integrated
> Intel chipset, but not at all (segfault) with my Nvidia GeForce 460.

Are you linking your app to libGL.so, libEGL.so, etc.? Nvidia ships their
own libGL.so which is incompatible with Mesa’s (and bundles the
functionality of other libraries!), so this would explain why it works on
Intel and not on Nvidia.
The best way to handle this is to not link to libGL.so (etc.) at build time
and instead load it dynamically (which is not the same as “re-declaring”
the functions). That means opening the libGL.so file at runtime and picking
up the symbols you want from there, which is what some of SDL’s test
programs do.

This ensures you are using the runtime libGL.so and friends, as opposed to
the compile-time libGL.so, which is whatever the build system picked up when
you built your application (and yes, they can be different even on the same
system, especially when the Nvidia binaries are present).

To complicate things further, you are using GLES2, so depending on the
platform you are using either EGL (Mesa-abiding platforms like Intel) or
GLX (“f…developers”-abiding platforms like Nvidia :) ) to create the
context.

--
Gabriel.