Even though they probably have one of the larger user bases on that distribution and architecture, they may not have the resources to tackle this issue. It probably requires tracking down bugs in LLVM, or building and testing newer versions, and LLVM is rather large.
Maybe downgrading to the old LLVM version would help? I haven’t tried that on Debian yet.
I’m currently developing my app on Debian 9 Stretch on a laptop, and not a powerful one (an MSI CR70).
Anyway, SDL2 runs in accelerated mode, so I have no performance issues to complain about in my particular case.
Are there simple tests I could run to give you some information, to check whether the problem really concerns Stretch or the Raspberry Pi distro?
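(For reference, the quick check I use to confirm that SDL2 really picked an accelerated renderer is to ask SDL_GetRendererInfo right after creating the renderer; a minimal sketch, with an arbitrary window size and an assumed sdl2-config build line:)

/* Quick check of which SDL2 renderer backend was picked.
 * Build line is an assumption: gcc check.c `sdl2-config --cflags --libs` */
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    SDL_Window *win = SDL_CreateWindow("renderer check",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 320, 240, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

    SDL_RendererInfo info;
    if (ren && SDL_GetRendererInfo(ren, &info) == 0) {
        printf("video driver: %s\n", SDL_GetCurrentVideoDriver());
        printf("renderer    : %s\n", info.name);
        printf("accelerated : %s\n",
               (info.flags & SDL_RENDERER_ACCELERATED) ? "yes" : "no");
    }

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}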
X-Video Extension version 2.2
screen #0
  Adaptor #0: "GLAMOR Textured Video"
    number of ports: 16
    port base: 117
    operations supported: PutImage
    supported visuals:
      depth 24, visualID 0x21
    number of attributes: 5
      "XV_BRIGHTNESS" (range -1000 to 1000)
        client settable attribute
        client gettable attribute (current value is 0)
      "XV_CONTRAST" (range -1000 to 1000)
        client settable attribute
        client gettable attribute (current value is 0)
      "XV_SATURATION" (range -1000 to 1000)
        client settable attribute
        client gettable attribute (current value is 0)
      "XV_HUE" (range -1000 to 1000)
        client settable attribute
        client gettable attribute (current value is 0)
      "XV_COLORSPACE" (range 0 to 1)
        client settable attribute
        client gettable attribute (current value is 0)
    maximum XvImage size: 8192 x 8192
    Number of image formats: 2
      id: 0x32315659 (YV12)
        guid: 59563132-0000-0010-8000-00aa00389b71
        bits per pixel: 12
        number of planes: 3
        type: YUV (planar)
      id: 0x30323449 (I420)
        guid: 49343230-0000-0010-8000-00aa00389b71
        bits per pixel: 12
        number of planes: 3
        type: YUV (planar)
I too have seen an extreme performance drop from SDL1 to SDL2 on the Raspberry Pi. I’m using an emulator of the Atari ST (a 4 MHz M68k computer) that can use either SDL1 or SDL2 (a compile-time option), and I have not found any SDL compile-time option or similar setting that can “make” SDL2 as fast as SDL1.
I’ve tried both the old firmware-side driver and the new one from Anholt.
Oh, no. This is just a software renderer we’re talking about. Testing the issue probably requires building Mesa and/or LLVM, which are rather large projects.
If you got one of the hardware-accelerated renderers running, you’re all good.
What interface does SDL1 use to show stuff on screen? /dev/fb0 or a dispmanx element? Such an emulator usually likes to access pixels directly and the current renderers are not optimized for that. OpenGL ES might just be the wrong thing here. A dumb KMS buffer could be better (something I was planning on testing), but that’s currently not implemented in the new KMSDRM driver in SDL2.
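For reference, the usual SDL2 render-API path for an emulator that fills its own pixel buffer is a streaming texture that gets locked, copied into and presented every frame; a rough sketch, with the buffer size, pixel format and names purely illustrative:

/* Rough sketch of the typical SDL2 path for an emulator that writes its own
 * pixel buffer each frame. Assumes an existing SDL_Renderer *ren and a
 * 640x400 ARGB framebuffer; all names here are illustrative. */
#include <SDL.h>
#include <string.h>

#define EMU_W 640
#define EMU_H 400

static SDL_Texture *make_frame_texture(SDL_Renderer *ren)
{
    /* A streaming texture is the intended way to push CPU-side pixels
     * through the SDL2 render API. */
    return SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                             SDL_TEXTUREACCESS_STREAMING, EMU_W, EMU_H);
}

static void present_frame(SDL_Renderer *ren, SDL_Texture *tex,
                          const Uint32 *emu_pixels)
{
    void *dst;
    int pitch;
    if (SDL_LockTexture(tex, NULL, &dst, &pitch) == 0) {
        /* Copy row by row because the texture pitch may differ from EMU_W*4. */
        for (int y = 0; y < EMU_H; y++)
            memcpy((Uint8 *)dst + y * pitch,
                   emu_pixels + y * EMU_W, EMU_W * sizeof(Uint32));
        SDL_UnlockTexture(tex);
    }
    SDL_RenderClear(ren);
    SDL_RenderCopy(ren, tex, NULL, NULL);
    SDL_RenderPresent(ren);
}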
Good question, ChliHug. I don’t quite know, but I know for sure that it’s not dispmanx, as the emulator works out of the box on both Linux and OSX on other architectures. If you want, you can check it out yourself: https://hg.tuxfamily.org/mercurialroot/hatari/hatari
I also want to try out the new experimental vc4 driver, but I only have Raspbian Jessie. When I open raspi-config on my Jessie system, there is no “GL (Full KMS)” option under “Advanced Options/GL Driver”. Instead, it just asks me whether I want to enable or disable the experimental OpenGL driver. When I enable it, SDL_GetCurrentVideoDriver() still returns “x11”, but it is noticeably faster. Does this mean that SDL is now using the new vc4 driver, or could it be that it is using its standard OpenGL for X11 driver? Is there any way I can check that SDL is using the new vc4 driver?
That’s odd. Are you sure you have the latest version? dpkg -s raspi-config | grep Version shows “20170705” for me.
X is probably using the vc4 driver. There are a few ways you can check things.
If the vc4 kernel module has been loaded: lsmod | grep vc4 should show a vc4 line.
If the vc4 driver is active: the file /dev/dri/card0 should exist.
If X found the card0 file: run grep /dev/dri /var/log/Xorg.0.log and it should say whether it managed to open it or not.
Which OpenGL renderer is being used: run glxinfo | grep string and it should say “OpenGL renderer string: Gallium 0.4 on VC4 V3D 2.1”.
You can’t directly test whether SDL is using it because everything goes through X11. Unless you create your own OpenGL context, but that should show the same renderer string as above.
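If you do want to run that check from inside an SDL program, something along these lines should print the same renderer string (an untested sketch; it assumes linking against libGL and skips error handling):

/* Sketch: create a GL context through SDL and print the renderer string.
 * With the vc4 driver active it should mention VC4, as in the glxinfo output. */
#include <SDL.h>
#include <SDL_opengl.h>
#include <stdio.h>

int main(void)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("gl check",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        320, 240, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    printf("video driver: %s\n", SDL_GetCurrentVideoDriver());
    printf("GL renderer : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL vendor   : %s\n", (const char *)glGetString(GL_VENDOR));

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}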
I can confirm that you will see the video driver as ‘x11’ and the renderer as ‘opengl’ irrespective of whether you are using the Mesa software driver or the VC4 accelerated driver. I don’t know of any direct way you can tell the difference via SDL; here all I see is a big increase in the rendering speed (much bigger on Raspbian Stretch than on Jessie).
Please be aware that (unless it has been fixed very recently) enabling the VC4 driver in Jessie breaks sound output via the HDMI port. The sound is there, but it is grossly distorted. This is fixed in Stretch, but my understanding is that it won’t be fixed in Jessie. If you’re not using HDMI sound this probably isn’t an issue for you.