Converting SDL_Overlay YUV->RGB + copy into arbitrary rendering buffer

Hello everyone,

I am trying to change the code of ffplay (the simple video player that
ships with the ffmpeg library) so that it converts the current video
frame, which is a YUV overlay, into RGB or RGBA format and copies it
into a buffer in system memory. That buffer is then used to
create/update an OpenGL texture, but in an Ada program which uses the
ffplay functionality as a library. So I only have that buffer and do
not want to mess around with OpenGL in the SDL part of the program.
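For illustration, updating the texture from that buffer on the OpenGL
side would be nothing more than a glTexSubImage2D call (tex_id, width,
height and rgb_buffer are just placeholder names):

#include <GL/gl.h>

/* Sketch: refresh an already-created RGBA texture from the
 * system-memory buffer that the SDL part fills in. */
static void update_texture(GLuint tex_id, int width, int height,
                           const void *rgb_buffer)
{
    glBindTexture(GL_TEXTURE_2D, tex_id);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgb_buffer);
}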

I played around with the different ways of creating surfaces, but so
far only SDL_SetVideoMode worked, in the sense that I could read out
the pixel buffer of the surface and copy it to the texture buffer. But
that way I get an additional window, which I don’t need.
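Just for illustration, the copy itself is essentially a lock and a
memcpy (texture_buffer is only a placeholder name for the destination
buffer):

#include <string.h>
#include "SDL.h"

/* Sketch: copy the pixels of the surface returned by SDL_SetVideoMode
 * into the system-memory buffer used for the texture. */
static void copy_surface_pixels(SDL_Surface *screen, void *texture_buffer)
{
    if (SDL_MUSTLOCK(screen))
        SDL_LockSurface(screen);

    /* pitch already includes any per-row padding */
    memcpy(texture_buffer, screen->pixels, (size_t)screen->h * screen->pitch);

    if (SDL_MUSTLOCK(screen))
        SDL_UnlockSurface(screen);
}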

It didn’t work with SDL_CreateRGBSurface() or SDL_CreateRGBSurfaceFrom().
When I used these two without calling SDL_SetVideoMode first, I did get
my surface, but calling SDL_DisplayYUVOverlay(vp->bmp, &rect) did not
update the pixel buffer of that surface.

The overlay is always created with the one surface that I get from one
of these three functions mentioned above.

The overlay is created like this:

vp->bmp = SDL_CreateYUVOverlay(is->video_st->codec->width,
is->video_st->codec->height, SDL_YV12_OVERLAY, screen);

The overlay definitely contains the current video frame (though not in
RGB of course).

So is there a way of calling SDL_SetVideoMode without creating a window?
The main window is already created in the Ada part, and I only need the
video image in a texture buffer anyway.

If not, since I already have the YUV overlay containing valid data, is
there a convenient way of converting it to RGB or RGBA into the buffer
where I need it? I think I cannot use SDL_DisplayYUVOverlay(vp->bmp,
&rect) because it calls a function pointer in the end:

return overlay->hwfuncs->Display(current_video, overlay, &src, &dst);

But it does nothing when I do not use SDL_SetVideoMode (which would give
me the additional window).

And I don’t have access to that part of the overlay struct, or at least
I shouldn’t; it’s not called “private_yuvhwfuncs” for nothing.

Any suggestions greatly appreciated, thanks in advance.

Kind regards,

Denis

On Thursday 07 September 2006 16:33, Mohnhaupt, Denis wrote:

I am trying to change the code of ffplay (the simple video player that
ships with the ffmpeg library) so that it converts the current video
frame, which is a YUV overlay, into RGB or RGBA format and copies it
into a buffer in system memory. That buffer is then used to
create/update an OpenGL texture, but in an Ada program which uses the
ffplay functionality as a library. So I only have that buffer and do
not want to mess around with OpenGL in the SDL part of the program.

You should modify ffplay to produce an RGB buffer for you instead. You
don’t need an SDL surface to do this.

Use img_convert() from libavcodec for that.

libav* API documentation:
http://cekirdek.pardus.org.tr/~ismail/ffmpeg-docs/index.html
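A minimal sketch of that approach, assuming the 2006-era libavcodec API
(the header path and the names frame_to_rgb and rgb_buffer are
illustrative, not part of ffplay):

#include <ffmpeg/avcodec.h>   /* header path varies between ffmpeg versions */

/* Sketch: convert a decoded YUV frame into a freshly allocated RGB24
 * buffer in system memory; the caller owns the returned buffer. */
static uint8_t *frame_to_rgb(AVFrame *src, AVCodecContext *ctx)
{
    int size = avpicture_get_size(PIX_FMT_RGB24, ctx->width, ctx->height);
    uint8_t *rgb_buffer = av_malloc(size);
    AVFrame *rgb = avcodec_alloc_frame();

    /* make the AVFrame's data/linesize pointers point into rgb_buffer */
    avpicture_fill((AVPicture *)rgb, rgb_buffer, PIX_FMT_RGB24,
                   ctx->width, ctx->height);

    /* CPU colour-space conversion: source pix_fmt -> packed RGB24 */
    img_convert((AVPicture *)rgb, PIX_FMT_RGB24,
                (AVPicture *)src, ctx->pix_fmt,
                ctx->width, ctx->height);

    av_free(rgb);        /* frees only the AVFrame struct, not rgb_buffer */
    return rgb_buffer;   /* hand this buffer to the Ada/OpenGL side */
}

(In later ffmpeg versions img_convert() was deprecated in favour of
sws_scale() from libswscale, but the idea is the same.)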

Even better is to write your own YUV-conversion fragment shader, which
will do the conversion on the GPU, saving a lot of CPU power.

Here is one which uses texture rectangles, but it is easy to modify it
to use normal textures. This example uses three textures, one for each
component, but it is quite trivial to make a one-texture version as
well.

char *FShader =
"uniform sampler2DRect Ytex;\n"
"uniform sampler2DRect Utex,Vtex;\n"
"void main(void) {\n"
"  float nx,ny,r,g,b,y,u,v;\n"
"  vec4 txl,ux,vx;\n"
"  nx=gl_TexCoord[0].x;\n"
"  ny=576.0-gl_TexCoord[0].y;\n"
"  y=texture2DRect(Ytex,vec2(nx,ny)).r;\n"
"  u=texture2DRect(Utex,vec2(nx/2.0,ny/2.0)).r;\n"
"  v=texture2DRect(Vtex,vec2(nx/2.0,ny/2.0)).r;\n"
"\n"
"  y=1.1643*(y-0.0625);\n"
"  u=u-0.5;\n"
"  v=v-0.5;\n"
"\n"
"  r=y+1.5958*v;\n"
"  g=y-0.39173*u-0.81290*v;\n"
"  b=y+2.017*u;\n"
"\n"
"  gl_FragColor=vec4(r,g,b,1.0);\n"
"}\n";
