Texture Streaming vs. Target

Hello,

I have a series of 8-bit greyscale images that I render to a texture - like a video. What I have is
a block of unsigned char memory for the pixels.

The only way I can see to do this is to create an RGB24 texture (SDL_TEXTUREACCESS_STREAMING),
then use SDL_LockTexture() and SDL_UnlockTexture() and convert the greyscale pixels to RGB24.
Now I would like to draw on the texture using SDL_RenderDrawLine() (and others) and SDL2_gfx.
However, to do so I need to set the texture as the renderer’s target, which requires the texture
to be created with SDL_TEXTUREACCESS_TARGET. When I do this, SDL_LockTexture() complains that the
texture is not streaming.
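To make that concrete, the per-frame work inside the lock/unlock pair is just a byte expansion. Here is a minimal sketch of that inner loop (the buffer names are made up; in the real code, dst and dst_pitch would be the pixels pointer and pitch returned by SDL_LockTexture()):

```c
#include <stdint.h>
#include <stddef.h>

/* Expand an 8-bit greyscale buffer into a packed RGB24 buffer.
 * Each grey byte becomes an identical R, G, B triple. Pitches are in
 * bytes, so rows with padding are handled correctly. */
static void grey8_to_rgb24(const uint8_t *src, int src_pitch,
                           uint8_t *dst, int dst_pitch,
                           int w, int h)
{
    for (int y = 0; y < h; y++) {
        const uint8_t *s = src + (size_t)y * src_pitch;
        uint8_t *d = dst + (size_t)y * dst_pitch;
        for (int x = 0; x < w; x++) {
            uint8_t g = s[x];
            d[3 * x + 0] = g;   /* R */
            d[3 * x + 1] = g;   /* G */
            d[3 * x + 2] = g;   /* B */
        }
    }
}
```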

So, my questions are:

  1. What is the correct way to render 8-bit greyscale images? In SDL 1.2 I use
    SDL_CreateRGBSurfaceFrom() to create a surface with a depth of 8, then blit that surface
    onto the main display surface (usually having a depth of 24 or 32, RGB/A). I tried doing the
    same in SDL2, but when I created a texture from the 8-bit surface, rendering that texture to my
    main texture (not the default render target) would either fail or leave the main texture all
    white. The main texture was configured as RGB24.

  2. Does it make sense to want a streaming texture that can also be a render target? If not, please
    clarify.

  3. Does it make sense that a target texture can be locked, or is there something fundamental that I
    am not understanding (very likely!)?

Many thanks.

Regards,

Alvin

Try using SDL_UpdateTexture() to update the texture without locking:

  1. Convert grayscale image to 24 bit using SDL_SurfaceConvert.
  2. Use grayscale image to SDL_UpdateTexture()
  3. Draw on the new texture afterwards.

It might work, it might not. Depends on whether SDL_UpdateTexture() works
with RenderTarget textures.
Do let us know if it does work!



SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org


Pallav Nawani
IronCode Gaming Private Limited
Website: http://www.ironcode.com
Twitter: http://twitter.com/Ironcode_Gaming
Facebook: http://www.facebook.com/Ironcode.Gaming
Mobile: 9997478768

Sorry, that would be:
2. Use 24 bit image to SDL_UpdateTexture()


Hello Pallav,

SDL_SurfaceConvert seems like it would solve my issue; however, I cannot seem to find it. I can’t
find it in the online wiki (http://wiki.libsdl.org/CategoryAPI#SDL_2.0_API_by_Name), nor can I find
it in the headers. I am using SDL2 2.0.1 on openSUSE 12.3.

Do you know where I can find SDL_SurfaceConvert?

Thanks,

Alvin


Never mind, I found it. It’s SDL_ConvertSurface :wink:

Cheers,

Alvin


My attempts to use SDL_ConvertSurface are failing for some reason. Here’s an example of what I am
doing. This is horribly inefficient (memory leaks, etc.); I am just experimenting at first.

// f is a struct holding my raw 8-bit greyscale frame.
// f->pixels is an array of unsigned char
// f->w is the width (columns): 640
// f->h is the height (rows): 320
// f->d is the depth (bits per pixel): 8
SDL_Surface *s8 = SDL_CreateRGBSurfaceFrom(f->pixels, f->w, f->h, f->d, f->pitch,
                                           0, 0, 0, 0);

// texture (below) has been created using format SDL_PIXELFORMAT_RGB24
SDL_Surface *s24 = SDL_ConvertSurfaceFormat(s8, SDL_PIXELFORMAT_RGB24, 0);

// texture is the main texture that will be rendered
SDL_UpdateTexture(texture, NULL, s24->pixels, s24->pitch);

SDL_RenderCopy(renderer, texture, NULL, NULL);

SDL_RenderPresent(renderer);

With the above order of operations, the result displays an all white texture.

When creating the 8-bit surface (s8) using any masks (other than 0’s) results in NULL being
returned. Looking at the source (SDL_surface.c), it looks like the call to
SDL_MasksToPixelFormatEnum() returns SDL_PIXELFORMAT_UNKNOWN for a depth of 8 and masks other than
all 0’s. When I use all 0’s for masks, s8->format->format is SDL_PIXELFORMAT_INDEX8.

Also, the Remarks in the wiki state[1]:

“If depth is 4 or 8 bits, an empty palette is allocated for the surface.”

Perhaps this is new in SDL2? I’m not sure how “an empty palette” affects things. My code to create an
8-bit surface in SDL 1.2 is essentially the same, except I specify masks = {0xff, 0xff, 0xff, 0}.

I’m wondering if the lack of an 8-bit greyscale PixelFormatEnum is the issue?

For me, the real issue is being able to draw on a texture (SDL_RenderDrawLine, SDL2_gfx, etc.),
which I cannot do if the texture access is SDL_TEXTUREACCESS_STREAMING, which is required to
convert 8-bit to 24-bit.

[1] http://wiki.libsdl.org/SDL_CreateRGBSurface#Remarks

Thanks,

Alvin

Looks like the “empty palette” was the issue. I took a look at SDL2_image and how it handles 8-bit
greyscale images, in particular PGM. What I was missing was initialising the palette entries.

I ripped this from IMG_pnm.c:

SDL_Color *c = s8->format->palette->colors;

for (int i = 0; i < 256; i++)
    c[i].r = c[i].g = c[i].b = i;

s8->format->palette->ncolors = 256;

Now SDL_ConvertSurface() works and the main texture displays correctly and can be drawn on.

In fact, I now pass my 8-bit surface (with a properly initialised palette) to
SDL_CreateTextureFromSurface() and copy that texture onto my main texture:

SDL_Texture *t8 = SDL_CreateTextureFromSurface(renderer, s8);

SDL_SetRenderTarget(renderer, texture);
SDL_RenderCopy(renderer, t8, NULL, NULL);
SDL_SetRenderTarget(renderer, NULL);

I guess now the question is: Within the context of video playback, which is better in terms of
efficiency, updating my main texture with the pixels from a 24-bit surface created by calling
SDL_ConvertSurface() or creating a texture directly from my 8-bit surface and copying that via
SDL_RenderCopy()?

Cheers,

Alvin


Measure it! :wink:

Jonny D



Using a texture takes 0-2 ms, whereas using a surface takes 3-6 ms on my desktop. I’m thinking that
the texture implementation uses the GPU to do the 8-bit to 24-bit conversion, whereas the surface
implementation uses the CPU.

My quick experiment used a reusable SDL_Surface to hold the 8-bit greyscale pixels. Using
SDL_GetTicks(), it seems pretty clear that, on my system, using:

SDL_Texture *t8 = SDL_CreateTextureFromSurface(renderer, surf8);

SDL_SetRenderTarget(renderer, texture);
SDL_RenderCopy(renderer, t8, NULL, NULL);
SDL_SetRenderTarget(renderer, NULL);

SDL_DestroyTexture(t8);

is much faster than using:

SDL_Surface *surf24 = SDL_ConvertSurfaceFormat(surf8, SDL_PIXELFORMAT_RGB24, 0);
SDL_SetSurfaceBlendMode(surf24, SDL_BLENDMODE_NONE);

SDL_UpdateTexture(texture, NULL, surf24->pixels, surf24->pitch);

SDL_FreeSurface(surf24);

My target platforms are a desktop and Android. I still need to test on Android, but I expect
comparable results.

Cheers,

Alvin

Are you able to call SDL_UpdateTexture() on a Render Target texture? If so
the problem is solved, the rest are mere details. So test that first.

s8 is an 8-bit surface, but s24 has three color components: R, G, and B. A color palette is an
indexed table which maps each 8-bit value to a corresponding RGB value. For YOUR PARTICULAR CASE,
you can construct it like so (there are better ways to generate the palette entries):

SDL_Palette aPalette;
aPalette.ncolors = 256;
aPalette.colors = SDL_calloc(aPalette.ncolors, sizeof(SDL_Color));
for (int i = 0; i < aPalette.ncolors; i++) {
    aPalette.colors[i].r = i;
    aPalette.colors[i].g = i;
    aPalette.colors[i].b = i;
    aPalette.colors[i].a = 255;
}

Now attach aPalette to s8 before calling SDL_ConvertSurfaceFormat().

Better still is to use the API functions SDL_AllocPalette() and SDL_SetPaletteColors(), but the
code I gave should explain what is going on here.
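To see why the palette matters, the conversion can be sketched without SDL at all: an INDEX8 to RGB24 conversion is essentially a per-pixel palette lookup. The types below are stand-ins for illustration, not the real SDL internals:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Stand-in palette entry mirroring SDL_Color's r/g/b/a fields. */
typedef struct { uint8_t r, g, b, a; } GreyPalEntry;

/* What an INDEX8 -> RGB24 conversion boils down to: each source byte
 * is an index into the palette, and the palette entry supplies the RGB
 * triple. With a palette left at its default contents (memset to 0xFF),
 * every index maps to white, which shows up as an all-white texture. */
static void index8_to_rgb24(const uint8_t *src, uint8_t *dst, size_t n,
                            const GreyPalEntry pal[256])
{
    for (size_t i = 0; i < n; i++) {
        dst[3 * i + 0] = pal[src[i]].r;
        dst[3 * i + 1] = pal[src[i]].g;
        dst[3 * i + 2] = pal[src[i]].b;
    }
}

/* Fill the greyscale ramp described above: entry i is (i, i, i). */
static void fill_grey_ramp(GreyPalEntry pal[256])
{
    for (int i = 0; i < 256; i++) {
        pal[i].r = pal[i].g = pal[i].b = (uint8_t)i;
        pal[i].a = 255;
    }
}
```

With the ramp, index i comes out as the grey triple (i, i, i); with the default all-0xFF palette, every pixel comes out white.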


Hello Pallav,

I solved my problem in a previous email in this thread. As you mentioned, the issue was that the
palette was not set correctly. I haven’t used colour palettes before; I basically ignored them until now.

I do appreciate the exposure to SDL_AllocPalette() and SDL_SetPaletteColors(). I looked at the
source for SDL_CreateRGBSurface (I really use SDL_CreateRGBSurfaceFrom, but that ultimately calls
SDL_CreateRGBSurface).

It appears that SDL_CreateRGBSurface calls SDL_AllocPalette which, in turn, allocates a palette of
256 colours that is memset to white (0xFF). This would explain why I was getting an all-white texture.

For my purposes, I think I will stick with assigning the colour palette in a for-loop rather than
using SDL_SetPaletteColors since the palette has already been allocated. I admit though, this relies
on the implementation rather than the API. I suspect I will regret this sometime in the future ;).

The Remarks in the docs for SDL_CreateRGBSurface[1] are a little misleading though. For depth 8, I
would say the palette is not empty but rather initialised to all white. To me, “empty” implies NULL
or uninitialised. Admittedly, this could just be due to my limited exposure to colour palettes.

Thanks for your help.

Cheers,

Alvin


I guess that if your source is GRAY8 you can also allocate your texture as YUV and pass it without
any conversion, giving your pixel data as the Y component and passing fixed zeroed (or neutral, i.e.
memset to 127) U and V components to SDL_UpdateYUVTexture().

This way you’ll avoid the conversion and you will also transfer less data to the GFX card (the U and
V planes are 4 times smaller than the Y plane).

Bye,
Gabry
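To sketch what that upload looks like: for a w x h YV12 texture the Y plane is w*h bytes and the U and V planes are (w/2)*(h/2) bytes each. A minimal, SDL-free helper that builds the three planes from a greyscale buffer (the names are made up; the real code would hand the planes and pitches to SDL_UpdateYUVTexture(), and 128 rather than 127 is the exact neutral chroma midpoint, though both look neutral):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* The three planes of a YV12 frame built from GRAY8 input. */
typedef struct {
    uint8_t *y, *u, *v;
    int y_pitch, uv_pitch;
} Yv12Planes;

/* Copy the grey pixels straight into the Y plane (assumes the source
 * is tightly packed, pitch == w) and fill U and V with the neutral
 * chroma value so the result renders as pure greyscale. */
static Yv12Planes make_grey_yv12(const uint8_t *grey, int w, int h)
{
    Yv12Planes p;
    size_t uv_size = (size_t)(w / 2) * (h / 2);
    p.y_pitch  = w;
    p.uv_pitch = w / 2;
    p.y = malloc((size_t)w * h);
    p.u = malloc(uv_size);
    p.v = malloc(uv_size);
    memcpy(p.y, grey, (size_t)w * h);  /* greyscale is the Y plane */
    memset(p.u, 128, uv_size);         /* neutral chroma */
    memset(p.v, 128, uv_size);
    return p;
}
```

The total upload is w*h*3/2 bytes, half of the w*h*3 needed for RGB24.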

That is a great idea! I will have to try that. Thanks!

Cheers,

Alvin


Hello Gabry,

That was a brilliant idea. My (unscientific) testing shows that the update to the screen is
0-1 ms, where 1 ms is very rare - mostly 0 ms!

The only issue I have is that I cannot draw on the YUV texture. When I try to draw (after calling
SDL_UpdateYUVTexture()), I get the following error message from SDL2 when I set the texture as the
renderer’s target (e.g. SDL_SetRenderTarget(renderer, texture)):

“glFramebufferTexture2DEXT() failed”.

I create my texture using:

SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_YV12,
                                         SDL_TEXTUREACCESS_TARGET, w, h);

Perhaps I’ve created my texture wrong?

Cheers,

Alvin


“glFramebufferTexture2DEXT() failed”.

There is a chance that a YUV texture cannot be a render target… I have never used a YUV texture
that way; usually, to write over a video texture, I use an additional texture with a transparent
background, not a render target…

Bye,
Gabry

I suspect it cannot be done. I posted the question as a new message to the list as it didn’t fit my
original topic.

Just as an FYI, I decided to go with two textures: one RGBA8888 and one YUV. The RGBA texture is the
main texture that is displayed on the screen. I update the YUV texture with the GREY8 pixels and
faux-neutral UV, then set the RGBA texture as the renderer’s target and call SDL_RenderCopy() to put
the YUV texture onto the RGBA texture. This seems to work well. My timings still show 0-1 ms (a
debug build running under gdb via KDevelop 4), with more 0s than 1s.

Thanks for the help.

Cheers,

Alvin
