Wrong color format when using sdl2.ext.pixels2d

Issue #103 invalid
Arnaud Durand created an issue

When using sdl2.ext.pixels2d instead of sdl2.ext.PixelView, colors created with sdl2.ext.Color() end up with the wrong format.

From my experiments, every color channel is shifted by one byte:

    PixelView    pixels2d
    red          ?
    green        red
    blue         green
    alpha        blue

I created the surface and the pixel view using:

    import sdl2
    import sdl2.ext

    surface_p = sdl2.SDL_CreateRGBSurface(0, X_RES, Y_RES, 32, 0, 0, 0, 0)
    pixelview = sdl2.ext.pixels2d(surface_p.contents)
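
Since all four masks are passed as zero, SDL picks the pixel layout itself; reading the masks back from the surface's pixel format shows which byte each channel ended up in (a quick sketch continuing the snippet above):

    # Continues the snippet above; the SDL_PixelFormat attached to the
    # surface records which byte each channel actually occupies.
    fmt = surface_p.contents.format.contents
    print("Rmask = 0x%08X" % fmt.Rmask)
    print("Gmask = 0x%08X" % fmt.Gmask)
    print("Bmask = 0x%08X" % fmt.Bmask)
    print("Amask = 0x%08X" % fmt.Amask)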

Comments (3)

  1. Marcus von Appen (repo owner)

    The surface does not feature an alpha channel, and you also leave it up to the SDL library to choose the pixel layout. This most likely results in the following bitmasks for your surface:

    ALPHA = 0x00000000
    RED   = 0x00FF0000
    GREEN = 0x0000FF00
    BLUE  = 0x000000FF
    

    What does this mean for the byte layout on direct access? My assumption is that SDL uses an XRGB packed order on your system, i.e. an ARGB layout in which the A byte is unused, since you do not request an alpha channel. pixels2d() does not guarantee any particular order; it merely provides direct memory buffer access through numpy, so it is up to the caller to use the appropriate conversion routine to get the correct color.
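
    As a rough sketch of such a conversion (assuming the surface_p and pixels2d array from the report above), SDL_GetRGBA() takes a raw pixel value together with the surface's pixel format and returns the individual channels, whatever packed order SDL chose:

    import ctypes
    import sdl2
    import sdl2.ext

    pixels = sdl2.ext.pixels2d(surface_p.contents)
    raw = int(pixels[0, 0])  # packed pixel value; the layout depends on the surface

    # Let SDL unpack the value according to the surface's own format.
    r, g, b, a = [ctypes.c_ubyte(0) for _ in range(4)]
    sdl2.SDL_GetRGBA(raw, surface_p.contents.format,
                     ctypes.byref(r), ctypes.byref(g),
                     ctypes.byref(b), ctypes.byref(a))
    print(r.value, g.value, b.value, a.value)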

    The PixelView uses the helper function prepare_color, which, based on the surface's pixel format information, converts the raw pixel data to the correct color.

    Did you use a proper conversion routine to get the correct color from the numpy array (pixels2d())?
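
    For the writing direction, the same idea applies; a minimal sketch (again assuming surface_p from the report) that uses prepare_color() to turn a Color into the surface's own pixel value before storing it in the numpy array:

    import sdl2
    import sdl2.ext

    pixels = sdl2.ext.pixels2d(surface_p.contents)

    # prepare_color() maps the Color onto the surface's pixel layout, so the
    # resulting integer can be written into the array as-is.
    red = sdl2.ext.Color(255, 0, 0)
    pixels[:, :] = sdl2.ext.prepare_color(red, surface_p.contents)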
