sdl-ios-xcode: test/automated/common/common.h @ 5043:da347bfed240
Florian Forster to sdl:
In SDL 1.3 (revision 5508 from SVN), the method used to calculate the
bits per pixel from an “int format” differs between “SDL_ListModes” (which
always uses the “SDL_BITSPERPIXEL” macro) and “SDL_PixelFormatEnumToMasks”
(which uses either “SDL_BITSPERPIXEL” or “SDL_BYTESPERPIXEL * 8”,
depending on the value of “SDL_BYTESPERPIXEL”).
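
For illustration, a packed format such as “SDL_PIXELFORMAT_RGB888” stores
24 significant bits in 4 bytes, so the two methods disagree. The following
minimal sketch follows the description above; the exact branch condition
attributed to “SDL_PixelFormatEnumToMasks” is an assumption, not the
verbatim SDL source:

#include <stdio.h>
#include "SDL.h"

int main( void )
{
    Uint32 format = SDL_PIXELFORMAT_RGB888; /* 24 significant bits in 4 bytes */

    /* Method described for SDL_ListModes: always the encoded bit count. */
    int bpp_listmodes = SDL_BITSPERPIXEL( format );            /* -> 24 */

    /* Method described for SDL_PixelFormatEnumToMasks: fall back to
     * bytes * 8 for wider formats (cutoff assumed from the report). */
    int bpp_enumtomasks;
    if ( SDL_BYTESPERPIXEL( format ) <= 2 ) {
        bpp_enumtomasks = SDL_BITSPERPIXEL( format );
    } else {
        bpp_enumtomasks = SDL_BYTESPERPIXEL( format ) * 8;     /* -> 32 */
    }

    /* 24 != 32: a comparison mixing the two methods never matches. */
    printf( "ListModes: %d, EnumToMasks: %d\n", bpp_listmodes, bpp_enumtomasks );
    return 0;
}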
Because the values are later compared in “SDL_ListModes”, this may lead
to some valid video modes not being returned. In my case the only mode
reported by “SDL_GetNumDisplayModes” was dismissed and NULL was
returned. (This led to the calling application sticking its head in the
sand.)
The attached patch copies the method used within “SDL_PixelFormatEnumToMasks”
to “SDL_ListModes”. This solved the problem for me, though I don't
fully understand the method used by “SDL_PixelFormatEnumToMasks”.
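
A sketch of the described change, as it might look inside the
mode-filtering loop of “SDL_ListModes”; the loop structure and the
“SDL_GetDisplayMode” call are assumptions based on the SDL 1.3 API of the
time, not the patch itself:

/* Inside SDL_ListModes: compute each mode's depth the same way
 * SDL_PixelFormatEnumToMasks does before comparing. Surrounding
 * code is assumed, not taken from the attached patch. */
for ( i = 0; i < SDL_GetNumDisplayModes(); ++i ) {
    SDL_DisplayMode mode;
    int bpp;

    if ( SDL_GetDisplayMode( i, &mode ) < 0 ) {
        continue;
    }
    if ( SDL_BYTESPERPIXEL( mode.format ) <= 2 ) {
        bpp = SDL_BITSPERPIXEL( mode.format );
    } else {
        bpp = SDL_BYTESPERPIXEL( mode.format ) * 8;
    }
    if ( bpp == format->BitsPerPixel ) {
        /* depths now match for formats such as RGB888; keep this mode */
    }
}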
| author   | Sam Lantinga <slouken@libsdl.org> |
|----------|-----------------------------------|
| date     | Wed, 19 Jan 2011 16:06:47 -0800   |
| parents  | 2c07bb579922                      |
| children |                                   |
/**
 * Automated SDL test common framework.
 *
 * Written by Edgar Simo "bobbens"
 *
 * Released under Public Domain.
 */


#ifndef COMMON_H
# define COMMON_H


# define FORMAT SDL_PIXELFORMAT_ARGB8888
# define AMASK  0xff000000 /**< Alpha bit mask. */
# define RMASK  0x00ff0000 /**< Red bit mask. */
# define GMASK  0x0000ff00 /**< Green bit mask. */
# define BMASK  0x000000ff /**< Blue bit mask. */


typedef struct SurfaceImage_s {
   int width;
   int height;
   unsigned int bytes_per_pixel; /* 3:RGB, 4:RGBA */
   const unsigned char pixel_data[];
} SurfaceImage_t;

#define ALLOWABLE_ERROR_OPAQUE   0
#define ALLOWABLE_ERROR_BLENDED  64

/**
 * @brief Compares a surface and a surface image for equality.
 *
 *    @param sur Surface to compare.
 *    @param img Image to compare against.
 *    @param allowable_error Maximum per-pixel error allowed
 *           (see the ALLOWABLE_ERROR_* constants above).
 *    @return 0 if they are the same, -1 on error and positive if different.
 */
int surface_compare( SDL_Surface *sur, const SurfaceImage_t *img,
                     int allowable_error );


#endif /* COMMON_H */
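
For context, a hypothetical caller of the “surface_compare” API declared
above; the reference image symbol “img_reference” and the helper are
illustrative, not part of the actual test suite:

/* Hypothetical usage of surface_compare(): check a rendered surface
 * against a baked-in reference image using the tolerances above. */
extern SurfaceImage_t img_reference; /* illustrative reference image */

static int check_render( SDL_Surface *rendered )
{
    int ret = surface_compare( rendered, &img_reference,
                               ALLOWABLE_ERROR_OPAQUE );
    if (ret < 0) {
        /* the comparison itself failed */
    } else if (ret > 0) {
        /* surfaces differ beyond the allowed error */
    }
    return ret;
}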