# HG changeset patch
# User Sam Lantinga
# Date 1142437669 0
# Node ID ab1e4c41ab7118c9c1c5dbefcc5915a968a6452f
# Parent  98f9b16f565c9dbf13fea45faa242a4b61e1b067
Fixed bug #33

Mike Frysinger wrote:
> with libsdl-1.2.9, some games (like bomberclone) started
> segfaulting in Gentoo
[...snip...]
> the last change in the last hunk:
[...snip...]
> if i change the statement to read:
> (table[which].blit_features & GetBlitFeatures()) == GetBlitFeatures()
> bomberclone no longer segfaults on my box

Alex Volkov wrote:
> The test "(table[which].blit_features & GetBlitFeatures()) ==
> table[which].blit_features)" is correct, and the previous
> "(table[which].cpu_mmx == SDL_HasMMX())" was actually broken.

I think there is potentially a slightly different cause of the above
problem. During the introduction of the Altivec code, the blit_table
struct field 'alpha' got changed from a straightforward enum to a
bitmask, which makes perfect sense by itself. However, now the
table-driven blitter selection code in SDL_CalculateBlitN() can choose
the wrong blitters when searching for a NO_ALPHA blitter, because of
the following code:

    int a_need = 0;
    ...
    (a_need & table[which].alpha) == a_need &&

When searching through the normal_blit_2[] table, a SET_ALPHA blitter
(like Blit_RGB565_ARGB8888) can now be selected instead of a NO_ALPHA
one, causing alpha channel bits to appear in a non-alpha destination
surface. I suppose this could theoretically be an indirect cause of the
segfault mentioned above.

I *think* this can be fixed by changing to

    int a_need = NO_ALPHA;

diff -r 98f9b16f565c -r ab1e4c41ab71 src/video/SDL_blit_N.c
--- a/src/video/SDL_blit_N.c	Wed Mar 15 15:43:15 2006 +0000
+++ b/src/video/SDL_blit_N.c	Wed Mar 15 15:47:49 2006 +0000
@@ -2433,7 +2433,7 @@
 		}
 	} else {
 		/* Now the meat, choose the blitter we want */
-		int a_need = 0;
+		int a_need = NO_ALPHA;
 		if(dstfmt->Amask)
 		    a_need = srcfmt->Amask ? COPY_ALPHA : SET_ALPHA;
 		table = normal_blit[srcfmt->BytesPerPixel-1];