From 19ebba03117aefc9d0312f675f3a210ffdcc4907 Mon Sep 17 00:00:00 2001
From: Robert Ancell
Date: Wed, 2 Feb 2022 14:03:13 +1300
Subject: Fix the check for maximum value of LZW initial code size.

This value is the number of bits used for each symbol (i.e. colour index)
decoded via LZW. The maximum LZW code is specified as 12 bits, so the value
here can be at most 11, because two additional code words (clear and end of
information) are required, which immediately consume an additional bit.

This implementation has always been wrong, and the Firefox implementation has
the same issue, so it seems to be a common misinterpretation of the spec. The
check has been changed here to avoid an assertion failure later in the LZW
decoder.

Note that there is never any reason for a GIF to be encoded with more than 8
bits of colour information, as the colour tables only support up to 8 bits.
---
 gdk-pixbuf/io-gif.c | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

(limited to 'gdk-pixbuf')

diff --git a/gdk-pixbuf/io-gif.c b/gdk-pixbuf/io-gif.c
index 1befba155..310bdff6a 100644
--- a/gdk-pixbuf/io-gif.c
+++ b/gdk-pixbuf/io-gif.c
@@ -499,8 +499,8 @@ gif_prepare_lzw (GifContext *context)
                 /*g_message (_("GIF: EOF / read error on image data\n"));*/
                 return -1;
         }
-
-        if (context->lzw_set_code_size > 12) {
+
+        if (context->lzw_set_code_size >= 12) {
                 g_set_error_literal (context->error,
                                      GDK_PIXBUF_ERROR,
                                      GDK_PIXBUF_ERROR_CORRUPT_IMAGE,
-- 
cgit v1.2.1