author     Georgi Kodinov <georgi.kodinov@oracle.com>   2014-04-10 13:18:32 +0300
committer  Georgi Kodinov <georgi.kodinov@oracle.com>   2014-04-10 13:18:32 +0300
commit     37b9a31a3095dd8f4a15b957f1c4b28fe4fab4ed (patch)
tree       ac8da13ae17e0a391e2727dabc8d8ccac5cc518c /sql/sql_string.h
parent     92351c831f7fefcbbd48c7e914225fdc55adad36 (diff)
download   mariadb-git-37b9a31a3095dd8f4a15b957f1c4b28fe4fab4ed.tar.gz
Bug #18359924: INNODB AND MYISAM CORRUPTION ON PREFIX INDEXES
The problem was in the validation of input data for blob types. When assigned binary data, the character blob types only checked whether the length of that data is a multiple of the minimum character length for the destination charset. Since e.g. UTF-8's minimum character length is 1 (because it is a variable-length encoding), even byte sequences that are invalid UTF-8 strings (e.g. with a wrong leading byte) were copied verbatim into UTF-8 columns when coming from binary strings or fields.

Storing invalid data in string columns had all kinds of ill effects on code that assumed the stored data was validly encoded to begin with.

Fixed by additionally checking the incoming binary string for validity when assigning it to a non-binary string column. Made sure that conversions to charsets with no known "invalid" byte ranges are not covered by the extra check.

Removed trailing spaces.

Test case added.
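For illustration only (this is not the server's code; the actual fix goes through the server's charset validation routines): a length-multiple check cannot reject malformed input, because with a minimum character length of 1 any byte count passes, while catching a bad leading or continuation byte requires walking the sequence. A minimal standalone sketch of such a walk, using the hypothetical helper name is_well_formed_utf8:

#include <cstddef>
#include <cstdint>

// Hypothetical helper: true only if buf[0..len) is well-formed UTF-8.
// Note that the pure length check the bug describes (len % 1 == 0)
// would accept any byte sequence at all.
static bool is_well_formed_utf8(const uint8_t *buf, size_t len)
{
  size_t i = 0;
  while (i < len)
  {
    uint8_t b = buf[i];
    size_t seq_len;
    if (b < 0x80)                seq_len = 1;  // ASCII
    else if ((b & 0xE0) == 0xC0) seq_len = 2;  // 110xxxxx
    else if ((b & 0xF0) == 0xE0) seq_len = 3;  // 1110xxxx
    else if ((b & 0xF8) == 0xF0) seq_len = 4;  // 11110xxx
    else return false;                         // invalid leading byte
    if (i + seq_len > len)
      return false;                            // truncated sequence
    for (size_t j = 1; j < seq_len; j++)
      if ((buf[i + j] & 0xC0) != 0x80)         // continuation must be 10xxxxxx
        return false;
    // (Overlong forms and surrogate ranges omitted here; a full
    //  validator such as the server's charset routines rejects those too.)
    i += seq_len;
  }
  return true;
}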
Diffstat (limited to 'sql/sql_string.h')
-rw-r--r--  sql/sql_string.h  3
1 file changed, 3 insertions(+), 0 deletions(-)
diff --git a/sql/sql_string.h b/sql/sql_string.h
index 234e8272b88..971af9ea91a 100644
--- a/sql/sql_string.h
+++ b/sql/sql_string.h
@@ -280,6 +280,9 @@ public:
   static bool needs_conversion(uint32 arg_length,
                                CHARSET_INFO *cs_from, CHARSET_INFO *cs_to,
                                uint32 *offset);
+  static bool needs_conversion_on_storage(uint32 arg_length,
+                                          CHARSET_INFO *cs_from,
+                                          CHARSET_INFO *cs_to);
   bool copy_aligned(const char *s, uint32 arg_length, uint32 offset,
                     CHARSET_INFO *cs);
   bool set_or_copy_aligned(const char *s, uint32 arg_length, CHARSET_INFO *cs);
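The header change only declares the new predicate; its intended semantics, as the commit message describes them, can be sketched with stand-in types (CharsetStub, its field names, and the function body below are assumptions for illustration, not the patch's implementation):

// Hedged sketch of the assumed decision: beyond what needs_conversion()
// reports, a binary source stored into a non-binary column whose charset
// has invalid byte sequences must go through the converting (and thus
// validating) copy path instead of being copied verbatim.
struct CharsetStub {
  bool is_binary;          // true for the BINARY pseudo-charset
  bool has_invalid_bytes;  // true for charsets like utf8 with invalid ranges
};

static bool needs_conversion_on_storage_stub(const CharsetStub &from,
                                             const CharsetStub &to)
{
  if (from.is_binary && !to.is_binary && to.has_invalid_bytes)
    return true;   // binary -> character data: validate on store
  return false;    // otherwise defer to the plain needs_conversion() logic
}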