author    Jason Evans <jasone@canonware.com>  2016-11-03 22:21:34 -0700
committer Jason Evans <jasone@canonware.com>  2016-11-03 22:36:30 -0700
commit    32896a902bb962a06261d81c9be22e16210692db (patch)
tree      a007d0ae30e93f4f8c3497ff36adf62527731c9f
parent    e9012630acf897ce7016e427354bb46fbe893fe1 (diff)
Fix large allocation to search optimal size class heap.
Fix arena_run_alloc_large_helper() to not convert size to usize when searching for the first best fit via arena_run_first_best_fit(). This allows the search to consider the optimal quantized size class, so that e.g. allocating and deallocating 40 KiB in a tight loop can reuse the same memory.

This regression was nominally caused by 5707d6f952c71baa2f19102479859012982ac821 (Quantize szad trees by size class.), but it did not commonly cause problems until 8a03cf039cd06f9fa6972711195055d865673966 (Implement cache index randomization for large allocations.). These regressions were first released in 4.0.0.

This resolves #487.
 src/arena.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/arena.c b/src/arena.c
index a7fe34a9..43c3ccf2 100644
--- a/src/arena.c
+++ b/src/arena.c
@@ -1059,7 +1059,7 @@ arena_run_first_best_fit(arena_t *arena, size_t size)
static arena_run_t *
arena_run_alloc_large_helper(arena_t *arena, size_t size, bool zero)
{
- arena_run_t *run = arena_run_first_best_fit(arena, s2u(size));
+ arena_run_t *run = arena_run_first_best_fit(arena, size);
if (run != NULL) {
if (arena_run_split_large(arena, run, size, zero))
run = NULL;