author     Ivan Maidanski <ivmai@mail.ru>  2023-03-25 13:00:08 +0300
committer  Ivan Maidanski <ivmai@mail.ru>  2023-04-12 11:54:22 +0300
commit     274e5ced83975df4df880b7d6c32061fa1c67985 (patch)
tree       3a11c54a4090a962f448e32c2ce6ab4437f4abb9 /malloc.c
parent     76795d27d5679105f5dccbbdb58b334299e6c8cd (diff)
download   bdwgc-274e5ced83975df4df880b7d6c32061fa1c67985.tar.gz
Do not add extra byte to large ignore-off-page objects
For ignore-off-page objects the client should guarantee a pointer within
the first heap block of the object, thus there is no need to add an extra
byte for such objects if the object size is at least one heap block.
* allchblk.c (setup_header): Add assertion that byte_sz is not less
than ALIGNMENT.
* allchblk.c [ALIGNMENT>GC_DS_TAGS] (setup_header): Modify descr local
variable to make it zero if IGNORE_OFF_PAGE flag is set and kind is
NORMAL (and object size is not less than HBLKSIZE); add comment.
* mallocx.c [ALIGNMENT>GC_DS_TAGS] (GC_realloc): Likewise.
* include/gc/gc.h (GC_all_interior_pointers): Update comment.
* include/private/gc_priv.h [MAX_EXTRA_BYTES==0] (ADD_EXTRA_BYTES):
Define as no-op.
* malloc.c (GC_generic_malloc_inner): Define lb_adjusted local
variable; pass lb_adjusted to GC_alloc_large_and_clear().
* malloc.c [MAX_EXTRA_BYTES>0] (GC_generic_malloc_inner): Set
lb_adjusted to lb if IGNORE_OFF_PAGE flag is set and lb is not less
than HBLKSIZE.
* malloc.c [MAX_EXTRA_BYTES>0] (GC_generic_malloc_aligned): Set
lb_rounded without EXTRA_BYTES added (and compute lg based on
lb_rounded) if IGNORE_OFF_PAGE is set and lb is not less than HBLKSIZE.
* mallocx.c (GC_realloc): Define ok local variable.
* typd_mlc.c (GC_malloc_explicitly_typed_ignore_off_page): Remove
lb_adjusted local variable; call GC_malloc_explicitly_typed() if
lb is smaller than HBLKSIZE-sizeof(word), otherwise pass lb plus
sizeof(word) (instead of lb plus TYPD_EXTRA_BYTES) to
GC_generic_malloc_aligned; add comment.
Diffstat (limited to 'malloc.c')
-rw-r--r--  malloc.c  31
1 file changed, 27 insertions, 4 deletions
@@ -191,13 +191,24 @@ STATIC void * GC_generic_malloc_inner_small(size_t lb, int k)
 
 GC_INNER void * GC_generic_malloc_inner(size_t lb, int k, unsigned flags)
 {
+  size_t lb_adjusted;
+
   GC_ASSERT(I_HOLD_LOCK());
   GC_ASSERT(k < MAXOBJKINDS);
   if (SMALL_OBJ(lb)) {
     return GC_generic_malloc_inner_small(lb, k);
   }
-  return GC_alloc_large_and_clear(ADD_EXTRA_BYTES(lb), k, flags);
+# if MAX_EXTRA_BYTES > 0
+    if ((flags & IGNORE_OFF_PAGE) != 0 && lb >= HBLKSIZE) {
+      /* No need to add EXTRA_BYTES. */
+      lb_adjusted = lb;
+    } else
+# endif
+  /* else */ {
+    lb_adjusted = ADD_EXTRA_BYTES(lb);
+  }
+  return GC_alloc_large_and_clear(lb_adjusted, k, flags);
 }
 
 #ifdef GC_COLLECT_AT_MALLOC
@@ -229,9 +240,21 @@ GC_INNER void * GC_generic_malloc_aligned(size_t lb, int k, unsigned flags,
     size_t lb_rounded;
     GC_bool init;
 
-    if (EXPECT(0 == lb, FALSE)) lb = 1;
-    lg = ALLOC_REQUEST_GRANS(lb);
-    lb_rounded = GRANULES_TO_BYTES(lg);
+#   if MAX_EXTRA_BYTES > 0
+      if ((flags & IGNORE_OFF_PAGE) != 0 && lb >= HBLKSIZE) {
+        /* No need to add EXTRA_BYTES. */
+        lb_rounded = ROUNDUP_GRANULE_SIZE(lb);
+#       ifdef THREADS
+          lg = BYTES_TO_GRANULES(lb_rounded);
+#       endif
+      } else
+#   endif
+    /* else */ {
+      if (EXPECT(0 == lb, FALSE)) lb = 1;
+      lg = ALLOC_REQUEST_GRANS(lb);
+      lb_rounded = GRANULES_TO_BYTES(lg);
+    }
+
     init = GC_obj_kinds[k].ok_init;
     if (EXPECT(align_m1 < GRANULE_BYTES, TRUE)) {
       align_m1 = 0;