author     David Mitchell <davem@iabyn.com>   2011-05-10 17:24:29 +0100
committer  David Mitchell <davem@iabyn.com>   2011-05-19 14:49:43 +0100
commit     ee872193302939c724fd6c2c18071c621bfac6c4 (patch)
tree       9649f1d0a1d70dd68afd4642f6b90f78fb405ff0 /hv.c
parent     272e8453abcb0fceb34b1464670386e03a1f55bb (diff)
remove 'hfreeentries failed to free hash' panic
Currently perl attempts to clear a hash 100 times before panicking. So,
for example, if a naughty destructor keeps adding entries back into the
hash, perl will eventually panic.
Note that this can usually only occur with %h=() or undef(%h), since when
a hash is being freed, there is usually no reference to it left that a
destructor could use to mess with it.
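As a hedged illustration (not part of the commit itself), a destructor of
the kind described above might look like this; the package name Naughty
and the keys are invented, but the panic message is the one removed by
the patch:

    # Sketch only: a DESTROY method that re-populates the hash while
    # %h = () is clearing it.  Before this patch, perl gave up after
    # 100 sweeps with "panic: hfreeentries failed to free hash";
    # after it, perl keeps clearing (here, forever -- don't run this
    # outside a sandbox).
    package Naughty;
    sub new     { bless {}, shift }
    sub DESTROY { $main::h{ rand() } = Naughty->new }  # re-create an entry

    package main;
    our %h = (start => Naughty->new);
    %h = ();   # each freed value's DESTROY adds a fresh entry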
Remove this limit (so it may potentially loop forever).
My reasoning is that (a) if the user wants to keep adding things back into
the hash, who are we to stop her? and (b) as part of the process of making
sv_clear() non-recursive when freeing hashes, I'm trying to reduce the
amount of state that must be maintained between each iteration.
Note that arrays currently don't have a limit.
Diffstat (limited to 'hv.c')
 hv.c | 7 +------
 1 file changed, 1 insertion(+), 6 deletions(-)
@@ -1630,7 +1630,6 @@ S_clear_placeholders(pTHX_ HV *hv, U32 items)
 STATIC void
 S_hfreeentries(pTHX_ HV *hv)
 {
-    int attempts = 100;
     STRLEN i = 0;
     const bool mpm = PL_phase != PERL_PHASE_DESTRUCT && HvENAME(hv);
 
@@ -1689,12 +1688,8 @@ S_hfreeentries(pTHX_ HV *hv)
 	     * re-allocated, HvMAX changed etc */
 	    continue;
 	}
-	if (i++ >= HvMAX(hv)) {
+	if (i++ >= HvMAX(hv))
 	    i = 0;
-	    if (--attempts == 0) {
-		Perl_die(aTHX_ "panic: hfreeentries failed to free hash - something is repeatedly re-creating entries");
-	    }
-	}
     } /* while */
 }
 