| Commit message | Author | Age | Files | Lines |
| |
It's important that, when -finfo-table-map is enabled, we generate
IPE entries only for those info tables which are actually used. To this
end, the info tables which are used are collected just before code
generation starts and entries are created only for those tables.
Not accounted for in this scheme was the dead code elimination in the
native code generator. When compiling GHC, this optimisation removed an
info table which had an IPE entry, resulting in the following kind of
linker error:
```
/home/matt/ghc-with-debug/_build/stage1/lib/../lib/x86_64-linux-ghc-9.3.20210928/libHSCabal-3.5.0.0-ghc9.3.20210928.so: error: undefined reference to '.Lc5sS_info'
/home/matt/ghc-with-debug/_build/stage1/lib/../lib/x86_64-linux-ghc-9.3.20210928/libHSCabal-3.5.0.0-ghc9.3.20210928.so: error: undefined reference to '.Lc5sH_info'
/home/matt/ghc-with-debug/_build/stage1/lib/../lib/x86_64-linux-ghc-9.3.20210928/libHSCabal-3.5.0.0-ghc9.3.20210928.so: error: undefined reference to '.Lc5sm_info'
collect2: error: ld returned 1 exit status
`cc' failed in phase `Linker'. (Exit code: 1)
Development.Shake.cmd, system command failed
```
Unfortunately, by the time this optimisation happens the structure of
the CmmInfoTable has been lost: we only have the generated code for the
info table to work with, so we can no longer simply collect all the used
info tables and generate the IPE map.
This leaves us with two options:
1. Return a list of the names of the discarded info tables and then
remove them from the map. This is awkward because we need to do code
generation for the map as well.
2. Just disable this small code size optimisation when -finfo-table-map
is enabled. The option produces very big object files anyway.
Option 2 is much easier to implement and means we don't have to thread
information around awkwardly, at the cost of slightly larger object
files (as dead code is not eliminated).
Disabling this optimisation allows an IPE build of GHC to complete
successfully.
Fixes #20428
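As a minimal sketch of what option 2 amounts to (all names here are hypothetical, not GHC's actual native code generator code), the dead-code pass simply becomes the identity when the info-table map is requested:
```haskell
-- Hypothetical illustration only; the real NCG is far more involved.
data NcgConfig = NcgConfig
  { ncgInfoTableMap :: Bool  -- ^ assumed to mirror -finfo-table-map
  }

-- | Drop unreachable blocks, unless the IPE map may still refer to the
-- labels of "dead" info tables, in which case keep everything (option 2).
eliminateDeadCode :: NcgConfig -> ([block] -> [block]) -> [block] -> [block]
eliminateDeadCode cfg dropUnreachable blocks
  | ncgInfoTableMap cfg = blocks
  | otherwise           = dropUnreachable blocks
```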
| |
Closes #20013
| |
The tests Capi_Ctype_001, Capi_Ctype_002 and T12010 pass regularly on CI,
so let's mark them as unbroken; hopefully we can then fix #20013.
| |
The current code assumes the non-moving generation is always
generation 1, but this isn't the case if the number of generations
is greater than 2.
Fixes #20461
| |
Previous attempts at fixing #11547 and #20455 were reverted because they
showed some quadratic behaviour, and the test case T14052 was added to
catch that.
I believe that similar quadratic behaviour can be triggered with current
master by using type definitions rather than value definitions, so this
adds a test case similar to T14052. I have hopes that my attempts at
fixing #11547 will lead to code that avoids the quadratic increase here.
Or not, we will see. In any case, having this in `master` and included
in future comparisons will be useful.
| |
Before this patch, plugin units were linked with the target code even
when the unit was passed via `-plugin-package`. This is an issue for
supporting plugins in cross-compilers (plugins are definitely not ABI
compatible with target code).
We now clearly separate unit dependencies for plugins and unit
dependencies for target code and only link the latter ones.
We've also added a test to ensure that plugin units passed via
`-package` are linked with target code so that `thNameToGhcName` can
still be used in plugins that need it (see T20218b).
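For context, here is a sketch of the kind of plugin such a test exercises: `thNameToGhcName` only works when the plugin's unit is also linked with the target code, i.e. passed via `-package` rather than `-plugin-package`. The module below is illustrative, not the actual test.
```haskell
{-# LANGUAGE TemplateHaskellQuotes #-}
module MyPlugin (plugin) where

import Control.Monad.IO.Class (liftIO)
import GHC.Plugins

plugin :: Plugin
plugin = defaultPlugin
  { installCoreToDos = \_opts todos -> do
      -- Resolving a TH name only works if the plugin unit is linked
      -- with the target code.
      mbName <- thNameToGhcName 'id
      liftIO . putStrLn $ case mbName of
        Just _  -> "thNameToGhcName: resolved 'id"
        Nothing -> "thNameToGhcName: could not resolve 'id"
      pure todos
  }
```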
| |
Commit 98c7749 reverted commit 59d7ee53, including the test that
the latter added. That test case is still valuable, so I am re-adding it.
I add it with its current (broken) behavior so that whoever fixes it,
intentionally or accidentally, will notice and can then commit the actual
desired behavior (which is somewhat unspecified, see
https://gitlab.haskell.org/ghc/ghc/-/issues/20455#note_382030).
| |
We don't need built-in rules now that bignum literals (e.g. 123 :: Natural)
match with their constructors (e.g. NS 123##).
| |
Perform constant folding on bigNatCompare instead.
Some functions of the Enum class for Natural now need to be inlined
explicitly to be specialized at call sites (because `x > lim` for
Natural is inlined and the resulting function is a little too big to
inline). If we don't do this, T17499 runtime allocations regress by
16%.
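As a generic, hedged illustration of the technique (the real change is to the Enum methods for Natural, not to this hypothetical function): an explicit INLINE pragma on the overloaded wrapper keeps it specialisable at call sites even after its body grows past the inlining threshold.
```haskell
-- Illustrative only: inlining the wrapper exposes the local worker `go`
-- at each call site, where it can then specialise to the concrete type.
countUpTo :: (Ord a, Num a) => a -> a -> [a]
countUpTo x lim = go x
  where
    go i | i > lim   = []
         | otherwise = i : go (i + 1)
{-# INLINE countUpTo #-}
```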
| |
We now perform constant folding on bigNatEq# instead.
| |
The EpaDelta variant of EpaLocation cannot be sorted by location.
So, when creating an EpaDelta offset in ghc-exactprint, we capture any
comments that need to be printed between the prior output and this
location. We also make the EpaLocation fields strict.
| |
The space requirements of the non-moving gc are comparable to those of
the compacting gc, not the copying gc.
The copying gc has a much larger space overhead.
Fixes #20475
| |
This was supposed to refer to #20253.
| |
Technically we should probably generate this in the in-place build tree
as well, but I am not bothering to do so here as ghcii.sh will be
removed in 9.4 when WinIO becomes the default anyway (see #12720).
Fixes #19339.
| |
There were two problems around `mkDictErr`:
1. An outdated call to `flattenTys` meant that we missed out on some
instances. As we no longer flatten type-family applications,
the logic is obsolete and can be removed.
2. We reported "out of scope" errors in a poly-kinded situation
because `BoxedRep` and `Lifted` were considered out of scope.
We fix this by using `pretendNameIsInScope`.
fixes #20465
| |
Before, if you passed both options, you would generate two identical
hi/dyn_hi and o/dyn_o files, both built the dynamic way. It's better to
warn that this is happening than to duplicate the work and cause
potential confusion.
-dynamic-too should only be used with -static.
Fixes #20436
| |
For #16040 and #2387.
| |
This test fails on GHC 8.0.1, only when profiling is enabled,
with the error:
ghc: panic! (the 'impossible' happened)
kindPrimRep.go a_12
This was fixed by commit b460d6c9.
| |
There's no need for this `Maybe`, as it will always be instantiated to `Just`
in practice.
Fixes #20482.
| |
The parens EPAs were added in the tyvars where they belong, but also
at the top level of the declaration.
Closes #20452
| |
While the thread ids had been changed to 64-bit words in
e57b7cc6d8b1222e0939d19c265b51d2c3c2b4c0, the return type of the foreign
import function used to retrieve these ids - namely
'GHC.Conc.Sync.getThreadId' - was never updated accordingly.
To fix that, this function now returns a 'CULLong'.
In addition, the types used in the thread labeling subsystem were
adjusted as well, and several format strings were modified throughout the
whole RTS to display thread ids in a consistent and correct way.
Fixes #16761
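As a sketch, the changed import looks roughly like this (the real context is GHC.Conc.Sync in base; treat the exact pragmas and declaration as an assumption):
```haskell
{-# LANGUAGE MagicHash #-}
{-# LANGUAGE UnliftedFFITypes #-}
module GetThreadIdSketch where

import Foreign.C.Types (CULLong)
import GHC.Exts (ThreadId#)

-- Previously this returned a narrower C integer type, which truncated
-- the now 64-bit thread ids.
foreign import ccall unsafe "rts_getThreadId" getThreadId :: ThreadId# -> CULLong
```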
| |
It may not always be a Unicode encoding
| |
Ticket #20200 (the Agda failure) showed another case in which
lookupIdSubst would fail to find a local Id in the InScopeSet.
This time it was because SetLevels was given a program in which
the top-level bindings were not in dependency order.
The Simplifier (see Note [Glomming] in GHC.Core.Opt.OccurAnal) and
the specialiser (see Note [Top level scope] in GHC.Core.Opt.Specialise)
may both produce top-level bindings where an early binding refers
to a later one.
One solution would be to run the occurrence analyser again to
put them all in the right order. But a simpler one is to make
SetLevels OK with this input by bringing all top-level binders into
scope at the start. That's what this patch does.
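A purely illustrative, source-level picture of the situation (the real issue concerns Core after glomming and specialisation, not source Haskell): the top-level bindings can arrive in a textual order where an earlier binding mentions a later one, and SetLevels now tolerates that by scoping all top-level binders up front.
```haskell
module OutOfOrderTopLevel where

-- Textually, f comes first but refers to g, which is bound later; after
-- glomming, the Simplifier and specialiser can hand SetLevels top-level
-- Core bindings in exactly this kind of order.
f :: Int -> Int
f x = g x + 1

g :: Int -> Int
g y = y * 2
```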
| |
This (big) commit finishes porting the GHC.Tc.Deriv module to support
the new diagnostic infrastructure (#18516) by getting rid of the legacy
calls to `TcRnUnknownMessage`. This work ended up being quite pervasive
and touched not only the Tc.Deriv module but also the Tc.Deriv.Utils and
Tc.Deriv.Generics module, which needed to be adapted to use the new
infrastructure. This also required generalising `Validity`.
More specifically, this is a breakdown of the work done:
* Add and use the TcRnUselessTypeable data constructor
* Add and use TcRnDerivingDefaults data constructor
* Add and use the TcRnNonUnaryTypeclassConstraint data constructor
* Add and use TcRnPartialTypeSignatures
* Add T13324_compile2 test to test another part of the
TcRnPartialTypeSignatures diagnostic
* Add and use the TcRnCannotDeriveInstance data constructor: a new data
constructor of TcRnMessage which carries a `DeriveInstanceErrReason`
explaining why we couldn't derive a typeclass instance.
* Add DerivErrSafeHaskellGenericInst data constructor to DeriveInstanceErrReason
* Add DerivErrDerivingViaWrongKind and DerivErrNoEtaReduce
* Introduce the SuggestExtensionInOrderTo hint: a new constructor of the
hint type `LanguageExtensionHint` which can be used to give somewhat
"firmer" recommendations when it's obvious what the required extension
is, as in the case of `DerivingStrategies`, which automatically follows
from having enabled both `DeriveAnyClass` and
`GeneralizedNewtypeDeriving` (see the sketch after this list).
* Wildcard-free pattern matching in mk_eqn_stock: this removes `_` in
favour of pattern matching explicitly on `CanDeriveAnyClass` and
`NonDerivableClass`, because those determine whether or not we can
suggest `DeriveAnyClass` to the user.
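For reference, this is the kind of user program where the firmer recommendation kicks in (an illustrative example, not one of the new tests): with both extensions on, GHC must pick a strategy for the derived `C` instance and suggests enabling `DerivingStrategies` to make the choice explicit.
```haskell
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
module Ambiguous where

class C a where
  describe :: a -> String
  describe _ = "something"

newtype Age = Age Int
  deriving C   -- both strategies could apply; GHC defaults to one and
               -- recommends DerivingStrategies to disambiguate
```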
| |
This commit makes the `Validity` type polymorphic:
```
data Validity' a
  = IsValid    -- ^ Everything is fine
  | NotValid a -- ^ A problem, and some indication of why

-- | Monomorphic version of @Validity'@ specialised for 'SDoc's.
type Validity = Validity' SDoc
```
The type has been (provisionally) renamed to Validity' so as not to break
existing code, as the monomorphic `Validity` type is pervasive in a lot
of signatures in GHC.
Why have a polymorphic Validity? Because it carries the evidence of
"what went wrong", but the old type carried an `SDoc`, which clashed
with the new GHC diagnostic infrastructure (#18516). Making it
polymorphic means we can carry an arbitrary, richer diagnostic type,
and this is very important for things like the
`checkOriginativeSideConditions` function, which needs to report the
actual diagnostic error back to `GHC.Tc.Deriv`.
It also generalises Validity-related functions to be polymorphic in @a@.
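A small sketch of how the generalised type is meant to be used, relying on the definition shown above (the payload type and helper here are made up for illustration, not GHC code):
```haskell
-- Illustrative payload and check: the failure now carries a structured
-- diagnostic instead of a pre-rendered SDoc.
data DerivProblem = NotAClass | NotEnoughArgs Int   -- hypothetical payload

checkArity :: Int -> Int -> Validity' DerivProblem
checkArity expected actual
  | actual == expected = IsValid
  | otherwise          = NotValid (NotEnoughArgs (expected - actual))
```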
| |
This is a writeup of the state of play for better-than-linear `elem` via
a helper type class.
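Purely as an illustration of the idea being written up (this is not the writeup's actual class or names): a helper class lets containers with fast membership tests provide a better-than-linear `elem`.
```haskell
import qualified Data.Set as Set

-- Hypothetical helper class: Foldable's elem is linear in general, but a
-- structure such as Set can answer membership in O(log n).
class FastElem t where
  fastElem :: Ord a => a -> t a -> Bool

instance FastElem [] where
  fastElem = elem                -- no better than linear for lists

instance FastElem Set.Set where
  fastElem = Set.member          -- O(log n)
```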
| |
- Add link to laws from the class head
- Simplify wording of left/right associativity intro paragraph
- Avoid needless mention of "endomorphisms"
| |
We've had Sum CPR (#5075) for top-level bindings for a couple of years now.
That raises the question of why we didn't also activate it for local bindings;
the reasons for that are described in `Note [CPR for sum types]`. Only, that
reasoning didn't make sense! The Note said that Sum CPR would destroy
let-no-escapes, but that should be a non-issue since we have syntactic join
points in Core now and we don't WW for them
(`Note [Don't w/w join points for CPR]`).
So I simply activated CPR for all bindings of sum type, thus fixing #5075 and
#16570. NoFib approves:
```
--------------------------------------------------------------------------------
Program Allocs Instrs
--------------------------------------------------------------------------------
comp_lab_zift -0.0% +0.7%
fluid +1.7% +0.7%
reptile +0.1% +0.1%
--------------------------------------------------------------------------------
Min -0.0% -0.2%
Max +1.7% +0.7%
Geometric Mean +0.0% +0.0%
```
There were quite a few metric decreases on the order of 1-4%, but T6048 seems to
regress significantly, by 26.1%. WW'ing for a `Just` constructor and the nested
data type meant additional Simplifier iterations and a 30% increase in term
sizes as well as a 200-300% increase in type sizes due to unboxed 9-tuples. There's not
much we can do about it, I'm afraid: We're just doing much more work there.
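To make the change concrete, here is a tiny, purely illustrative example of the kind of local (non-join-point) binding whose sum-typed result can now be worker/wrappered:
```haskell
-- With CPR for sum types enabled on local bindings, the worker for `go`
-- can return its Maybe result unboxed (tag plus field), and the
-- Just/Nothing box is rebuilt, or cancelled away, at the use site.
firstBig :: Int -> [Int] -> Int
firstBig threshold xs =
  case go xs of
    Nothing -> 0
    Just x  -> x
  where
    go []     = Nothing
    go (y:ys)
      | y > threshold = Just y
      | otherwise     = go ys
```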
Metric Decrease:
T12425
T18698a
T18698b
T20049
T9020
WWRec
Metric Increase:
T6048
| |
The examples in the Note were inaccurate (`$s$dm` has arity 1 and that seems OK)
and the code didn't actually nuke the demand *signature* anyway. Specialise has
to nuke it, but it starts from a clean IdInfo in any case (in `newSpecIdM`).
So I just deleted the code. Fixes #20450.
| |
In #18824 we saw that the Simplifier didn't nuke the CPR signature of a join
point when it pushed a continuation into it, even though it really should have.
But join points are local, mostly non-exported bindings. We don't use their
CPR signatures anyway and would discard them at the end of the Core pipeline.
Their main purpose is to propagate CPR info during CPR analysis, and by the time
worker/wrapper runs the signature will have served its purpose. So we zap it!
Fixes #18824.
| |
We should reject "type family Foo where Bar = ()".
This check was done in kcTyFamInstEqn but not in tcTyFamInstEqn.
I factored out arity checking, which was duplicated.
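Concretely, this is the rejected program from the message: the equation head must be the family being defined, and the check now fires on the type-checking path as well as the kind-checking one.
```haskell
{-# LANGUAGE TypeFamilies #-}
module ShouldReject where

-- Rejected: the equation's head is Bar, not the family Foo being defined.
type family Foo where
  Bar = ()
```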
| |
By adding an early-abort flag in `TcSEnv`, we can fail fast in the presence of
insoluble constraints. This avoids a lot of work in valid hole-fits, and we get
a massive speed-up by not doing useless work solving constraints that
never come into play.
Additionally, we add a simple check for degenerate hole types, such as
when the type of the hole is an immutable type variable (as is the case
when the hole is completely unconstrained). Then the only valid fits are
the locals, so we can ignore the global candidates.
This fixes #16875
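For illustration, the degenerate case mentioned above looks like this: the hole's type is the immutable, completely unconstrained type variable `a`, so, as the message says, only the locals (here `x`) need to be considered as fits.
```haskell
module DegenerateHole where

-- The hole has type `a` with no constraints at all, so the global
-- candidates can be skipped when computing valid hole fits.
f :: a -> a
f x = _
```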
| |
The underlying bug was fixed by b8d98827, see MR !2477
| |
Not bumping the TcLevel meant that we could end up
trying to add evidence terms for the implication constraint
created to wrap failing kind equalities (to avoid their deferral).
fixes #20043
| |
The EpaAnnCO we were using contained an Anchor instead of EpaLocation,
making it harder to work with.
At the same time, using EpaLocation by itself isn't possible either,
as we may have tokens without location information.
Hence the new data type:
data TokenLocation = NoTokenLoc
| TokenLoc !EpaLocation
| |
Previously registration of ticky entry counters was racy, performing a
read-modify-write to add the new counter to the ticky_entry_ctrs list.
This could result in the list becoming cyclic if multiple threads
entered the same closure simultaneously.
Fixes #20451.
| |
This reduces the output from the testsuite to a more manageable level.
Fixes #20432
|