| Commit message (Collapse) | Author | Age | Files | Lines |
|
Fixes #20928
|
(Fixes #10616 and #10617)
Co-authored-by: Roland Senn <rsx@bluewin.ch>
|
They are likely broken for the same reason as FreeBSD where the tests
are already disabled.
|
The test currently times out waiting for the end of stdin in getContents.
The expected output indicates that no input should arrive for the test to
pass as written. It is unclear how the test was supposed to pass, but
this looks like a sufficient hack to make it work.
|
The diagnostics for the outofmem test on OpenBSD include the amount of
memory that the test failed to allocate. This seems like an irrelevant
detail that could change over time and isn't required to determine
whether the test passed.
Typical elided text is '(requested 2148532224 bytes)'.
|
ASSERT should be used in situations where something very bad will happen
later on if a certain invariant doesn't hold. The idea is that IF we
catch the assertion earlier, then it will be easier to work out what's
going on at that point rather than at some indeterminate point in the
future of the program.
The assertions in Stats.c do not obey this philosophy, and it is quite
annoying if you are running a debug build (or a ticky compiler) and one
of these assertions fails right at the end of your program, before the
ticky report is printed, so you don't get any profiling information.
Given that nothing terrible happens if these assertions are not true, or
at least that the terrible thing will happen in very close proximity to
the assertion failure, these assertions now use the new WARN macro, which
prints the assertion failure to stdout but does not exit the program.
Of course, it would be better to fix these metrics so that they do not
trigger the assertion in the first place, but if they do fail again in
the future it is frustrating to be bamboozled in this manner.
Fixes #20899
|
Part of #20889
|
The documentation states that the interactive flags should be used for
any interactive expressions. The interactive flags are used when
typechecking these expressions, but not when printing. The session flags
(modified by :set) are only used when loading a module.
Fixes #20909
|
This makes it more similar to pprTrace, pprPanic etc.
|
The `GHC.Tc.Plugin.newWanted` function takes a `CtLoc` as an argument,
but it used to discard the location information, keeping only
the `CtOrigin`. It would then retrieve the source location from the
`TcM` environment using `getCtLocM`.
This patch changes this so that `GHC.Tc.Plugin.newWanted` passes on
the full `CtLoc`. This means that authors of type-checking plugins
no longer need to manually set the `CtLoc` environment in the `TcM`
monad if they want to create a new Wanted constraint with the given
`CtLoc` (in particular, for setting the `SrcSpan` of an emitted
constraint). This makes the `newWanted` function consistent with
`newGiven`, which always used the full `CtLoc` instead of using
the environment.
Fixes #20895
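As a hedged illustration of the new behaviour (the module and function
names reflect the GHC 9.x plugin API as I understand it; treat the exact
imports as assumptions), a plugin can now emit a Wanted at the location
of an existing constraint simply by passing along its `CtLoc`:
```haskell
import GHC.Tc.Plugin (TcPluginM, newWanted)
import GHC.Tc.Types.Constraint (Ct, CtEvidence, ctLoc)
import GHC.Core.Type (PredType)

-- Create a new Wanted at the same CtLoc (and hence SrcSpan) as an
-- existing constraint; no fiddling with the TcM environment is needed.
newWantedAt :: Ct -> PredType -> TcPluginM CtEvidence
newWantedAt ct pty = newWanted (ctLoc ct) pty
```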
|
The pretty-printing of partially applied unboxed sums was incorrect:
we dropped the first half of the arguments, even for a partial
application such as
  (# | #) @IntRep @DoubleRep Int#
which led to the nonsensical (# DoubleRep | Int# #).
This patch also allows users to write unboxed sum type constructors
such as
  (# | #) :: TYPE r1 -> TYPE r2 -> TYPE (SumRep '[r1,r2]).
Fixes #20858 and #20859.
|
Uses of a TyCon in a kind signature required users to enable
DataKinds, which didn't make much sense, e.g. in
  type U = Type
  data MyMaybe (a :: U) = MyNothing | MyJust a
Now the DataKinds error is restricted to data constructors;
the use of kind-level type constructors is instead gated behind
-XKindSignatures.
This patch also adds a convenience pattern synonym for matching
on either a TyCon or a TcTyCon stored in a TcTyThing, used in
tcTyVar and tc_infer_id.
Fixes #20873
|
The bug the test guards against didn't happen on this OS (there is no
RLIMIT_AS) and the regression test doesn't work there (ulimit: -v:
unknown option).
|
Previously we would unconditionally provide a declaration for `environ`,
even if `<unistd.h>` already provided one. This would result in
`-Werror` builds failing on some platforms.
Also `#include <unistd.h>` to ensure that the declaration is visible.
Fixes #20861.
|
As noted in #19029, currently `ghc-prim` explicitly lists `libc` in
`extra-libraries`, resulting in incorrect link ordering with the
`extra-libraries: pthread` in `libHSrts`. Fix this by adding an explicit
dependency on `libc` to `libHSrts`.
Closes #19029.
|
This fixes serious skew in the performance numbers, because the packages
were built with core-lint.
Fixes #20826
|
We use `git ls-files` to get the list of files to include in the source distribution.
Also implements the `-testsuite` and `-extra-tarballs` distributions.
|
This patch adds the ability to fetch and store dependencies needed for
bootstrapping hadrian. By default the script will download the
dependencies from the network, but some package managers disallow network
access, so there are also options to build from a supplied tarball.
The -s option allows you to provide the tarball:
```
bootstrap.py -d plan-bootstrap-8.10.5.json -w /path/to-ghc -s sources-tarball.tar.gz
```
Which dependencies you need can be queried using the `list-sources` option:
```
bootstrap.py list-sources -d plan-bootstrap-8.10.5.json
```
This produces `fetch_plan.json`, which tells you where to get each source from.
You can instruct the script to create the tarball using the `fetch` option:
```
bootstrap.py fetch -d plan-bootstrap-8.10.5.json -o sources-tarball.tar.gz
```
Together these commands mean you can build GHC without needing
cabal-install.
Fixes #17103
|
These scripts are originally from the cabal-install repo with a few
small tweaks.
This utility allows you to build hadrian without cabal-install, which can
be useful for packagers. If you are a developer then build hadrian using
cabal-install.
If you want to bootstrap with ghc-8.10.5 then run the ./bootstrap script
with the `plan-bootstrap-8.10.5.json` file:
```
bootstrap.py -d plan-bootstrap-8.10.5.json -w /path/to-ghc
```
The result of the bootstrap script will be a hadrian binary in
`_build/bin/hadrian`.
There is a script (using nix) which can be used to generate the bootstrap
plans for the range of supported GHC versions:
```
generate_bootstrap_plans
```
Otherwise you can run the commands in ./generate_bootstrap_plans directly.
Fixes #17103
|
That note was removed in 4196969c53c55191e644d9eb258c14c2bc8467da.
|
We noticed that the structure of CoreUnfolding could retain double the
amount of CoreExprs in the situation where the template, but not all of
the predicates, had been forced. This observation was then confirmed
using ghc-debug:
```
(["ghc:GHC.Core:App","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 237)
(["ghc:GHC.Core:App","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","ghc-prim:GHC.Types:True"],Count 1)
(["ghc:GHC.Core:Case","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 12)
(["ghc:GHC.Core:Cast","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","BLACKHOLE"],Count 1)
(["ghc:GHC.Core:Cast","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 78)
(["ghc:GHC.Core:Cast","ghc-prim:GHC.Types:True","THUNK_1_0","ghc-prim:GHC.Types:False","THUNK_1_0"],Count 1)
(["ghc:GHC.Core:Cast","ghc-prim:GHC.Types:True","ghc-prim:GHC.Types:False","THUNK_1_0","THUNK_1_0"],Count 3)
(["ghc:GHC.Core:Cast","ghc-prim:GHC.Types:True","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0"],Count 1)
(["ghc:GHC.Core:Lam","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","BLACKHOLE"],Count 31)
(["ghc:GHC.Core:Lam","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 4307)
(["ghc:GHC.Core:Lam","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","ghc-prim:GHC.Types:True"],Count 6)
(["ghc:GHC.Core:Let","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 29)
(["ghc:GHC.Core:Lit","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","ghc-prim:GHC.Types:True"],Count 1)
(["ghc:GHC.Core:Tick","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 36)
(["ghc:GHC.Core:Var","ghc-prim:GHC.Types:True","THUNK_1_0","THUNK_1_0","THUNK_1_0"],Count 1)
(["ghc:GHC.Core:Var","ghc-prim:GHC.Types:True","ghc-prim:GHC.Types:False","THUNK_1_0","THUNK_1_0"],Count 6)
(["ghc:GHC.Core:Var","ghc-prim:GHC.Types:True","ghc-prim:GHC.Types:False","ghc-prim:GHC.Types:True","THUNK_1_0"],Count 2)
```
Here we can see that the first argument is forced, but there are still
thunks remaining which retain the old expr.
For my test case (a very big module, with a peak of 3,000,000 core terms)
this reduced peak memory usage by 1G (12G -> 11G).
Fixes #20905
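A minimal sketch of the leak shape (field names invented here, not GHC's
actual CoreUnfolding definition): if the cached predicate fields stay
lazy, each unforced thunk keeps the pre-simplification expression alive
even after the template has been forced, whereas strict fields force the
predicates at construction so nothing retains the old expression.
```haskell
data Expr = Expr  -- stand-in for a Core expression

-- Leaky shape: forcing only 'template' leaves 'isValue' and
-- 'isWorkFree' as thunks that still mention the original expression.
data UnfoldingLeaky = UnfoldingLeaky
  { template   :: Expr
  , isValue    :: Bool
  , isWorkFree :: Bool
  }

-- Fixed shape: strict fields are forced when the unfolding is built,
-- so the old expression is no longer retained.
data UnfoldingStrict = UnfoldingStrict
  { template'   :: Expr
  , isValue'    :: !Bool
  , isWorkFree' :: !Bool
  }
```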
|
The test only wants 1000 descriptors, so changing the limit to double
that *in the context of just this test* makes no sense.
This is a manual revert of 8f7194fae23bdc6db72fc5784933f50310ce51f9.
The justification given in the description doesn't instill confidence.
As of HEAD, the test fails on OpenBSD where ulimit -n is hard-limited
to 1024. The test suite attempts to change it to 2048, which
fails. The test proceeds with the unchanged default of 512 and
naturally the test program fails due to the low ulimit. The fixed test
now passes.
|
Use primOpId instead of mkPrimOpId in a few places to benefit from
Id caching.
I had to mess a little bit with the module hierarchy to fix cycles and
to avoid adding too many new dependencies to count-deps tests.
|
SmallArray doesn't perform bounds checks (faster).
Make primop tags start at 0 to avoid index arithmetic.
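A toy illustration of the scheme (using Data.Primitive.SmallArray from
the primitive package; the names here are invented, not GHC's actual
code): with tags starting at 0, the tag itself is the cache index, and
indexSmallArray does no bounds check.
```haskell
import Data.Primitive.SmallArray (SmallArray, smallArrayFromList, indexSmallArray)

data PrimOp = IntAddOp | IntSubOp | IntMulOp deriving (Enum, Show)

-- One cached value per primop, built once and shared.
primOpNames :: SmallArray String
primOpNames = smallArrayFromList ["addInt#", "subInt#", "mulInt#"]

-- Tag 0 maps straight to index 0: no offset arithmetic.
primOpName :: PrimOp -> String
primOpName op = indexSmallArray primOpNames (fromEnum op)
```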
|
When quoting (using a TH single or double quote) a built-in
name such as the list constructor (:), we didn't always check
that the resulting 'Name' was in the correct namespace.
This patch adds a check in GHC.Rename.Splice to ensure
we get a Name that is in the term-level/type-level namespace,
when using a single/double tick, respectively.
Fixes #20884.
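For illustration (plain Template Haskell, not code from the patch):
a single tick must produce a term-namespace Name and a double tick a
type-namespace Name, which is the invariant the new check enforces for
built-in names like (:).
```haskell
{-# LANGUAGE TemplateHaskell #-}
import Language.Haskell.TH (Name)

consData :: Name
consData = '(:)    -- single tick: the data constructor (:)

listType :: Name
listType = ''[]    -- double tick: the list type constructor []
```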
|
Issue #18045 was fixed by !6971.
|
The test now passes on OpenBSD instead of generating broken source
which was rejected by GHC with:
ManyAlternatives.hs:5:1: error:
The type signature for ‘f’ lacks an accompanying binding
|
The user's guide failed to explicitly mention that GADTSyntax
can be used to declare newtypes, so we add an example and a couple
of explanations.
The patch also explains that `-XGADTs` generalises `-XExistentialQuantification`.
Fixes #20848 and #20865.
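For example, a GADT-syntax newtype of the kind the guide now illustrates
(the names here are invented): it must still have exactly one constructor
with exactly one field.
```haskell
{-# LANGUAGE GADTSyntax #-}
newtype Wrap a where
  MkWrap :: a -> Wrap a
```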
|
Function `lookupIPE` does not modify its argument, so reflect this in its
type. Module `CloneStack.c` relies on this for RTS builds without
tables-next-to-code.
Fixes #20879
|
Closes #20874
|
This yields a small, but measurable, performance improvement.
|
Multiple home units allows you to load different packages which may depend on
each other into one GHC session. This will allow both GHCi and HLS to support
multi-component projects more naturally.
Public Interface
~~~~~~~~~~~~~~~~
In order to specify multiple units, the -unit @⟨filename⟩ flag
is given multiple times with a response file containing the arguments for each unit.
The response file contains a newline separated list of arguments.
```
ghc -unit @unitLibCore -unit @unitLib
```
where the `unitLibCore` response file contains the normal arguments that cabal would pass to `--make` mode.
```
-this-unit-id lib-core-0.1.0.0
-i
-isrc
LibCore.Utils
LibCore.Types
```
The response file for lib can specify a dependency on lib-core, so that modules in lib can use modules from lib-core.
```
-this-unit-id lib-0.1.0.0
-package-id lib-core-0.1.0.0
-i
-isrc
Lib.Parse
Lib.Render
```
Then when the compiler starts in --make mode it will compile both units lib and lib-core.
There is also very basic support for multiple home units in GHCi: at the
moment you can start a GHCi session with multiple units, but only the
:reload command is supported. Most commands in GHCi assume a single home
unit, and so it is additional work to work out how to modify the
interface to support multiple loaded home units.
Options used when working with Multiple Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
There are a few extra flags which have been introduced specifically for
working with multiple home units. The flags allow a home unit to pretend
it’s more like an installed package, for example, specifying the package
name, module visibility and reexported modules.
-working-dir ⟨dir⟩
It is common to assume that a package is compiled in the directory
where its cabal file resides, and thus all paths used in the compiler
are assumed to be relative to this directory. When there are
multiple home units the compiler is often not operating in this
standard directory, but instead in the directory where the
cabal.project file is located. In this case the -working-dir option
can be passed, which specifies the path from the current directory to
the directory the unit assumes to be its root, normally the directory
which contains the cabal file.
When the flag is passed, any relative paths used by the compiler are
offset by the working directory. Notably this includes -i and
-I⟨dir⟩ flags.
-this-package-name ⟨name⟩
This flag papers over the awkward interaction between PackageImports
and multiple home units. When using PackageImports you can specify
the name of the package in an import to disambiguate between modules
which appear in multiple packages with the same name.
This flag allows a home unit to be given a package name so that you
can also disambiguate between multiple home units which provide
modules with the same name.
-hidden-module ⟨module name⟩
This flag can be supplied multiple times in order to specify which
modules in a home unit should not be visible outside of the unit it
belongs to.
The main use of this flag is to be able to recreate the difference
between an exposed and hidden module for installed packages.
-reexported-module ⟨module name⟩
This flag can be supplied multiple times in order to specify which
modules are not defined in a unit but should be reexported. The
effect is that other units will see this module as if it was defined
in this unit.
The use of this flag is to be able to replicate the reexported
modules feature of packages with multiple home units.
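Putting these flags together, a unit's response file might look as
follows (an illustrative sketch only; the unit name, paths, and module
names are invented):
```
-this-unit-id lib-0.1.0.0
-this-package-name lib
-working-dir libs/lib
-hidden-module Lib.Internal
-reexported-module LibCore.Types
-i
-isrc
Lib.Parse
Lib.Render
```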
Offsetting Paths in Template Haskell splices
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When using Template Haskell to embed files into your program,
traditionally the paths have been interpreted relative to the directory
where the .cabal file resides. This causes problems for multiple home
units as we are compiling many different libraries at once which have
.cabal files in different directories.
For this purpose we have introduced a way to query the value of the
-working-dir flag to the Template Haskell API. By using this function we
can implement a makeRelativeToProject function which offsets a path
which is relative to the original project root by the value of
-working-dir.
```
import Language.Haskell.TH.Syntax ( makeRelativeToProject )
foo = $(makeRelativeToProject "./relative/path" >>= embedFile)
```
> If you write a relative path in a Template Haskell splice you should use the makeRelativeToProject function so that your library works correctly with multiple home units.
A similar function already exists in the file-embed library. The
template-haskell version is more robust, as it honours the -working-dir
flag rather than searching the file system.
Closure Property for Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For tools or libraries using the API there is one very important closure
property which must be adhered to:
> Any dependency which is not a home unit must not (transitively) depend
on a home unit.
For example, if you have three packages p, q and r, then if p depends on
q which depends on r then it is illegal to load both p and r as home
units but not q, because q is a dependency of the home unit p which
depends on another home unit r.
If you are using GHC from the command line then this property is checked,
but if you are using the API then you need to check this property
yourself. If you get it wrong you will probably get some very confusing
errors about overlapping instances.
Limitations of Multiple Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
There are a few limitations of the initial implementation which will be smoothed out on user demand.
* Package thinning/renaming syntax is not supported
* More complicated reexports/renaming are not yet supported.
* It’s more common to run into existing linker bugs when loading a
large number of packages in a session (for example #20674, #20689)
* Backpack is not yet supported when using multiple home units.
* Dependency chasing can be quite slow with a large number of
modules and packages.
* Loading wired-in packages as home units is currently not supported
(this only really affects GHC developers attempting to load
template-haskell).
* Barely any normal GHCi features are supported; it would be good to
support enough for ghcid to work correctly.
Despite these limitations, the implementation already works for nearly
all packages. It has been tested on large dependency closures,
including the whole of head.hackage, which is a total of 4784 modules
from 452 packages.
Internal Changes
~~~~~~~~~~~~~~~~
* The biggest change is that the HomePackageTable is replaced with the
HomeUnitGraph. The HomeUnitGraph is a map from UnitId to HomeUnitEnv,
which contains information specific to each home unit.
* The HomeUnitEnv contains (sketched in code after this list):
- A unit state, each home unit can have different package db flags
- A set of dynflags, each home unit can have different flags
- A HomePackageTable
* LinkNode: A new node type is added to the ModuleGraph; this is used to
  place the linking step into the build plan so linking can proceed in
  parallel with other packages being built.
* New invariant: Dependencies of a ModuleGraphNode can be completely
determined by looking at the value of the node. In order to achieve
this, downsweep now performs a more complete job of downsweeping and
then the dependencies are recorded forever in the node rather than
being computed again from the ModSummary.
* Some transitive module calculations are rewritten to use the
ModuleGraph which is more efficient.
* There is always an active home unit, which simplifies modifying a lot
of the existing API code which is unit agnostic (for example, in the
driver).
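A hedged sketch of the HomeUnitGraph shape described above (the field
names and type stand-ins are illustrative, not GHC's exact definitions):
```haskell
import Data.Map (Map)

-- Stand-ins for the GHC types named in the text (sketch only):
data UnitId
data UnitState
data DynFlags
data HomePackageTable

-- The HomeUnitGraph maps each home unit to its own environment.
type HomeUnitGraph = Map UnitId HomeUnitEnv

data HomeUnitEnv = HomeUnitEnv
  { homeUnitState :: UnitState        -- per-unit unit state / package dbs
  , homeUnitFlags :: DynFlags         -- per-unit DynFlags
  , homeUnitHPT   :: HomePackageTable -- the unit's HomePackageTable
  }
```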
The road may be bumpy for a little while after this change but the
basics are well-tested.
One small metric increase, which we accept, and also a submodule update
to haddock which removes ExtendedModSummary.
Closes #10827
-------------------------
Metric Increase:
MultiLayerModules
-------------------------
Co-authored-by: Fendor <power.walross@gmail.com>
|
This completes the fix for #20779 / !7123.
Beforehand, the program worked by accident because the two versions of
the library happened to be ordered properly (due to how the hashes were
computed). In the real world I observed them being the other way around
which meant the final lookup failed because we weren't filtering for
visibility.
I modified the test so that it failed (and it's fixed by this patch).
|
Fixes #20854
|
See ticket #20852
|
If you specify PERF_BASELINE_COMMIT then this can fail if the specified
commit didn't have perf test metrics. (This can happen in CI, for
example, if a build fails on master.)
Therefore, instead of just reporting all tests as new, we now search
downwards from this point to try to find a good commit to report
numbers from.