| Commit message | Author | Age | Files | Lines |
|
Tracking ticket: #23056
MR: !10399
This adds the flag `-funoptimized-core-for-interpreter`, permitting use
of the `-O` flag to enable optimizations when compiling with the
interpreter backend, as in GHCi.
|
This commit adds support for computing an inputs hash for packages
compiled by hadrian. The result is that ABI-incompatible packages should
be given different hashes and therefore be distinct in a cabal store.
Hashing is enabled by the `--flag` and is off by default, as the hash
contains a hash of the source files. We enable it when we produce
release builds so that the artifacts we distribute have the right unit
ids.
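As an illustration only (the hashing itself lives in hadrian, and the
flag name is elided above), an inputs hash of this kind mixes the
ABI-relevant inputs, including the source files themselves:
```
import hashlib

# Illustrative sketch, not hadrian's actual code: hash source files and
# flags so ABI-incompatible builds get distinct unit ids.
def inputs_hash(source_files, flags):
    h = hashlib.sha256()
    for path in sorted(source_files):        # stable order
        with open(path, 'rb') as f:
            h.update(f.read())
    h.update(' '.join(flags).encode('utf-8'))
    return h.hexdigest()[:16]
```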
|
This patch includes all wasm32-specific testsuite fixes.
|
This patch adds the req_host_target_ghc predicate to the testsuite to
assert that the ghc compiler being tested can compile both host and
target code. When testing cross GHCs this is not supported yet, but it
may change in the future.
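A hedged sketch of how such a predicate is used in a .T file (the test
name and options are hypothetical):
```
# Skip unless the GHC under test can compile both host and target code,
# e.g. because the test builds a plugin that runs at compile time.
test('MyPluginTest', [req_host_target_ghc], compile, [''])
```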
|
This patch adds the req_process predicate to the testsuite to assert
that the platform has a process model, and marks tests that involve
spawning processes as req_process. Also bumps the hpc & process
submodules.
|
This patch adds the req_ghc_with_threaded_rts predicate to the
testsuite to assert that the platform has a threaded RTS, and marks
some tests as req_ghc_with_threaded_rts. Also makes ghc_with_threaded_rts
a config field instead of a global variable.
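A minimal sketch of what a config-field-based predicate looks like
(assumed shape, not verbatim driver code):
```
from types import SimpleNamespace

# Stand-in for the driver's global configuration object.
config = SimpleNamespace(ghc_with_threaded_rts=False)

def req_ghc_with_threaded_rts(name, opts):
    # Skip the test when the RTS under test is not threaded.
    if not config.ghc_with_threaded_rts:
        opts.skip = True
```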
|
When the testsuite driver copies files instead of symlinking them, it
should also copy the permission bits; otherwise there will be permission
denied errors. Also, enforce file copying when testing wasm32, since
wasmtime doesn't handle host symlinks very well
(https://github.com/bytecodealliance/wasmtime/issues/6227).
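A minimal sketch of the copying fix, assuming nothing about the
driver's actual helper names:
```
import shutil

def copy_with_permissions(src: str, dst: str) -> None:
    # copyfile copies only the bytes; copymode carries over the
    # permission bits (e.g. the executable bit), avoiding the
    # permission denied errors described above.
    shutil.copyfile(src, dst)
    shutil.copymode(src, dst)
```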
|
This patch implements logic to automatically exclude ghci ways when
there is no RTS linker. This is much better than having to annotate
individual test cases.
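A hedged sketch of the idea (way and field names assumed): filter the
ghci way out of the configured ways once, instead of annotating tests:
```
def exclude_ghci_ways(ways, have_rts_linker):
    # Drop interpreter-based ways wholesale when there is no RTS linker.
    return ways if have_rts_linker else [w for w in ways if w != 'ghci']

print(exclude_ghci_ways(['normal', 'ghci', 'optasm'], False))
# ['normal', 'optasm']
```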
|
This patch fixes hp2ps-related framework failures when testing the
wasm backend by including the target exe extension in heap profile
filenames.
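A one-line illustration of the fix (helper name hypothetical): the heap
profile is named after the produced executable, so the target's
extension must be part of the name:
```
def heap_profile_filename(program: str, target_exe_extension: str) -> str:
    # e.g. heap_profile_filename('Foo', '.wasm') == 'Foo.wasm.hp'
    return program + target_exe_extension + '.hp'
```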
|
This patch fixes cross prefix stripping in the testsuite driver. The
normalization logic used to handle only prefixes of the triple form
<arch>-<vendor>-<os>; it is now relaxed to allow any number of tokens
in the prefix tuple, so cross prefix stripping works when ghc is
configured with something like --target=wasm32-wasi.
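An illustrative pattern, not the driver's exact one: accept any number
of dash-separated tokens before the tool name when normalizing output:
```
import re

def strip_cross_prefix(s: str) -> str:
    # 'wasm32-wasi-ghc' and 'x86_64-unknown-linux-ghc' both
    # normalize to plain 'ghc'.
    return re.sub(r'\b(?:[A-Za-z0-9_]+-)+ghc\b', 'ghc', s)

assert strip_cross_prefix('wasm32-wasi-ghc --info') == 'ghc --info'
```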
|
Confusingly, GhcDebugged referred to GhcDebugAssertions.
|
This patch refactors the testsuite driver, getting rid of the
multi-threading logic for running test cases concurrently and using
asyncio & coroutines instead. This is not yak shaving for its own sake:
the previous multi-threading logic was prone to livelock/deadlock
conditions for some reason, even with the total number of threads
bounded to a thread pool's capacity.
The asyncify change is an internal implementation detail of the
testsuite driver and does not impact most GHC maintainers out there.
The patch does not touch the .T files; test cases can be
added/modified the exact same way as before.
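A minimal sketch of the new concurrency style (assumed, not verbatim
driver code): each test case is a coroutine, and a semaphore bounds how
many run at once:
```
import asyncio

async def run_test(sem: asyncio.Semaphore, name: str) -> None:
    async with sem:                       # bound concurrency
        proc = await asyncio.create_subprocess_shell(f'echo {name}: ok')
        await proc.wait()

async def main(tests):
    sem = asyncio.Semaphore(4)            # at most 4 tests at a time
    await asyncio.gather(*(run_test(sem, t) for t in tests))

asyncio.run(main(['T1', 'T2', 'T3']))
```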
|
This patch changes a thread-local variable to a context variable
instead, which works as intended when the testsuite transitions to
asyncio & coroutines instead of multi-threading to concurrently run
test cases. Note that this also raises the minimum Python version to
3.7.
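A small demonstration of why the change matters (illustrative only):
under asyncio every test runs on one thread, so a threading.local would
be shared, while a ContextVar stays per-task:
```
import asyncio
import contextvars

current_test = contextvars.ContextVar('current_test')

async def run(name: str) -> None:
    current_test.set(name)
    await asyncio.sleep(0)                # let other test coroutines run
    assert current_test.get() == name     # each task keeps its own value

async def main():
    await asyncio.gather(run('T1'), run('T2'))

asyncio.run(main())
```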
|
req_cmm is more informative than js_skip
|
The testsuite driver used to create one thread per test case, and
explicitly used semaphores and locks for rate limiting and
synchronization. This is a bad practice in any language, and
occasionally may result in livelock conditions (e.g. #22889). This
patch uses concurrent.futures.ThreadPoolExecutor for scheduling test
case runs, which is simpler and more robust.
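A minimal sketch of the approach (assumed, not the driver's exact code):
```
from concurrent.futures import ThreadPoolExecutor

def run_test(name: str) -> str:
    return f'{name}: ok'                  # stand-in for running one test

# The pool bounds concurrency by construction; no hand-rolled
# per-test threads, semaphores, or locks are needed.
with ThreadPoolExecutor(max_workers=4) as pool:
    for result in pool.map(run_test, ['T1', 'T2', 'T3']):
        print(result)
```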
|
This patch simplifies the testsuite driver by removing the use_threads
config field. It's just a degenerate case of threads=1.
|
Despite Cabal supporting any architecture name, `cabal check` only
supports a few built-in ones. Sadly `cabal check` is used by Hackage,
hence using any non-built-in name in a package (e.g. `arch(js)`) is
rejected and the package is prevented from being uploaded to Hackage.
Luckily built-in support for the `javascript` architecture was added for
GHCJS a while ago. In order to allow newer `base` to be uploaded to
Hackage we make the switch from the `js` to the `javascript` architecture.
Fixes #22740.
Co-authored-by: Ben Gamari <ben@smart-cactus.org>
|
See #22630 and !9552
This commit:
- splits req_smp into req_target_smp and req_ghc_smp
- changes the testsuite driver to calculate req_ghc_smp
- changes a handful of tests to use req_target_smp instead of req_smp
- changes a handful of tests to use req_ghc_smp when needed
The problem:
- this solves the ambiguity surrounding req_smp
- on master, req_smp was used to express the constraint that the program
being compiled supports smp _and_ that the host RTS (i.e., the RTS used
to compile the program) supports smp. Normally that is fine, but in
cross compilation this is not always the case, as was discovered in #22630.
The solution:
- Differentiate the two constraints:
  - use req_target_smp to say the RTS the compiled program is linked
with (and the platform) supports smp
  - use req_ghc_smp to say the RTS the compiling GHC is linked with
supports smp
Squashed WIP history:
- WIP: fix req_smp (target vs ghc)
- add flag to separate bootstrapper
- split req_smp -> req_target_smp and req_ghc_smp
- update tests smp flags
- cleanup and add some docstrings
- only set ghc_with_smp to bootstrapper on S1 or CC
- only set ghc_with_smp to bootstrapperWithSMP when testing stage 1
  and cross compiling
- test the RTS in config/ghc not hadrian
- re-add ghc_with_smp
- fix and align req names
- fix T11760 to use req_host_smp
- test the rts directly, avoid python 3.5 limitation
- test the compiler in a try block
- align out of tree and in tree withSMP flags
- mark failing tests as host req smp
- testsuite: req_host_smp --> req_ghc_smp
- fix ghc vs host, fix ghc_with_smp leftover
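A hedged sketch of the resulting usage in .T files (test names and
options hypothetical):
```
# The compiled program needs an SMP-capable RTS on the target:
test('NeedsTargetSmp', [req_target_smp], compile_and_run, ['-threaded'])

# The GHC doing the compiling needs an SMP-capable RTS itself:
test('NeedsGhcSmp', [req_ghc_smp], compile, [''])
```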
|
Unboxed sums might store an Int8# value as Int64#. This patch
makes sure we keep track of the actual value type.
See Note [Casting slot arguments] for the details.
|
Add JS backend adapted from the GHCJS project by Luite Stegeman.
Some features haven't been ported or implemented yet. Tests for these
features have been disabled with an associated gitlab ticket.
Bump array submodule
Work funded by IOG.
Co-authored-by: Jeffrey Young <jeffrey.young@iohk.io>
Co-authored-by: Luite Stegeman <stegeman@gmail.com>
Co-authored-by: Josh Meredith <joshmeredith2008@gmail.com>
|
There are two changes:
* If a pre_cmd fails, then don't attempt to run the test.
* If a pre_cmd fails, then print the stdout and stderr from running that
command (which hopefully has a nice error message).
For example:
```
=====> 1 of 1 [0, 0, 0]
*** framework failure for test-defaulting-plugin(normal) pre_cmd failed: 2
** pre_cmd was "$MAKE -s --no-print-directory -C defaulting-plugin package.test-defaulting-plugin TOP={top}".
stdout:
stderr:
DefaultLifted.hs:19:13: error: [GHC-76037]
Not in scope: type constructor or class ‘Typ’
Suggested fix:
Perhaps use one of these:
‘Type’ (imported from GHC.Tc.Utils.TcType),
data constructor ‘Type’ (imported from GHC.Plugins)
|
19 | instance Eq Typ where
| ^^^
make: *** [Makefile:17: package.test-defaulting-plugin] Error 1
Performance Metrics (test environment: local):
```
Fixes #22329
|
Necessary for newer cross-compiling backends (JS, Wasm) that don't
support TH yet.
|
And use it to avoid T21710a failing on non-tables-next-to-code (tntc)
archs.
Fixes #22169
|
Here we at long last remove the `make`-based build system, it having
been replaced with the Shake-based Hadrian build system. Users are
encouraged to refer to the documentation in `hadrian/doc` and this [1]
blog post for details on using Hadrian.
Closes #17527.
[1] https://www.haskell.org/ghc/blog/20220805-make-to-hadrian.html
|
This was leading to a bug where we would run a profasm test twice, which
led to invalid junit.xml, which meant the test results database was not
being populated for the fedora33-perf job.
|
Previously we would report framework failures of tests marked as fragile
as failures. Now we instead treat them as fragile test failures, which
are not fatal to the testsuite run. Noticed while investigating #21293.
|
This information about fragile tests is pretty useless, and it is
annoying on CI where you have to scroll up a long way to see the actual
issues.
|
This patch allows ghc and its dependencies to be built using a normal
invocation of cabal-install. Each component which relied on generated
files or additional configuration now has a Setup.hs file.
There are also various fixes to the cabal files to satisfy
cabal-install.
There is a new hadrian command which will build a stage2 compiler and
then a stage3 compiler by using cabal.
```
./hadrian/build build-cabal
```
There is also a new CI job which tests running this command.
For the 9.4 release we will upload all the dependent executables to
hackage and then end users will be free to build GHC and GHC executables
via cabal.
There are still some unresolved questions about how to ensure soundness
when loading plugins into a reinstalled GHC (#20742); these will be
tightened up in due course.
Fixes #19896
|
I made a mistake when implementing #21029 which meant that certain tests
didn't trigger a GHC recompilation. By adding the `test:ghc` target to
the default settings, all tests will now depend on this target unless
explicitly opting out via the no_deps modifier.
|
The main motivation for this patch is to allow tests to be added to the
testsuite which test things about the source tree without needing to
build GHC. In particular the notes linter can easily start failing and
by integrating it into the testsuite the process of observing these
changes is caught by normal validation procedures rather than having to
run the linter specially.
With this patch I can run
```
./hadrian/build test --flavour=devel2 --only="uniques"
```
in a clean tree, to run the checkUniques linter without having to build
GHC.
Fixes #21029
|
Multiple home units allows you to load different packages which may depend on
each other into one GHC session. This will allow both GHCi and HLS to support
multi component projects more naturally.
Public Interface
~~~~~~~~~~~~~~~~
In order to specify multiple units, the -unit @⟨filename⟩ flag
is given multiple times with a response file containing the arguments for each unit.
The response file contains a newline-separated list of arguments.
```
ghc -unit @unitLibCore -unit @unitLib
```
where the `unitLibCore` response file contains the normal arguments that cabal would pass to `--make` mode.
```
-this-unit-id lib-core-0.1.0.0
-i
-isrc
LibCore.Utils
LibCore.Types
```
The response file for lib can specify a dependency on lib-core, so that modules in lib can use modules from lib-core.
```
-this-unit-id lib-0.1.0.0
-package-id lib-core-0.1.0.0
-i
-isrc
Lib.Parse
Lib.Render
```
Then when the compiler starts in --make mode it will compile both units, lib and lib-core.
There is also very basic support for multiple home units in GHCi: at the
moment you can start a GHCi session with multiple units, but only
:reload is supported. Most commands in GHCi assume a single home unit,
so it will take additional work to figure out how to modify the
interface to support multiple loaded home units.
Options used when working with Multiple Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
There are a few extra flags which have been introduced specifically for
working with multiple home units. The flags allow a home unit to pretend
it’s more like an installed package, for example, specifying the package
name, module visibility and reexported modules.
-working-dir ⟨dir⟩
It is common to assume that a package is compiled in the directory
where its cabal file resides. Thus, all paths used in the compiler
are assumed to be relative to this directory. When there are
multiple home units the compiler is often not operating in the
standard directory, but instead where the cabal.project file is
located. In this case the -working-dir option can be passed, which
specifies the path from the current directory to the directory the
unit assumes to be its root, normally the directory which contains
the cabal file.
When the flag is passed, any relative paths used by the compiler are
offset by the working directory. Notably this includes -i and
-I⟨dir⟩ flags.
-this-package-name ⟨name⟩
This flag papers over the awkward interaction of the PackageImports
extension and multiple home units. When using PackageImports you can
specify the name of the package in an import to disambiguate between
modules which appear in multiple packages with the same name.
This flag allows a home unit to be given a package name so that you
can also disambiguate between multiple home units which provide
modules with the same name.
-hidden-module ⟨module name⟩
This flag can be supplied multiple times in order to specify which
modules in a home unit should not be visible outside of the unit it
belongs to.
The main use of this flag is to be able to recreate the difference
between an exposed and hidden module for installed packages.
-reexported-module ⟨module name⟩
This flag can be supplied multiple times in order to specify which
modules are not defined in a unit but should be reexported. The
effect is that other units will see this module as if it was defined
in this unit.
The use of this flag is to be able to replicate the reexported
modules feature of packages with multiple home units.
Offsetting Paths in Template Haskell splices
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When using Template Haskell to embed files into your program,
traditionally the paths have been interpreted relative to the directory
where the .cabal file resides. This causes problems for multiple home
units as we are compiling many different libraries at once which have
.cabal files in different directories.
For this purpose we have introduced a way to query the value of the
-working-dir flag to the Template Haskell API. By using this function we
can implement a makeRelativeToProject function which offsets a path
which is relative to the original project root by the value of
-working-dir.
```
import Language.Haskell.TH.Syntax ( makeRelativeToProject )
foo = $(makeRelativeToProject "./relative/path" >>= embedFile)
```
> If you write a relative path in a Template Haskell splice you should use the makeRelativeToProject function so that your library works correctly with multiple home units.
A similar function already exists in the file-embed library. The
version in template-haskell implements this in a more robust manner by
honouring the -working-dir flag rather than searching the file
system.
Closure Property for Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For tools or libraries using the API there is one very important closure
property which must be adhered to:
> Any dependency which is not a home unit must not (transitively) depend
on a home unit.
For example, if you have three packages p, q and r, then if p depends on
q which depends on r then it is illegal to load both p and r as home
units but not q, because q is a dependency of the home unit p which
depends on another home unit r.
If you are using GHC from the command line then this property is
checked, but if you are using the API then you need to check this
property yourself. If you get it wrong you will probably get some very
confusing errors about overlapping instances.
Limitations of Multiple Home Units
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
There are a few limitations of the initial implementation which will be smoothed out on user demand.
* Package thinning/renaming syntax is not supported
* More complicated reexports/renaming are not yet supported.
* It’s more common to run into existing linker bugs when loading a
large number of packages in a session (for example #20674, #20689)
* Backpack is not yet supported when using multiple home units.
* Dependency chasing can be quite slow with a large number of
modules and packages.
* Loading wired-in packages as home units is currently not supported
(this only really affects GHC developers attempting to load
template-haskell).
* Barely any normal GHCi features are supported; it would be good to
support enough for ghcid to work correctly.
Despite these limitations, the implementation already works for nearly
all packages. It has been tested on large dependency closures,
including the whole of head.hackage, which is a total of 4784 modules
from 452 packages.
Internal Changes
~~~~~~~~~~~~~~~~
* The biggest change is that the HomePackageTable is replaced with the
HomeUnitGraph. The HomeUnitGraph is a map from UnitId to HomeUnitEnv,
which contains information specific to each home unit.
* The HomeUnitEnv contains:
- A unit state, each home unit can have different package db flags
- A set of dynflags, each home unit can have different flags
- A HomePackageTable
* LinkNode: A new node type is added to the ModuleGraph; this is used to
place the linking step into the build plan so linking can proceed in
parallel with other packages being built.
* New invariant: Dependencies of a ModuleGraphNode can be completely
determined by looking at the value of the node. In order to achieve
this, downsweep now performs a more complete job of downsweeping and
then the dependencies are recorded forever in the node rather than
being computed again from the ModSummary.
* Some transitive module calculations are rewritten to use the
ModuleGraph which is more efficient.
* There is always an active home unit, which simplifies modifying a lot
of the existing API code which is unit agnostic (for example, in the
driver).
The road may be bumpy for a little while after this change but the
basics are well-tested.
One small metric increase, which we accept, and also a submodule update
to haddock which removes ExtendedModSummary.
Closes #10827
-------------------------
Metric Increase:
MultiLayerModules
-------------------------
Co-authored-by: Fendor <power.walross@gmail.com>
|
The reqlib modifier was supposed to indicate that a test needed a certain
library in order to work. If the library happened to be installed then
the test would run as normal.
However, CI has never run these tests, as the packages have not been
installed, and we don't want our tests to depend on things which might
get externally broken by updating the compiler.
The new strategy is to run these tests in head.hackage, where the tests
have been cabalised as well as possible. Some tests couldn't be
transferred into the normal style testsuite, but it's better than never
running any of the reqlib tests. https://gitlab.haskell.org/ghc/head.hackage/-/merge_requests/169
A few submodules also had reqlib tests and have been updated to remove
them.
Closes #16264 #20032 #17764 #16561
|
This patch ensures that the default -dcore-lint option is disabled
when collect_compiler_stats is used, but you can still pass
-dcore-lint as an additional option (see T1969, which tests core lint
performance).
Fixes #20830
-------------------------
Metric Decrease:
PmSeriesS
PmSeriesT
PmSeriesV
T10858
T11195
T11276
T11374
T11822
T14052
T14052Type
T17096
T17836
T17836b
T18478
T18698a
T18698b
-------------------------
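A hedged sketch of what this looks like in a .T file (test name and
threshold hypothetical): stats tests now run without -dcore-lint unless
it is passed back in explicitly, as T1969 does:
```
# Core Lint is off by default for stats tests; opt back in per test:
test('MyStatsTest',
     [collect_compiler_stats('bytes allocated', 10)],
     compile, ['-dcore-lint'])
```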
|
No-op assignments like R1 = R1 are not only wasteful; they can also
inhibit other optimizations, like inlining assignments that read from
R1.
We now check for assignments being a no-op before and after we
simplify the RHS in Cmm sink, which should eliminate most of these
no-ops.
|
(#20496)
|
Issues #19072, #17728, #20176
|
Previously it was unclear whether req_shared_libs should require:
* that the platform supports dynamic library loading,
* that GHC supports dynamic linking of Haskell code, or
* that the dyn way libraries were built
Clarify by splitting the predicate into two:
* `req_dynamic_lib_support` demands that the platform support dynamic
linking
* `req_dynamic_hs` demands that GHC support dynamic linking of
Haskell code on the target platform
Naturally `req_dynamic_hs` cannot be true unless
`req_dynamic_lib_support` is also true.
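A hedged sketch of the two new predicates in use in a .T file (test
names and options hypothetical):
```
# Needs only platform-level support for loading dynamic libraries:
test('LoadsForeignSo', [req_dynamic_lib_support], compile_and_run, [''])

# Needs GHC itself to be able to dynamically link Haskell code:
test('DynWayHaskell', [req_dynamic_hs], compile_and_run, ['-dynamic'])
```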
|
Previously this attempt at suppressing make's -s flag would mangle
otherwise valid arguments.