author      Austin Seipp <austin@well-typed.com>  2015-05-11 07:10:22 -0500
committer   Austin Seipp <austin@well-typed.com>  2015-05-11 07:14:20 -0500
commit      9736c042f4292b4fb94ca9faca6a010372a0f92f (patch)
tree        a5a5f09d98997eceadd6267d53338899761efd1f /rules/dependencies.mk
parent      cf7573b8207bbb17c58612f3345e0b17d74cfb58 (diff)
download    haskell-9736c042f4292b4fb94ca9faca6a010372a0f92f.tar.gz
compiler: make sure we reject -O + HscInterpreted
When using GHCi, we explicitly reject optimization, because the compiler's optimization passes can introduce unboxed tuples, which the interpreter is not able to handle. But this goes the other way too: using GHCi on optimized code may cause the optimizer to float out the breakpoints that the interpreter introduces. This manifests itself in weird ways, particularly if you, as an API client, use custom DynFlags to enable optimization in combination with HscInterpreted.

It turns out we weren't checking for consistent DynFlags settings when doing `setSessionDynFlags`, as #10052 showed. While the main driver handled it in `DynFlags` via `parseDynamicFlags`, we didn't check this elsewhere. This does a little refactoring to split out some of the common code, and immunizes the various `DynFlags` utilities in the `GHC` module against this particular bug. We should probably be checking other general invariants too.

This fixes #10052, and adds some notes about the behavior in `GHC` and `FloatOut`. As a bonus, expose `warningMsg` from `ErrUtils` as a helper, since it didn't exist (somehow).

Signed-off-by: Austin Seipp <austin@well-typed.com>
Reviewed By: edsko
Differential Revision: https://phabricator.haskell.org/D727
GHC Trac Issues: #10052
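The consistency check the commit describes can be sketched in standalone Haskell. This is a minimal model, not GHC's actual code: the `DynFlags` record, `HscTarget` type, and `checkOptInterpreted` function below are simplified stand-ins for GHC's much richer types, and the warning text is illustrative.

```haskell
-- Hypothetical sketch of a DynFlags consistency check: when the
-- backend is the interpreter, any optimization level is clamped
-- back to 0 and a warning is produced, instead of silently
-- running the optimizer on interpreted code.

data HscTarget = HscInterpreted | HscAsm
  deriving (Eq, Show)

data DynFlags = DynFlags
  { hscTarget :: HscTarget
  , optLevel  :: Int
  } deriving Show

-- Return consistent flags plus any warnings, in the spirit of the
-- checks run by the main driver and (after this patch) by the
-- session-setting utilities as well.
checkOptInterpreted :: DynFlags -> (DynFlags, [String])
checkOptInterpreted dflags
  | hscTarget dflags == HscInterpreted && optLevel dflags > 0 =
      ( dflags { optLevel = 0 }
      , ["warning: -O conflicts with the interpreter; -O ignored."] )
  | otherwise = (dflags, [])

main :: IO ()
main = do
  let (dflags', warns) = checkOptInterpreted (DynFlags HscInterpreted 2)
  mapM_ putStrLn warns
  print (optLevel dflags')
```

The point of routing every flag-setting path through one such function is that an API client calling `setSessionDynFlags` directly gets the same invariant enforcement as the command-line driver, rather than only the paths that happen to go through `parseDynamicFlags`.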
Diffstat (limited to 'rules/dependencies.mk')
0 files changed, 0 insertions, 0 deletions