\documentclass{article}

\usepackage{supertabular}
\usepackage{amsmath}
\usepackage{amssymb}
\usepackage{stmaryrd}
\usepackage{xcolor}
\usepackage{fullpage}
\usepackage{multirow}
\usepackage{url}

\newcommand{\ghcfile}[1]{\textsl{#1}}
\newcommand{\arraylabel}[1]{\multicolumn{2}{l}{\!\!\!\!\!\!\!\!\!\text{\underline{#1}:}}}

\input{CoreOtt}

% increase spacing between rules for ease of reading:
\renewcommand{\ottusedrule}[1]{\[#1\]\\[1ex]}

\setlength{\parindent}{0in}
\setlength{\parskip}{1ex}

\newcommand{\gram}[1]{\ottgrammartabular{#1\ottafterlastrule}}

\begin{document}

\begin{center}
\LARGE
System FC, as implemented in GHC\footnote{This document was originally prepared by Richard Eisenberg (\texttt{eir@cis.upenn.edu}), but it should be maintained by anyone who edits the functions or data structures mentioned in this file. Please feel free to contact Richard for more information.}\\
\Large 23 April 2015
\end{center}

\section{Introduction}

This document presents the typing system of System FC, staying very close to how it is implemented in GHC. Care is taken to include only those checks that are actually written in the GHC code. It should be maintained along with any changes to this type system.

Who will use this? Any implementer of GHC who wants to understand more about the type system can look here to see the relationships among constructors and the different types used in the implementation of the type system. Note that the type system here is quite different from that of Haskell---these are the details of the internal language, only.

At the end of this document is a \emph{hypothetical} operational semantics for GHC. It is hypothetical because GHC does not strictly implement a concrete operational semantics anywhere in its code. While all the typing rules can be traced back to lines of real code, the operational semantics do not, in general, have as clear a provenance.

There are a number of details elided from this presentation. The goal of the formalism is to aid in reasoning about type safety, and checks that do not work toward this goal are omitted. For example, various scoping checks (other than basic context inclusion) appear in the GHC code but not here.

\section{Grammar}

\subsection{Metavariables}

We will use the following metavariables:

\ottmetavars{}\\

\subsection{Literals}

Literals do not play a major role, so we leave them abstract:

\gram{\ottlit}

We also leave abstract the function \coderef{basicTypes/Literal.lhs}{literalType} and the judgment \coderef{coreSyn/CoreLint.lhs}{lintTyLit} (written $[[G |-tylit lit : k]]$).

\subsection{Variables}
\enlargethispage{10pt} % without this first line of "z" definition is placed on
                       % second page and it becomes the only line of text on that
                       % page, resulting in whole page being empty.

GHC uses the same datatype to represent term-level variables and type-level variables:

\gram{ \ottz }

\gram{ \ottn }

\subsection{Expressions}

The datatype that represents expressions:

\gram{\otte}

There are a few key invariants about expressions:
\begin{itemize}
\item The right-hand sides of all top-level and recursive $[[let]]$s must be of lifted type.
\item The right-hand side of a non-recursive $[[let]]$ and the argument of an application may be of unlifted type, but only if the expression is ok-for-speculation. See \verb|#let_app_invariant#| in \ghcfile{coreSyn/CoreSyn.lhs}.
\item We allow a non-recursive $[[let]]$ to bind a type variable.
\item The $[[_]]$ case for a $[[case]]$ must come first.
\item The list of case alternatives must be exhaustive.
\item Types and coercions can only appear on the right-hand side of an application.
\end{itemize}

Bindings for $[[let]]$ statements:

\gram{\ottbinding}

Case alternatives:

\gram{\ottalt}

Constructors as used in patterns:

\gram{\ottKp}

Notes can be inserted into the AST; we leave these abstract:

\gram{\otttick}

A program is just a list of bindings:

\gram{\ottprogram}

\subsection{Types}

\gram{\ottt}

There are some invariants on types:
\begin{itemize}
\item The type $[[t1]]$ in the form $[[t1 t2]]$ must not be a type constructor $[[T]]$. It should be another application or a type variable.
\item The form $[[T ]]$ (\texttt{TyConApp}) does \emph{not} need to be saturated.
\item A saturated application of $[[(->) t1 t2]]$ should be represented as $[[t1 -> t2]]$. This is a different point in the grammar, not just pretty-printing. The constructor for a saturated $[[(->)]]$ is \texttt{FunTy}.
\item A type-level literal is represented in GHC with a different datatype than a term-level literal, but we are ignoring this distinction here.
\end{itemize}

\subsection{Coercions}

\gram{\ottg}

Invariants on coercions:
\begin{itemize}
\item $[[_R]]$ is used; never $[[_R _Nom]]$.
\item If $[[_R]]$ is applied to some coercions, at least one of which is not reflexive, use $[[T_R ]]$, never $[[_R g1 g2]] \ldots$.
\item The $[[T]]$ in $[[T_R ]]$ is never a type synonym, though it could be a type function.
\end{itemize}

Roles label what equality relation a coercion is a witness of. Nominal equality means that two types are identical (have the same name); representational equality means that two types have the same representation (introduced by newtypes); and phantom equality includes all types. See \url{http://ghc.haskell.org/trac/ghc/wiki/Roles} for more background.

\gram{\ottR}

Is it a left projection or a right projection?

\gram{\ottLorR}

Axioms:

\gram{ \ottC\ottinterrule \ottaxBranch }

The definition for $[[axBranch]]$ above does not include the list of incompatible branches (field \texttt{cab\_incomps} of \texttt{CoAxBranch}), as that would unduly clutter this presentation. Instead, as the list of incompatible branches can be computed at any time, it is checked for in the judgment $[[no_conflict]]$. See Section~\ref{sec:no_conflict}.

Axiom rules, produced by the type-nats solver:

\gram{\ottmu}

\label{sec:axiom_rules}
An axiom rule $[[mu]] = [[M(I, role_list, R')]]$ is an axiom name $[[M]]$, with a type arity $[[I]]$, a list of roles $[[role_list]]$ for its coercion parameters, and an output role $[[R']]$. The definition within GHC also includes a field named $[[coaxrProves]]$ which computes the output coercion from a list of types and a list of coercions. This is elided in this presentation, as we simply identify axiom rules by their names $[[M]]$. See also \coderef{typecheck/TcTypeNats.lhs}{mkBinAxiom} and \coderef{typecheck/TcTypeNats.lhs}{mkAxiom1}.

In \ottdrulename{Co\_UnivCo}, the function $ \textsf{compatibleUnBoxedTys} $ stands for the following checks:
\begin{itemize}
\item both types are unboxed;
\item the types should have the same size;
\item both types should be either integral or floating;
\item coercions between vector types are not allowed;
\item unboxed tuples should have the same length, and each element should be coercible to the corresponding element of the target tuple.
\end{itemize}
For the function implementation, see \coderef{coreSyn/CoreLint.lhs}{checkTypes}. For further discussion, see \url{https://ghc.haskell.org/trac/ghc/wiki/BadUnsafeCoercions}.
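To make the shape of these checks concrete, here is a minimal, self-contained sketch in Haskell. It is \emph{not} GHC's \coderef{coreSyn/CoreLint.lhs}{checkTypes}; the datatype \texttt{UTy} and the function \texttt{compatible} are inventions of this presentation, standing in for the real unboxed types being compared and the real checks.

\begin{verbatim}
-- A toy model of the compatibleUnBoxedTys checks; every UTy is unboxed
-- by construction, so the "both types are unboxed" check is implicit.
data UTy
  = IntRep Int      -- an integral unboxed type, with its size in bytes
  | FloatRep Int    -- a floating-point unboxed type, with its size in bytes
  | VecRep          -- a vector type
  | TupleRep [UTy]  -- an unboxed tuple

compatible :: UTy -> UTy -> Bool
compatible VecRep _ = False                 -- coercions between vector types
compatible _ VecRep = False                 --   are not allowed
compatible (TupleRep ts1) (TupleRep ts2) =  -- tuples: same length, and each
  length ts1 == length ts2                  --   element coercible to the
    && and (zipWith compatible ts1 ts2)     --   corresponding target element
compatible (IntRep s1)   (IntRep s2)   = s1 == s2  -- both integral, same size
compatible (FloatRep s1) (FloatRep s2) = s1 == s2  -- both floating, same size
compatible _ _ = False                      -- integral/floating or scalar/tuple mix
\end{verbatim}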
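The role distinction described earlier in this section can also be observed from source Haskell. The following small example (the names \texttt{Age} and \texttt{agesFromInts} are ours, not part of the formalism) relies on the fact that a newtype declaration introduces a representational coercion, which the \texttt{Coercible} solver lifts through type constructors whose parameters have representational role, such as lists:

\begin{verbatim}
import Data.Coerce (coerce)

newtype Age = MkAge Int

-- The newtype axiom gives a representational coercion Age ~R Int, so
-- coerce converts [Int] to [Age] with no runtime cost.
agesFromInts :: [Int] -> [Age]
agesFromInts = coerce

main :: IO ()
main = print (length (agesFromInts [1, 2, 3]))
\end{verbatim}

Nominal equality, by contrast, distinguishes \texttt{Age} from \texttt{Int}, while a phantom parameter (one that does not occur in the representation of its datatype) relates any two instantiations.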
\subsection{Type constructors}

Type constructors in GHC contain \emph{lots} of information. We leave most of it out for this formalism:

\gram{\ottT}

We include some representative primitive type constructors. There are many more in \ghcfile{prelude/TysPrim.lhs}.

\gram{\ottH}

\section{Contexts}

The functions in \ghcfile{coreSyn/CoreLint.lhs} use the \texttt{LintM} monad. This monad contains a context with a set of bound variables $[[G]]$. The formalism treats $[[G]]$ as an ordered list, but GHC uses a set as its representation.

\gram{ \ottG }

We assume the Barendregt variable convention that all new variables are fresh in the context. In the implementation, of course, some work is done to guarantee this freshness. In particular, adding a new type variable to the context sometimes requires creating a new, fresh variable name and then applying a substitution. We elide these details in this formalism, but see \coderef{types/Type.lhs}{substTyVarBndr} for details.

\section{Typing judgments}

The following functions are used from GHC. Their names are descriptive, and they are not formalized here: \coderef{types/TyCon.lhs}{tyConKind}, \coderef{types/TyCon.lhs}{tyConArity}, \coderef{basicTypes/DataCon.lhs}{dataConTyCon}, \coderef{types/TyCon.lhs}{isNewTyCon}, \coderef{basicTypes/DataCon.lhs}{dataConRepType}.

\subsection{Program consistency}

The entire list of bindings is checked in a context that includes the whole list, so that the bindings may refer to one another. We extract the actual variables (with their types/kinds) from the bindings, check for duplicates, and then check each binding.

\ottdefnlintCoreBindings{}

Here is the definition of $[[vars_of]]$, taken from \coderef{coreSyn/CoreSyn.lhs}{bindersOf}:
\[
\begin{array}{ll}
[[vars_of n = e]] &= [[n]] \\
[[vars_of rec ]] &= [[]]
\end{array}
\]

\subsection{Binding consistency}

\ottdefnlintXXbind{}

\ottdefnlintSingleBinding{}

In the GHC source, this function contains a number of other checks, such as for strictness and exportability. See the source code for further information.

\subsection{Expression typing}

\ottdefnlintCoreExpr{}

\begin{itemize}
\item Some explication of \ottdrulename{Tm\_LetRec} is helpful: the idea behind the second premise ($[[]]$) is that we wish to check each substituted type $[[s'i]]$ in a context containing all the types that come before it in the list of bindings. The $[[G'i]]$ are contexts containing the names and kinds of all type variables (and term variables, for that matter) up to the $i$th binding. This logic is extracted from \coderef{coreSyn/CoreLint.lhs}{lintAndScopeIds}.
\item There is one more case for $[[G |-tm e : t]]$, for type expressions. This is included in the GHC code but is elided here because the case is never used in practice. Type expressions can only appear in arguments to functions, and these are handled in \ottdrulename{Tm\_AppType}.
\item The GHC source code checks all arguments in an application expression at once, using \coderef{coreSyn/CoreSyn.lhs}{collectArgs} and \coderef{coreSyn/CoreLint.lhs}{lintCoreArgs}. The operation has been unfolded for presentation here.
\item If a $[[tick]]$ contains breakpoints, the GHC source performs additional (scoping) checks.
\item The rule for $[[case]]$ statements also checks to make sure that the alternatives in the $[[case]]$ are well-formed with respect to the invariants listed above. These invariants do not affect the type or evaluation of the expression, so the check is omitted here.
\item The GHC source code for \ottdrulename{Tm\_Var} contains checks for a dead id and for one-tuples.
These checks are omitted here.
\end{itemize}

\subsection{Kinding}

\ottdefnlintType{}

\subsection{Kind validity}

\ottdefnlintKind{}

\subsection{Coercion typing}

In the coercion typing judgment, the $\#$ marks are left off the equality operators to reduce clutter. This is not actually inconsistent, because the GHC function that implements this check, \texttt{lintCoercion}, actually returns four separate values (the kind, the two types, and the role), not a type with head $[[(~#)]]$ or $[[(~R#)]]$. Note that the difference between these two forms of equality is interpreted in the rules \ottdrulename{Co\_CoVarCoNom} and \ottdrulename{Co\_CoVarCoRepr}.

\ottdefnlintCoercion{}

In \ottdrulename{Co\_AxiomInstCo}, the use of $[[inits]]$ creates substitutions from the first $i$ mappings in $[[ si] // i /> ]]$. This has the effect of folding the substitution over the kinds for kind-checking.
See Section~\ref{sec:tyconroles} for more information about $[[tyConRolesX]]$, and see Section~\ref{sec:axiom_rules} for more information about $[[coaxrProves]]$.

\subsection{Name consistency}

There are two very similar checks for names, one declared as a local function:

\ottdefnlintSingleBindingXXlintBinder{}

\ottdefnlintBinder{}

\subsection{Substitution consistency}

\ottdefncheckTyKind{}

\subsection{Case alternative consistency}

\ottdefnlintCoreAlt{}

\subsection{Telescope substitution}

\ottdefnapplyTys{}

\subsection{Case alternative binding consistency}

\ottdefnlintAltBinders{}

\subsection{Arrow kinding}

\ottdefnlintArrow{}

\subsection{Type application kinding}

\ottdefnlintXXapp{}

\subsection{Sub-kinding}

\ottdefnisSubKind{}

\subsection{Roles}
\label{sec:tyconroles}

During type-checking, role inference is carried out, assigning roles to the arguments of every type constructor. The function $[[tyConRoles]]$ extracts these roles. Also used in other judgments is $[[tyConRolesX]]$, which is the same as $[[tyConRoles]]$, but with an arbitrary number of $[[Nom]]$ at the end, to account for potential oversaturation.

The checks encoded in the following judgments are run from \coderef{typecheck/TcTyClsDecls.lhs}{checkValidTyCon} when \texttt{-dcore-lint} is set.

\ottdefncheckValidRoles{}

\ottdefncheckXXdcXXroles{}

In the following judgment, the role $[[R]]$ is an \emph{input}, not an output. The metavariable $[[O]]$ denotes a \emph{role context}, as shown here:

\gram{\ottO}

\ottdefncheckXXtyXXroles{}

These judgments depend on a sub-role relation:

\ottdefnltRole{}

\subsection{Branched axiom conflict checking}
\label{sec:no_conflict}

The following judgment is used within \ottdrulename{Co\_AxiomInstCo} to make sure that a type family application cannot unify with any previous branch in the axiom. The actual code scans through only those branches that are flagged as incompatible. These branches are stored directly in the $[[axBranch]]$. However, it is cleaner in this presentation to simply check for compatibility here.

\ottdefncheckXXnoXXconflict{}

The judgment $[[apart]]$ checks to see whether two lists of types are surely apart. $[[apart( , )]]$, where $[[ ]]$ is a list of types and $[[ ]]$ is a list of type \emph{patterns} (as in a type family equation), first flattens the $[[ ]]$ using \coderef{types/FamInstEnv.lhs}{flattenTys} and then checks to see if \coderef{types/Unify.lhs}{tcUnifyTysFG} returns \texttt{SurelyApart}. Flattening takes all type family applications and replaces them with fresh variables, taking care to map identical type family applications to the same fresh variable.
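To make the flattening step concrete, here is a small, self-contained Haskell sketch of the idea just described. It is \emph{not} GHC's \coderef{types/FamInstEnv.lhs}{flattenTys}; the toy datatype \texttt{Ty}, its \texttt{FamApp} constructor, and the fresh-variable prefix \texttt{fsk} are inventions of this sketch.

\begin{verbatim}
import qualified Data.Map as Map
import Control.Monad.State (State, evalState, get, put)

-- A toy type language; FamApp marks a type-family application.
data Ty
  = TyVar String
  | TyCon String [Ty]
  | FamApp String [Ty]
  deriving (Eq, Ord, Show)

-- State: family applications seen so far (each mapped to its fresh
-- variable) and a counter for inventing fresh names.
type Flatten = State (Map.Map Ty String, Int)

flatten :: Ty -> Flatten Ty
flatten t@(FamApp _ _) = do
  (seen, n) <- get
  case Map.lookup t seen of
    Just v  -> return (TyVar v)       -- identical application: reuse its variable
    Nothing -> do
      let v = "fsk" ++ show n         -- otherwise invent a fresh variable
      put (Map.insert t v seen, n + 1)
      return (TyVar v)
flatten (TyCon tc args) = TyCon tc <$> mapM flatten args
flatten t@(TyVar _)     = return t

flattenAll :: [Ty] -> [Ty]
flattenAll ts = evalState (mapM flatten ts) (Map.empty, 0)
\end{verbatim}

For example, flattening the list \texttt{[F a, F a]} (two identical applications of a family \texttt{F}) yields two occurrences of the \emph{same} fresh variable, whereas flattening \texttt{[F a, F b]} yields two distinct variables; identifying equal applications prevents the check from declaring, say, \texttt{F a} apart from \texttt{F a}.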
The algorithm $[[unify]]$ is implemented in \coderef{types/Unify.lhs}{tcUnifyTys}. It performs a standard unification, returning a substitution upon success.

\section{Operational semantics}

\subsection{Disclaimer}

GHC does not implement an operational semantics in any concrete form. Most of the rules below are implied by algorithms in, for example, the simplifier and optimizer. Yet there is no one place in GHC that states these rules in the way that \texttt{CoreLint.lhs} states the typing rules. Nevertheless, these rules are included in this document to help the reader understand System FC.

\subsection{The context $[[S]]$}

We use a context $[[S]]$ to keep track of the values of variables in a (mutually) recursive group. Its definition is as follows:
\[
[[S]] \quad ::= \quad [[ empty ]] \ |\ [[S]], [[ [n |-> e] ]]
\]
The presence of the context $[[S]]$ is solely to deal with recursion. If your use of FC does not require modeling recursion, you will not need to track $[[S]]$.

\subsection{Operational semantics rules}

\ottdefnstep{}

\subsection{Notes}

\begin{itemize}
\item The \ottdrulename{S\_LetRec} rules implement recursion. \ottdrulename{S\_LetRec} adds to the context $[[S]]$ bindings for all of the mutually recursive equations. Then, after perhaps many steps, when the body of the $[[let]]\ [[rec]]$ contains no variables that are bound in the $[[let]]\ [[rec]]$, the context is popped in \ottdrulename{S\_LetRecReturn}. The other \ottdrulename{S\_LetRecXXX} rules are there to prevent reduction from getting stuck.
\item In the $[[case]]$ rules, a constructor $[[K]]$ is written taking three lists of arguments: two lists of types and a list of terms. The two lists of types instantiate, respectively, the universally quantified and the existentially quantified type variables of the constructor. The terms are the regular term arguments stored in an algebraic datatype. Coercions (say, in a GADT) are considered term arguments.
\item The rule \ottdrulename{S\_CasePush} is the most complex rule.
\begin{itemize}
\item The logic in this rule is implemented in \coderef{coreSyn/CoreSubst.lhs}{exprIsConApp\_maybe}.
\item The $[[coercionKind]]$ function (\coderef{types/Coercion.lhs}{coercionKind}) extracts the two types (and their kind) from a coercion. It does not require a typing context, as it does not \emph{check} the coercion; it just extracts its types.
\item The $[[dataConRepType]]$ function (\coderef{basicTypes/DataCon.lhs}{dataConRepType}) extracts the full type of a data constructor. Following the notation for constructor expressions, the parameters to the constructor are broken into three groups: universally quantified types, existentially quantified types, and terms.
\item The substitutions in the last premise to the rule are unusual: they replace \emph{type} variables with \emph{coercions}. This substitution is called lifting and is implemented in \coderef{types/Coercion.lhs}{liftCoSubst}. The notation is essentially a pun on the fact that types and coercions have such similar structure.
\item Note that the types $[[ ]]$---the existentially quantified types---do not change during this step.
\end{itemize}
\end{itemize}

\end{document}