What is the monomorphism restriction?
The monomorphism restriction as stated by the Haskell wiki is:
a counter-intuitive rule in Haskell type inference.
If you forget to provide a type signature, sometimes this rule will fill
the free type variables with specific types using "type defaulting" rules.
What this means is that, in some circumstances, if your type is ambiguous (i.e. polymorphic)
the compiler will choose to instantiate that type to something not ambiguous.
How do I fix it?
First of all, you can always explicitly provide a type signature, and this will
avoid triggering the restriction:
plus :: Num a => a -> a -> a
plus = (+) -- Okay!
-- Runs as:
Prelude> plus 1.0 1
2.0
Alternatively, if you are defining a function, you can avoid point-free style
and write, for example:
plus x y = x + y
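With either fix the inferred type is the fully polymorphic one; an illustrative GHCi check would show:
Prelude> :t plus
plus :: Num a => a -> a -> a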
Turning it off
It is possible to simply turn off the restriction so that you don't have to do
anything to your code to fix it. The behaviour is controlled by two extensions:
MonomorphismRestriction will enable it (which is the default), while
NoMonomorphismRestriction will disable it.
You can put the following line at the very top of your file:
{-# LANGUAGE NoMonomorphismRestriction #-}
If you are using GHCi you can enable the extension using the :set
command:
Prelude> :set -XNoMonomorphismRestriction
You can also tell ghc
to enable the extension from the command line:
ghc ... -XNoMonomorphismRestriction
Note: you should really prefer the first option over enabling the extension via command-line options.
Refer to GHC's page for an explanation of this and other extensions.
A complete explanation
I'll try to summarize below everything you need to know to understand what the
monomorphism restriction is, why it was introduced and how it behaves.
An example
Take the following trivial definition:
plus = (+)
you'd think you could replace every occurrence of + with plus.
In particular, since (+) :: Num a => a -> a -> a, you'd expect
plus :: Num a => a -> a -> a as well.
Unfortunately this is not the case. For example if we try the following in GHCi:
Prelude> let plus = (+)
Prelude> plus 1.0 1
We get the following output:
<interactive>:4:6:
No instance for (Fractional Integer) arising from the literal ‘1.0’
In the first argument of ‘plus’, namely ‘1.0’
In the expression: plus 1.0 1
In an equation for ‘it’: it = plus 1.0 1
Note: newer GHCi versions disable the restriction by default, so you may need to
:set -XMonomorphismRestriction first to reproduce this.
And in fact we can see that the type of plus is not what we would expect:
Prelude> :t plus
plus :: Integer -> Integer -> Integer
What happened is that the compiler saw that plus had the polymorphic type
Num a => a -> a -> a. Moreover, the above definition falls under the rules that
I'll explain later, so the compiler decided to make the type monomorphic by
defaulting the type variable a. The default is Integer, as we can see.
Note that if you try to compile the above code using ghc you won't get any errors.
This is due to how ghci handles (and must handle) interactive definitions:
basically, every statement entered in ghci must be completely type checked before
the following one is considered; in other words, it's as if every statement were in a
separate module. Later I'll explain why this matters.
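For example, the following minimal module (a sketch of my own; the layout is just for illustration) compiles fine with ghc, because the use of plus inside main is visible before any defaulting has to happen and fixes the type:
module Main where

plus = (+)  -- no signature: the restriction keeps the type of plus monomorphic...

main :: IO ()
main = print (plus 1.0 (2 :: Double))  -- ...but this use resolves it to Double -> Double -> Double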
Another example
Consider the following definitions:
f1 x = show x
f2 = \x -> show x
f3 :: (Show a) => a -> String
f3 = \x -> show x
f4 = show
f5 :: (Show a) => a -> String
f5 = show
We'd expect all these functions to behave in the same way and have the same type,
i.e. the type of show: Show a => a -> String.
Yet when compiling the above definitions we obtain the following errors:
test.hs:3:12:
No instance for (Show a1) arising from a use of ‘show’
The type variable ‘a1’ is ambiguous
Relevant bindings include
x :: a1 (bound at test.hs:3:7)
f2 :: a1 -> String (bound at test.hs:3:1)
Note: there are several potential instances:
instance Show Double -- Defined in ‘GHC.Float’
instance Show Float -- Defined in ‘GHC.Float’
instance (Integral a, Show a) => Show (GHC.Real.Ratio a)
-- Defined in ‘GHC.Real’
...plus 24 others
In the expression: show x
In the expression: \ x -> show x
In an equation for ‘f2’: f2 = \ x -> show x
test.hs:8:6:
No instance for (Show a0) arising from a use of ‘show’
The type variable ‘a0’ is ambiguous
Relevant bindings include f4 :: a0 -> String (bound at test.hs:8:1)
Note: there are several potential instances:
instance Show Double -- Defined in ‘GHC.Float’
instance Show Float -- Defined in ‘GHC.Float’
instance (Integral a, Show a) => Show (GHC.Real.Ratio a)
-- Defined in ‘GHC.Real’
...plus 24 others
In the expression: show
In an equation for ‘f4’: f4 = show
So f2 and f4 don't compile. Moreover, when trying to define these functions
in GHCi we get no errors, but the type of f2 and f4 is () -> String!
The monomorphism restriction is what makes f2 and f4 require a monomorphic
type, and the different behaviour between ghc and ghci is due to different
defaulting rules.
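For instance, a GHCi session (a sketch; the exact output may vary between versions) with the restriction enabled shows GHCi's extended defaulting rules picking ():
Prelude> :set -XMonomorphismRestriction
Prelude> let f4 = show
Prelude> :t f4
f4 :: () -> String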
When does it happen?
In Haskell, as defined by the report, there are two distinct kinds of bindings:
function bindings and pattern bindings. A function binding is nothing else than
a definition of a function:
f x = x + 1
Note that their syntax is:
<identifier> arg1 arg2 ... argn = expr
where there must be at least one argument (modulo guards and where
declarations, but they don't really matter here).
A pattern binding is a declaration of the form:
<pattern> = expr
Again, modulo guards.
Note that variables are patterns, so the binding:
plus = (+)
is a pattern binding. It's binding the pattern plus (a variable) to the expression (+).
When a pattern binding consists of only a variable name it's called a
simple pattern binding.
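To make the distinction concrete, here is a small sketch (the bindings are made up for illustration):
inc x = x + 1       -- function binding: at least one argument on the left
plus = (+)          -- pattern binding; the pattern is just a variable, so it is simple
(x:xs) = [1, 2, 3]  -- pattern binding with a non-trivial pattern: not simple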
The monomorphism restriction applies to simple pattern bindings!
Well, formally we should say that:
A declaration group is a minimal set of mutually dependent bindings.
Section 4.5.1 of the report.
And then (Section 4.5.5 of the report):
a given declaration group is unrestricted if and only if:
every variable in the group is bound by a function binding (e.g. f x = x)
or a simple pattern binding (e.g. plus = (+); Section 4.4.3.2), and
an explicit type signature is given for every variable in the group that
is bound by a simple pattern binding (e.g. plus :: Num a => a -> a -> a; plus = (+)).
Examples added by me.
So a restricted declaration group is a group where either there are
non-simple pattern bindings (e.g. (x:xs) = f something or (f, g) = ((+), (-))) or
there is some simple pattern binding without a type signature (as in plus = (+)).
The monomorphism restriction affects restricted declaration groups.
Most of the time you don't define mutually recursive functions, so a declaration
group usually consists of just one binding.
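To summarize with a sketch (each binding below forms its own declaration group; the names are illustrative):
plus :: Num a => a -> a -> a
plus = (+)            -- simple pattern binding with a signature: unrestricted
plus' = (+)           -- simple pattern binding without a signature: restricted
(f, g) = ((+), (-))   -- contains a non-simple pattern binding: restricted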
What does it do?
The monomorphism restriction is described by two rules in Section
4.5.5 of the report.
First rule
The usual Hindley-Milner restriction on polymorphism is that only type
variables that do not occur free in the environment may be generalized.
In addition, the constrained type variables of a restricted declaration
group may not be generalized in the generalization step for that group.
(Recall that a type variable is constrained if it must belong to some
type class; see Section 4.5.2 .)
The second sentence is what the monomorphism restriction introduces.
It says that if the type is polymorphic (i.e. it contains some type variable)
and that type variable is constrained (i.e. it has a class constraint on it:
e.g. the type Num a => a -> a -> a is polymorphic because it contains a, and
also constrained because a has the Num constraint on it),
then it cannot be generalized.
In simple words, not generalizing means that uses of the function plus
may change its type.
If you had the definitions:
plus = (+)
x :: Integer
x = plus 1 2
y :: Double
y = plus 1.0 2
then you'd get a type error, because when the compiler sees that plus is
applied to an Integer in the declaration of x, it will unify the type
variable a with Integer, and hence the type of plus becomes:
Integer -> Integer -> Integer
but then, when it type checks the definition of y, it will see that plus
is applied to a Double argument, and the types don't match.
Note that you can still use plus without getting an error:
plus = (+)
x = plus 1.0 2
In this case the type of plus
is first inferred to be Num a => a -> a -> a
but then its use in the definition of x
, where 1.0
requires a Fractional