CompilationRepresentationFlags.UseNullAsTrueValue can be used to "permit the use of null as a representation for nullary discriminators in a discriminated union". Option.None is the most prominent example of this.
Why is this useful? How is a null check better than the traditional mechanism for checking union cases (the generated Tag property)?
It leads to perhaps unexpected behavior:
Some(1).ToString() //"Some(1)"
None.ToString() //NullReferenceException
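One way to sidestep the surprise (a minimal sketch with a hypothetical show helper, not part of the original question): pattern matching never invokes a member on the null-represented None, so it handles both cases safely.

let show (o: int option) =
    match o with
    | Some v -> sprintf "Some(%d)" v
    | None -> "None"

show (Some 1) // "Some(1)"
show None     // "None"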
EDIT
I tested Jack's assertion that comparing to null instead of a static readonly field is faster.
[<CompilationRepresentation(CompilationRepresentationFlags.UseNullAsTrueValue)>]
type T<'T> =
    | Z
    | X of 'T

let t = Z
Using ILSpy, I can see t compiles to null (as expected):

public static Test.T<a> t<a>()
{
    return null;
}
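As a quick sanity check (a small sketch against the T and Z defined above, not part of the original benchmark), the null representation can also be observed directly from F#:

let z : T<int> = Z
printfn "%b" (obj.ReferenceEquals(box z, null)) // true: Z is represented as null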
The test:
let mutable i = 0
for _ in 1 .. 10000000 do
    match t with
    | Z -> i <- i + 1
    | _ -> ()
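The timings below look like F# Interactive's #time output; an equivalent self-contained harness (a sketch using System.Diagnostics.Stopwatch and a hypothetical time helper, not the original setup) could be:

let time label f =
    let sw = System.Diagnostics.Stopwatch.StartNew()
    f ()
    printfn "%s: %O" label sw.Elapsed

time "match on null-represented Z" (fun () ->
    let mutable i = 0
    for _ in 1 .. 10000000 do
        match t with
        | Z -> i <- i + 1
        | _ -> ())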
The results:
Real: 00:00:00.036, CPU: 00:00:00.046, GC gen0: 0, gen1: 0, gen2: 0
If the CompilationRepresentation attribute is removed, t returns the Z case, which is backed by a static readonly field:
public static Test.T<a> t<a>()
{
    return Test.T<a>.Z;
}

public static Test.T<T> Z
{
    [CompilationMapping(SourceConstructFlags.UnionCase, 0)]
    get
    {
        return Test.T<T>._unique_Z;
    }
}

internal static readonly Test.T<T> _unique_Z = new Test.T<T>._Z();
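One practical upshot, tying back to the ToString example at the top (again just a sketch, assuming T is now compiled without the attribute): the nullary case is a real object, so calling members on it no longer throws.

let z : T<int> = Z
printfn "%s" (z.ToString()) // prints the case name instead of raising NullReferenceException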
And the results are essentially the same:
Real: 00:00:00.036, CPU: 00:00:00.031, GC gen0: 0, gen1: 0, gen2: 0
The pattern match is compiled as t == null in the former case and as a type test (t is Z) in the latter.