As already mentioned in the comments, a `Double` cannot store the value 1.1 exactly. Swift uses (like many other languages) binary floating-point numbers according to the IEEE 754 standard.
The closest number to 1.1 that can be represented as a `Double` is

1.100000000000000088817841970012523233890533447265625

and the closest number to 2.3 that can be represented as a `Double` is

2.29999999999999982236431605997495353221893310546875

Printing that number means that it is converted to a string with a decimal representation again, and that is done with different precision depending on how you print the number.
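One way to see the value that is actually stored is to request more fractional digits explicitly (a quick sketch; it assumes Foundation is available for `String(format:)`):

```swift
import Foundation

// Force a fixed-point representation with enough fractional digits to
// show the exact value of the nearest representable Double:
print(String(format: "%.51f", 1.1))
// 1.100000000000000088817841970012523233890533447265625
print(String(format: "%.50f", 2.3))
// 2.29999999999999982236431605997495353221893310546875
```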
From the source code at HashedCollections.swift.gyb one can see that the `description` method of `Dictionary` uses `debugPrint()` for both keys and values, and `debugPrint(x)` prints the value of `x.debugDescription` (if `x` conforms to `CustomDebugStringConvertible`).
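That is why a dictionary with `Double` keys or values shows the extra digits (a minimal sketch; the exact output depends on the Swift version, and newer compilers use a shortest round-trip conversion that may print `[1.1: 2.3]` again):

```swift
let dict = [1.1: 2.3]

print(1.1)   // 1.1
print(dict)  // [1.1000000000000001: 2.2999999999999998]
```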
On the other hand, `print(x)` calls `x.description` if `x` conforms to `CustomStringConvertible`.

So what you see is the different output of `description` and `debugDescription` of `Double`:
```swift
print(1.1.description)      // 1.1
print(1.1.debugDescription) // 1.1000000000000001
```
From the Swift source code one can see that both use the `swift_floatingPointToString()` function in Stubs.cpp, with the `Debug` parameter set to `false` and `true`, respectively. This parameter controls the precision of the number-to-string conversion:
```cpp
int Precision = std::numeric_limits<T>::digits10;
if (Debug) {
  Precision = std::numeric_limits<T>::max_digits10;
}
```
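The effect of the two precisions can be reproduced with printf-style formatting (an illustration only, assuming Foundation for `String(format:)`; it roughly mirrors what the C++ code above does, but is not the actual code path):

```swift
import Foundation

// 15 significant digits (digits10 for a 64-bit double):
print(String(format: "%.15g", 1.1))  // 1.1

// 17 significant digits (max_digits10 for a 64-bit double):
print(String(format: "%.17g", 1.1))  // 1.1000000000000001
```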
For the meaning of those constants, see std::numeric_limits:

- `digits10` – number of decimal digits that can be represented without change,
- `max_digits10` – number of decimal digits necessary to differentiate all values of this type.

For a 64-bit `Double`, `digits10` is 15 and `max_digits10` is 17.
So `description` creates a string with fewer decimal digits. That string can be converted to a `Double` and back to a string giving the same result. `debugDescription` creates a string with more decimal digits, so that any two different floating-point values will produce different output.
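A short sketch of both properties (the printed strings are what one would expect from the 15- and 17-digit conversions described above; newer Swift versions use a shortest round-trip algorithm, so the exact output may differ):

```swift
// description round-trips: parsing the string gives back the same Double.
let x = 1.1
let roundTripped = Double(x.description)!       // Double("1.1")
print(roundTripped == x)                        // true

// debugDescription distinguishes values that description cannot.
let a = 1.0
let b = a.nextUp                                // smallest Double greater than 1.0
print(a.description, b.description)             // 1.0 1.0
print(a.debugDescription, b.debugDescription)   // 1.0 1.0000000000000002
```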