The error is a bit misleading. In the first set of code, array2 is implicitly declared as an array of Int. So any attempt to assign a value to an index of array2 requires an Int value.

The problem is that Double(value) / 2.0 results in a Double value, not an Int. So the compiler is looking for a version of / that returns an Int, and that version expects two Int parameters. Since you are supplying two Double parameters, you get the error mentioned in your question.

The solution is to either cast the result to an Int or use two Int parameters with /.
var array2 = [8, 7, 19, 20]
for (index, value) in array2.enumerated() {
    array2[index] = Int(Double(value) / 2.0) // cast to Int
}
or
var array2 = [8, 7, 19, 20]
for (index, value) in array2.enumerated() {
    array2[index] = value / 2 // use two Int
}
The result will be the same in this case: 8 will be replaced with 4, 7 will be replaced with 3, and so on.
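To make that concrete, here is the integer-division version run on the sample values. Note that both solutions truncate toward zero (Int(3.5) is 3, and 7 / 2 is also 3), which is why they agree here:

```swift
var array2 = [8, 7, 19, 20]
for (index, value) in array2.enumerated() {
    array2[index] = value / 2 // Int / Int truncates toward zero
}
print(array2) // [4, 3, 9, 10]
```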
The second set of code works as-is because you declare the array to be filled with Double values, so everything matches up with the correct type.
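For comparison, here is a minimal sketch of that Double-typed version, assuming the same sample values as above. Because the array is declared as [Double], value / 2.0 is Double / Double, which returns a Double, so no cast is needed:

```swift
// Declaring the element type as Double up front means the
// assignment and the division already agree on the type.
var array2: [Double] = [8, 7, 19, 20]
for (index, value) in array2.enumerated() {
    array2[index] = value / 2.0 // Double / Double -> Double
}
print(array2) // [4.0, 3.5, 9.5, 10.0]
```

Note the halves are preserved here (3.5, 9.5), unlike the Int versions, which truncate.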