ARKit and RealityKit definitely have identical values for the focal length parameter, because these two frameworks are supposed to work together. And although there's no focal length instance property for `ARView` at the moment, you can easily print the focal length of an `ARSCNView` or `SCNView` camera to the console:
```swift
@IBOutlet var sceneView: ARSCNView!

// Focal length (in mm) of SceneKit's point-of-view camera
sceneView.pointOfView?.camera?.focalLength
```
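In RealityKit there's no such property, but `ARView` runs an `ARSession` of its own, so you can still read the camera's parameters from the current `ARFrame`. A minimal sketch, assuming a standard world-tracking configuration:

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)
arView.session.run(ARWorldTrackingConfiguration())

// Intrinsics become available once the session starts delivering frames
if let camera = arView.session.currentFrame?.camera {
    print(camera.intrinsics)        // 3x3 intrinsic matrix (pixel units)
    print(camera.imageResolution)   // captured image size in pixels
}
```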
However, take into account that the ARKit, RealityKit and SceneKit frameworks don't use the screen resolution; they use the viewport size instead. The magnification factor for iPhones' viewports is usually `1/2` or `1/3`.
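You can inspect the relevant sizes yourself with a quick check along these lines (all standard UIKit/ARKit properties), which makes the difference between screen resolution, viewport size and captured-image size visible:

```swift
print("Viewport size (points): \(sceneView.bounds.size)")
print("Screen scale: \(UIScreen.main.scale), native scale: \(UIScreen.main.nativeScale)")

if let frame = sceneView.session.currentFrame {
    print("Captured image resolution: \(frame.camera.imageResolution)")
}
```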
Intrinsic Camera Matrix
As you said, in ARKit there's a 3x3 camera matrix allowing you to convert between the 2D camera plane and the 3D world coordinate space:
```swift
var intrinsics: simd_float3x3 { get }
```
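Per Apple's documentation, the matrix has the following layout (and since `simd_float3x3` is indexed by columns, `fx` lives at `columns.0.x`, `fy` at `columns.1.y`, and `ox`/`oy` in `columns.2`):

```
⎡ fx  0   ox ⎤
⎢ 0   fy  oy ⎥
⎣ 0   0   1  ⎦
```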
Using this matrix you can print four important parameters: `fx`, `fy`, `ox` and `oy`, where `fx` and `fy` are the focal length expressed in pixels, and `ox` and `oy` are the principal point offset. Let's print them all:
```swift
DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {

    print(" Focal Length: \(self.sceneView.pointOfView?.camera?.focalLength)")
    print("Sensor Height: \(self.sceneView.pointOfView?.camera?.sensorHeight)")
    // SENSOR HEIGHT IS IN mm

    let frame = self.sceneView.session.currentFrame

    // INTRINSICS MATRIX
    print("Intrinsics fx: \(frame?.camera.intrinsics.columns.0.x)")
    print("Intrinsics fy: \(frame?.camera.intrinsics.columns.1.y)")
    print("Intrinsics ox: \(frame?.camera.intrinsics.columns.2.x)")
    print("Intrinsics oy: \(frame?.camera.intrinsics.columns.2.y)")
}
```
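The two-second delay is there only to give the session time to deliver its first frames; `currentFrame` is `nil` before that. In a real app you would rather read the intrinsics in the `session(_:didUpdate:)` delegate method.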
For iPhone X the focal length printed is `20.78 mm`. When you apply your formulas to these values, you'll get an implausible result (read on to find out why).
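For context, the usual way to relate pixel-space intrinsics to a millimetre focal length is a conversion like the sketch below. This is a generic formula, not necessarily the one from the question; the `36.0` constant is my assumption of a full-frame sensor width for a 35 mm-equivalent value:

```swift
import ARKit

// Hypothetical helper: 35 mm-equivalent focal length from ARKit intrinsics,
// matched by horizontal field of view (assumes a 36 mm full-frame width).
func equivalentFocalLength(for camera: ARCamera) -> Float {
    let fx = camera.intrinsics.columns.0.x                // focal length in pixels
    let imageWidth = Float(camera.imageResolution.width)  // captured width in pixels
    return fx * 36.0 / imageWidth                         // result in mm
}
```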
About Wide-Angle Lens and OIS
The iPhone X has two rear image sensors, and both camera modules are equipped with an optical image stabilizer (OIS). The wide-angle lens offers a 28 mm focal length and an aperture of `f/1.8`, while the telephoto lens is 56 mm and `f/2.4`.
ARKit and RealityKit use the rear wide-angle camera module; in the iPhone X's case that's the 28 mm lens. But what about the printed value of `focal length = 20.78 mm`, huh? I believe that the discrepancy between `28 mm` and `20.78 mm` is due to the fact that video stabilization eats up about 25% of the frame. This is done in order to eventually get a focal length value of `28 mm` for the final image.
The red frame is the cropping margin at the stabilization stage.
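A quick sanity check of that figure, reading the 25% as a crop per linear dimension: 20.78 / 28 ≈ 0.74, i.e. 28 mm × 0.74 ≈ 20.78 mm, which is consistent with roughly a quarter of the frame being trimmed away along each axis.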
Conclusion
This is my own conclusion. I didn't find any reference materials on the subject, so don't judge me too strictly if my opinion turns out to be wrong (I admit it may be).
We all know that camera shake is magnified as focal length increases, so the lower the focal length, the less the camera shake. That's very important for jitter-free, high-quality world tracking in an AR app. Also, I firmly believe that optical image stabilizers work much better at lower focal lengths. Hence, it's no surprise that ARKit engineers chose a lower focal length for the AR experience (capturing a wider image area), and that after stabilization we get a modified version of the image, as if it had `focal length = 28 mm`.
So, in my humble opinion, it makes no sense to calculate the REAL focal length for RealityKit and ARKit, because a "FAKE" focal length is already implemented by Apple engineers for a robust AR experience.