Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


ios - Face texture from ARKit

I am running a face tracking configuration in ARKit with SceneKit. In each frame I can access the camera feed via the snapshot property or the capturedImage buffer. I have also been able to map each face vertex into image coordinate space and add some UIView helpers (1-point squares) to display all the face vertices on screen in real time, like this:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceGeometry = node.geometry as? ARSCNFaceGeometry,
          let anchorFace = anchor as? ARFaceAnchor,
          anchorFace.isTracked
          else { return }

    let vertices = anchorFace.geometry.vertices

    for (index, vertex) in vertices.enumerated() {
        // Convert the vertex from face-anchor local space to world space,
        // then project it into screen coordinates.
        let projected = sceneView.projectPoint(node.convertPosition(SCNVector3(vertex), to: nil))
        let newPosition = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))

        // Here I update the position of each UIView on screen with the newly
        // projected vertex position; I keep an array of views matching the
        // vertex count, which is consistent across sessions.
    }
}

Since the UV coordinates are also constant across sessions, I am trying to map each pixel that lies over the face mesh to its corresponding position in the UV texture, so that after some iterations I can write a person's face texture to a file.

I have come to a theoretical solution: create a CGPath for each triangle and test each pixel for containment; if a pixel is inside, crop a rectangle from the camera image and apply a triangle mask built from the triangle's projected vertices in image coordinates. That yields a triangular image, which then has to be warped to match the corresponding triangle in the UV layout (skewed into place). Finally, each triangle image would be added as a UIImageView subview of a 1024x1024 UIView, and that UIView encoded as a PNG. This sounds like a lot of work, specifically the part of matching each cropped triangle with its corresponding triangle in the UV texture.
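A possibly simpler alternative to the crop-and-skew idea is barycentric interpolation: for a point inside a triangle, compute its barycentric weights and use them to blend the triangle's vertex attributes. Below is a minimal, self-contained Swift sketch of that mapping (the function names and the way the triangle data is passed in are my own assumptions, not from ARKit or the Apple sample):

```swift
// Signed twice-area of triangle (a, b, c): the 2D cross product of (b - a)
// and (c - a). Its sign encodes the winding order.
func edgeFunction(_ a: SIMD2<Float>, _ b: SIMD2<Float>, _ c: SIMD2<Float>) -> Float {
    (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x)
}

/// Returns the interpolated UV coordinate for `pixel` if it lies inside the
/// triangle projected to screen points (p0, p1, p2), or nil otherwise.
/// (uv0, uv1, uv2) are the fixed per-vertex UVs of the face mesh.
func uvForPixel(_ pixel: SIMD2<Float>,
                p0: SIMD2<Float>, p1: SIMD2<Float>, p2: SIMD2<Float>,
                uv0: SIMD2<Float>, uv1: SIMD2<Float>, uv2: SIMD2<Float>) -> SIMD2<Float>? {
    let area = edgeFunction(p0, p1, p2)
    guard abs(area) > .ulpOfOne else { return nil }  // degenerate triangle

    // Barycentric weight of each vertex: sub-triangle area opposite that
    // vertex, divided by the full area. The three weights sum to 1.
    let w0 = edgeFunction(p1, p2, pixel) / area
    let w1 = edgeFunction(p2, p0, pixel) / area
    let w2 = edgeFunction(p0, p1, pixel) / area
    guard w0 >= 0, w1 >= 0, w2 >= 0 else { return nil }  // outside the triangle

    return w0 * uv0 + w1 * uv1 + w2 * uv2
}
```

Note the same function works in the other direction, which may fit the texture-writing loop better: iterate over the texels of the 1024x1024 output, pass the triangle's UVs as the "screen" points and the projected screen positions as the attributes, and you get, for each texel, which camera-image pixel to sample.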

In the Apple demo project there is an image that shows what that UV texture looks like; if you edit this image and add some colors, they show up on the face. But I need it the other way around: from what the camera sees, create a texture of your face. The same demo project has an example that does exactly what I need, but with a shader, and with no clues on how to extract the texture to a file. The shader code looks like this:

/*
 <samplecode>
 <abstract>
 SceneKit shader (geometry) modifier for texture mapping ARKit camera video onto the face.
 </abstract>
 </samplecode>
 */

#pragma arguments
float4x4 displayTransform // from ARFrame.displayTransform(for:viewportSize:)

#pragma body

// Transform the vertex to the camera coordinate system.
float4 vertexCamera = scn_node.modelViewTransform * _geometry.position;

// Camera projection and perspective divide to get normalized viewport coordinates (clip space).
float4 vertexClipSpace = scn_frame.projectionTransform * vertexCamera;
vertexClipSpace /= vertexClipSpace.w;

// XY in clip space is [-1,1]x[-1,1], so adjust to UV texture coordinates: [0,1]x[0,1].
// Image coordinates are Y-flipped (upper-left origin).
float4 vertexImageSpace = float4(vertexClipSpace.xy * 0.5 + 0.5, 0.0, 1.0);
vertexImageSpace.y = 1.0 - vertexImageSpace.y;

// Apply ARKit's display transform (device orientation * front-facing camera flip).
float4 transformedVertex = displayTransform * vertexImageSpace;

// Output as texture coordinates for use in later rendering stages.
_geometry.texcoords[0] = transformedVertex.xy;

/**
 * MARK: Post-process special effects
 */

Honestly, I do not have much experience with shaders, so any help translating the shader logic into more Cocoa Touch / Swift code would be appreciated. I am not thinking about performance yet, so doing it on the CPU (on a background thread, or offline) is fine. In any case I will have to choose the right frames to avoid skewed samples: some triangles carry very good information while others are barely a few stretched pixels, so I could check whether a triangle's normal points toward the camera before sampling it, or add UI helpers to make the user turn their face so the whole face gets sampled correctly.
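For reference, the shader's math is straightforward to replicate on the CPU. The sketch below mirrors it step by step in plain Swift, assuming you capture `modelViewTransform` and `projectionTransform` at frame time; the tiny `Mat4` type stands in for `simd_float4x4` (same column layout) so the snippet stays self-contained. The final `displayTransform` step (device orientation plus front-camera mirroring) is omitted here; on device you would apply ARFrame.displayTransform(for:viewportSize:) afterwards.

```swift
// Minimal column-major 4x4 matrix, laid out like simd_float4x4 so values can
// be copied straight across from SceneKit/ARKit on device.
struct Mat4 {
    var columns: (SIMD4<Float>, SIMD4<Float>, SIMD4<Float>, SIMD4<Float>)

    // Matrix-vector product: a linear combination of the columns.
    static func * (m: Mat4, v: SIMD4<Float>) -> SIMD4<Float> {
        m.columns.0 * v.x + m.columns.1 * v.y + m.columns.2 * v.z + m.columns.3 * v.w
    }
}

/// CPU translation of the geometry modifier above: model-view transform,
/// projection, perspective divide, then clip space [-1,1] remapped to
/// image space [0,1] with a Y flip (upper-left origin).
func imageSpaceCoordinate(vertex: SIMD3<Float>,
                          modelView: Mat4,
                          projection: Mat4) -> SIMD2<Float> {
    // Transform the vertex to the camera coordinate system.
    let vertexCamera = modelView * SIMD4<Float>(vertex.x, vertex.y, vertex.z, 1)

    // Camera projection and perspective divide into clip space.
    var clip = projection * vertexCamera
    clip /= clip.w

    // [-1,1] -> [0,1], then flip Y for image coordinates.
    var uv = SIMD2<Float>(clip.x, clip.y) * 0.5 + SIMD2<Float>(0.5, 0.5)
    uv.y = 1 - uv.y
    return uv
}
```

Running this for every face vertex gives the normalized camera-image coordinate each vertex samples from, which is exactly the correspondence needed to fill the UV texture from the captured frame.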

I have already checked this post and this post, but cannot get it to work.

This app does exactly what I need, but it does not seem to use ARKit.

Thanks.



1 Answer

Waiting for answers.
