
ios - How is the filters UIScrollView/UICollectionView in Apple's Photos app implemented so that it opens so fast?

I'm not asking about the exact code but the overall idea.

Here is my problem: I'm trying to create something similar to the filter-choosing UI in the Photos app. I've tried multiple approaches, and all of them have drawbacks.

1) I've tried using Operation and OperationQueue with a collection view that has prefetching enabled (see the sketch after this list). This loads the viewController fast but drops frames while scrolling.

2) Right now I'm using a scroll view and GCD, but the viewController takes too long to load (because it applies all filters to all the buttons inside it at once); after that, though, it scrolls smoothly.
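For reference, here is a minimal sketch of what the first approach could look like, assuming the Filters struct shown further below; the cache, the operation queue, and the property names are my assumptions rather than code from the question:

import UIKit
import CoreImage

// Sketch of approach 1: collection-view prefetching drives an OperationQueue.
// `Filters` is the struct from the question; everything else here is assumed.
final class FiltersViewController: UIViewController, UICollectionViewDataSourcePrefetching {
    let renderQueue = OperationQueue()
    let context = CIContext()                    // created once and shared
    let cache = NSCache<NSNumber, UIImage>()     // filtered thumbnails by item index
    var operations: [IndexPath: Operation] = [:]
    var filters: Filters!                        // set before the collection view loads

    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        for path in indexPaths where operations[path] == nil
            && cache.object(forKey: path.item as NSNumber) == nil {
            let filter = filters.allFilters[path.item]
            let op = BlockOperation { [weak self] in
                // Render the filter's output off the main thread and cache it.
                guard let self = self,
                      let output = filter.outputImage,
                      let cg = self.context.createCGImage(output, from: output.extent)
                else { return }
                self.cache.setObject(UIImage(cgImage: cg), forKey: path.item as NSNumber)
            }
            operations[path] = op
            renderQueue.addOperation(op)
        }
    }

    func collectionView(_ collectionView: UICollectionView,
                        cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        for path in indexPaths {
            operations[path]?.cancel()
            operations[path] = nil
        }
    }
}

One thing worth checking with this kind of setup is that a single CIContext is created once and shared; creating a new context per render is expensive and a common cause of dropped frames.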

NOTE: To answer the question there is no need to read the part below (I believe); however, if you are interested in how I tried to implement the functionality, you are welcome to read it.

For the implementation of all the filters I use a struct called Filters, which is responsible for instantiating each filter and appending it to an array.

struct Filters {
    var image: UIImage
    var allFilters: [CIFilter] = []

    init(image: UIImage) {
        self.image = image

        guard let sepia = Sepia(image: image) else { return }

        // The same filter repeated 13 times, as a stand-in for a real filter list.
        allFilters.append(contentsOf: Array(repeating: sepia, count: 13))
    }
}

Right now I'm using only one filter. Sepia is a subclass of CIFilter. I've created it as a subclass because in the future I'm going to build a custom filter from it. Here is its implementation:

class Sepia: CIFilter {

    var inputImage: CIImage?
    var inputIntensity: NSNumber?

    // CIFilter has no `filterName` property to override, so this is a plain property.
    @objc var filterName: String {
        return NSLocalizedString("Sepia", comment: "Name of a Filter")
    }

    convenience init?(image: UIImage, inputIntensity: NSNumber? = nil) {
        self.init()

        guard let cgImage = image.cgImage else {
            return nil
        }

        if inputIntensity != nil {
            self.inputIntensity = inputIntensity
        } else {
            self.setDefaults()
        }

        self.inputImage = CIImage(cgImage: cgImage)
    }

    override func setDefaults() {
        inputIntensity = 1.0
    }

    override var outputImage: CIImage? {
        guard let inputImage = inputImage, let inputIntensity = inputIntensity else {
            return nil
        }

        // Wrap the built-in sepia filter.
        let filter = CIFilter(name: "CISepiaTone",
                              withInputParameters: [kCIInputImageKey: inputImage,
                                                    kCIInputIntensityKey: inputIntensity])
        return filter?.outputImage
    }
}

In the viewController's viewDidLoad I initialize the Filters struct:

self.filters = Filters(image: image)

Then I call a method that configures some views (filterViews) based on the number of filters in the filters.allFilters array, iterates over them, and calls a method that takes a thumbnail UIImage, applies a filter to it, and returns the result in a completion handler (I use a DispatchGroup in it for debugging purposes). Here is the method that applies a filter to a thumbnail:

func imageWithFilter(filter: CIFilter, completion: @escaping (UIImage?) -> Void) {

    let group = DispatchGroup()
    group.enter()
    DispatchQueue.global().async {
        // Render the filter's output off the main queue.
        guard let outputImage = filter.value(forKey: kCIOutputImageKey) as? CIImage,
              let cgImageResult = self.context.createCGImage(outputImage, from: outputImage.extent) else {
            DispatchQueue.main.async {
                completion(nil)
            }
            group.leave()
            return
        }

        let filteredImage = UIImage(cgImage: cgImageResult)
        // Deliver the result on the main queue.
        DispatchQueue.main.async {
            print(filteredImage)
            completion(filteredImage)
        }
        group.leave()
    }

    group.notify(queue: .main) {
        print("Filters are set")
    }
}
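
The configuring method itself isn't shown above. For context, a minimal sketch of how it might drive imageWithFilter(filter:completion:), assuming filterViews is an array of UIImageViews created beforehand (both names are hypothetical):

// Hypothetical sketch of the configuring method described above;
// `filterViews` and `configureFilterViews` are assumed names.
func configureFilterViews() {
    for (index, filter) in filters.allFilters.enumerated() {
        imageWithFilter(filter: filter) { [weak self] filteredImage in
            // The completion handler runs on the main queue, so UIKit access is safe.
            self?.filterViews[index].image = filteredImage
        }
    }
}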

The print statement above and the filtered image's address are printed quite soon; however, the images don't appear inside the views.

I have tried to use Time Profiler, but it gives me some weird results. For example, it shows the following as taking quite long to execute at the root of the backtrace: [Time Profiler screenshot]

When I try to see the code in Xcode, I get the following, which doesn't help much: [Xcode screenshot]

So, this is the problem. If you have any ideas how it is implemented in the Photos app so that it is so fast and responsive, or if you have suggestions about my implementation, I would highly appreciate your help.


1 Answer


The question seems to be how to display the CIImage resulting from a Core Image CIFilter as fast as possible — so fast that it appears instantly when the view controller appears; so fast, in fact, that the user can adjust the CIFilter parameters using sliders and so forth, and the image will redisplay live and keep up with the adjustment.

The answer is to use MetalKit, and in particular an MTKView. The rendering work is moved off onto the device's GPU and is extremely fast, fast enough to come in under the refresh rate of the device's screen, so that there is no discernible lag as the user twiddles the sliders.

I have a simple demonstration where the user applies a custom chain of filters called VignetteFilter:

[demo screenshot: an image with a vignette effect, with a slider below it]

As the user slides the slider, the amount of vignetting (the inner circle) changes smoothly. At every instant of sliding, a new filter is applied to the original image and rendered, over and over as the user slides, keeping in sync with the user's movements.

The view at the bottom, as I said, is an MTKView. MTKView is not hard to work with in this way; it does require some preparation but it's all boilerplate. The only tricky part is actually getting the image to come out where you want it.

Here's the code for my view controller (I'm omitting everything but the slider and the display of the filtered image):

import UIKit
import MetalKit
import AVFoundation // for AVMakeRect

class EditingViewController: UIViewController, MTKViewDelegate {
    @IBOutlet weak var slider: UISlider!
    @IBOutlet weak var mtkview: MTKView!

    var context: CIContext!
    var displayImage: CIImage! // must be set before viewDidLoad
    let vig = VignetteFilter()
    var queue: MTLCommandQueue!

    // slider value changed
    @IBAction func doSlider(_ sender: Any?) {
        self.mtkview.setNeedsDisplay()
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        // preparation, all pure boilerplate

        self.mtkview.isOpaque = false // otherwise background is black
        // must have a "device"
        guard let device = MTLCreateSystemDefaultDevice() else {
            return
        }
        self.mtkview.device = device

        // mode: draw on demand
        self.mtkview.isPaused = true
        self.mtkview.enableSetNeedsDisplay = true

        self.context = CIContext(mtlDevice: device)
        self.queue = device.makeCommandQueue()

        self.mtkview.delegate = self
        self.mtkview.setNeedsDisplay()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
    }

    func draw(in view: MTKView) {
        // run the displayImage thru the CIFilter
        self.vig.setValue(self.displayImage, forKey: "inputImage")
        let val = Double(self.slider.value)
        self.vig.setValue(val, forKey:"inputPercentage")
        var output = self.vig.outputImage!

        // okay, `output` is the CIImage we want to display
        // scale it down to aspect-fit inside the MTKView
        var r = view.bounds
        r.size = view.drawableSize
        r = AVMakeRect(aspectRatio: output.extent.size, insideRect: r)
        output = output.transformed(by: CGAffineTransform(
            scaleX: r.size.width/output.extent.size.width, 
            y: r.size.height/output.extent.size.height))
        let x = -r.origin.x
        let y = -r.origin.y

        // minimal dance required in order to draw: render, present, commit
        let buffer = self.queue.makeCommandBuffer()!
        self.context!.render(output,
            to: view.currentDrawable!.texture,
            commandBuffer: buffer,
            bounds: CGRect(origin:CGPoint(x:x, y:y), size:view.drawableSize),
            colorSpace: CGColorSpaceCreateDeviceRGB())
        buffer.present(view.currentDrawable!)
        buffer.commit()
    }
}
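
The VignetteFilter class itself isn't shown here. As a rough stand-in, this is a minimal sketch of a CIFilter subclass that would respond to the "inputImage" and "inputPercentage" keys used above by forwarding to the built-in CIVignette filter; the mapping from percentage to intensity is an assumption, and the real filter in the demo (which draws an inner vignette circle) is presumably more elaborate:

import CoreImage

// Hypothetical stand-in for the demo's VignetteFilter: a CIFilter subclass
// that forwards to the built-in CIVignette filter. The key names match the
// view controller above; the intensity mapping is an assumption.
class VignetteFilter: CIFilter {
    @objc var inputImage: CIImage?
    @objc var inputPercentage: NSNumber? = 1.0

    override var outputImage: CIImage? {
        guard let inputImage = inputImage,
              let percentage = inputPercentage else { return nil }
        let vignette = CIFilter(name: "CIVignette")!
        vignette.setValue(inputImage, forKey: kCIInputImageKey)
        // Map the 0...1 slider value directly onto CIVignette's intensity.
        vignette.setValue(percentage.doubleValue, forKey: kCIInputIntensityKey)
        vignette.setValue(2.0, forKey: kCIInputRadiusKey)
        return vignette.outputImage
    }
}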
