I'm trying to resize a CVPixelBuffer to 128x128. I'm working with a 750x750 buffer. Currently, I create a CGImage from the CVPixelBuffer, resize that, and then convert it back to a CVPixelBuffer. Here is my code:
func getImageFromSampleBuffer(buffer: CMSampleBuffer) -> UIImage? {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        let imageRect = CGRect(x: 0, y: 0, width: 128, height: 128)
        if let image = context.createCGImage(ciImage, from: imageRect) {
            let t = CIImage(cgImage: image)
            let new = t.applying(transformation)
            context.render(new, to: pixelBuffer)
            return UIImage(cgImage: image, scale: UIScreen.main.scale, orientation: .right)
        }
    }
    return nil
}
I've also tried scaling the CIImage and then converting it:
let t = CIImage(cgImage: image)
let transformation = CGAffineTransform(scaleX: 1, y: 2)
let new = t.applying(transformation)
context.render(new, to: pixelBuffer)
But that didn't work either.
Any help is appreciated. Thanks!
Best Answer
There's no need for pixel-buffer rendering and the rest. Just transform the original CIImage and crop it to the target size. Cropping is needed when the source and destination dimensions aren't proportional.
func getImageFromSampleBuffer(buffer: CMSampleBuffer) -> UIImage? {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let srcWidth = ciImage.extent.width
        let srcHeight = ciImage.extent.height
        let dstWidth: CGFloat = 128
        let dstHeight: CGFloat = 128
        let scaleX = dstWidth / srcWidth
        let scaleY = dstHeight / srcHeight
        let scale = min(scaleX, scaleY)
        let transform = CGAffineTransform(scaleX: scale, y: scale)
        let output = ciImage.transformed(by: transform)
            .cropped(to: CGRect(x: 0, y: 0, width: dstWidth, height: dstHeight))
        return UIImage(ciImage: output)
    }
    return nil
}
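If you need the result as a CVPixelBuffer rather than a UIImage (as the question title suggests), one option is to render the scaled CIImage into a freshly created buffer. The sketch below is an assumption about the desired output, not part of the accepted answer; the function name `makeResizedPixelBuffer` is hypothetical, it assumes a 32BGRA output format, and error handling is kept minimal.

```swift
import CoreImage
import CoreVideo

// Sketch: render a scaled CIImage into a new 128x128 CVPixelBuffer.
// Assumes a BGRA destination buffer; adjust the pixel format as needed.
func makeResizedPixelBuffer(from ciImage: CIImage,
                            width: Int = 128,
                            height: Int = 128,
                            context: CIContext = CIContext()) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var output: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width, height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary,
                                     &output)
    guard status == kCVReturnSuccess, let buffer = output else { return nil }

    // Scale each axis independently so the source fills the destination,
    // then render straight into the new buffer.
    let scaleX = CGFloat(width) / ciImage.extent.width
    let scaleY = CGFloat(height) / ciImage.extent.height
    let scaled = ciImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
    context.render(scaled, to: buffer)
    return buffer
}
```

Reusing a single CIContext across frames (rather than creating one per call) avoids the cost of rebuilding its internal caches, which matters when processing live camera output.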
Regarding "ios - resizing a CVPixelBuffer", we found a similar question on Stack Overflow:
https://stackoverflow.com/questions/44509385/