For anyone checking out this thread 7 years later: I tried the methods above, but they produced relatively slow results. I needed to render an entire SKScene off-screen, because I was rendering the images as frames of a video. That rules out the first method LearnCocos2D suggested, since it requires the view to be drawn on-screen, and the second method takes a long time to convert the SKTexture to a UIImage. Instead, you can use SKRenderer, a class introduced in iOS 11.0, to render the whole scene into a UIImage; it is backed by Metal, so it is relatively fast. I was able to render a 1920x1080 SKScene in about 0.013 seconds!
You can use this extension:
- Make sure you import MetalKit.
- The ignoreScreenScale parameter specifies whether the output image should be pixel-exact. If you are going to display the image on-screen, you will usually want this to be false: when it is false, the output image's size is multiplied by the device's screen scale, so that each "point" of the scene takes up the same number of pixels in the image as it does on-screen. When it is true, the output image's size in pixels equals the SKScene's size in points (see the usage sketch after the extension).
Cheers!
import MetalKit
import SpriteKit

extension SKScene {
    func toImage(ignoreScreenScale: Bool = false) -> UIImage? {
        guard let device = MTLCreateSystemDefaultDevice(),
              let commandQueue = device.makeCommandQueue(),
              let commandBuffer = commandQueue.makeCommandBuffer() else { return nil }

        // When ignoreScreenScale is false, scale the output so that one scene point
        // covers the same number of pixels as it does on-screen.
        let scale = ignoreScreenScale ? 1 : UIScreen.main.scale
        let size = self.size.applying(CGAffineTransform(scaleX: scale, y: scale))

        let renderer = SKRenderer(device: device)

        // Use the scene's background color to clear the render target.
        var r = CGFloat.zero, g = CGFloat.zero, b = CGFloat.zero, a = CGFloat.zero
        backgroundColor.getRed(&r, green: &g, blue: &b, alpha: &a)

        let textureDescriptor = MTLTextureDescriptor()
        textureDescriptor.usage = [.renderTarget, .shaderRead]
        textureDescriptor.width = Int(size.width)
        textureDescriptor.height = Int(size.height)
        guard let texture = device.makeTexture(descriptor: textureDescriptor) else { return nil }

        let renderPassDescriptor = MTLRenderPassDescriptor()
        renderPassDescriptor.colorAttachments[0].loadAction = .clear
        renderPassDescriptor.colorAttachments[0].texture = texture
        renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
            red: Double(r),
            green: Double(g),
            blue: Double(b),
            alpha: Double(a)
        )

        renderer.scene = self
        renderer.render(withViewport: CGRect(origin: .zero, size: size),
                        commandBuffer: commandBuffer,
                        renderPassDescriptor: renderPassDescriptor)
        commandBuffer.commit()
        // Wait for the GPU to finish writing the texture before reading it back.
        commandBuffer.waitUntilCompleted()

        // Metal textures have a top-left origin, so flip the image vertically.
        guard let image = CIImage(mtlTexture: texture, options: nil) else { return nil }
        let transformed = image.transformed(by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -image.extent.size.height))
        return UIImage(ciImage: transformed)
    }
}
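
Here is a minimal usage sketch; the scene contents below are hypothetical, purely for illustration, and any existing SKScene works the same way:

import SpriteKit

// A throwaway scene just to have something to render.
let scene = SKScene(size: CGSize(width: 1920, height: 1080))
scene.backgroundColor = .black
let label = SKLabelNode(text: "Hello, off-screen world")
label.position = CGPoint(x: 960, y: 540)
scene.addChild(label)

// ignoreScreenScale: true  -> a 1920x1080-pixel image, matching the scene's size in points.
// ignoreScreenScale: false -> the size is multiplied by UIScreen.main.scale (e.g. 5760x3240 on a 3x device).
if let frame = scene.toImage(ignoreScreenScale: true) {
    print("Rendered a frame of size \(frame.size)")
}

Note that the returned UIImage is backed by a CIImage, so pngData()/jpegData() may return nil on it; if you need to persist a frame, draw it into a bitmap context (for example with UIGraphicsImageRenderer) first.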