
The iPhone 12/12 Pro supports recording Dolby Vision HDR video in 10-bit rather than 8-bit format, but it is unclear from the iOS 14.1 SDK whether AVCaptureVideoDataOutput supports delivering 10-bit sample buffers that can be appended to a video file with AVAssetWriter. Has anyone figured out whether this is possible with the SDK?

EDIT: Many apps, such as Apple's Clips app, have already started supporting 10-bit Dolby Vision video recording. But I have tried all the available APIs, including videoHDREnabled, and it does not work. So the concrete question is: how do you record HDR (Dolby Vision) video using the AVFoundation APIs?

EDIT 2: I was able to identify the device formats that support 10-bit pixel buffer delivery (namely those with "x420" as the media subtype, rather than 420v or 420f). On the iPhone 12 mini, four device formats support 10-bit pixel buffer delivery as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange, even though the AVFoundation documentation says this is not a supported pixel format (quote: "On iOS, the only supported key is kCVPixelBufferPixelFormatTypeKey. Supported pixel formats are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_32BGRA."). The next step is to determine whether the HDR format used for recording can be manually selected as Dolby Vision, HLG, or HDR10.
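For anyone cross-checking those media subtypes: "x420" is simply the FourCC spelling of kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange (0x78343230), while "420v" and "420f" are the 8-bit video-range/full-range variants. A small helper (plain Swift, no capture hardware needed) to turn an OSType code into its readable four-character form when logging device formats:

```swift
import Foundation

// Convert a CoreMedia FourCC code (OSType / UInt32), such as 0x78343230,
// into its human-readable four-character form ("x420").
func fourCCString(_ code: UInt32) -> String {
    let bytes: [UInt8] = [
        UInt8((code >> 24) & 0xFF),
        UInt8((code >> 16) & 0xFF),
        UInt8((code >> 8) & 0xFF),
        UInt8(code & 0xFF),
    ]
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

// kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange == 'x420' == 0x78343230
print(fourCCString(0x78343230)) // prints "x420"
```

This is handy when dumping CMFormatDescriptionGetMediaSubType values from -[AVCaptureDevice formats].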


5 Answers


OK, none of the answers given are correct, so I did some research after picking up an iPhone 12 mini, and here is what I found.

The AVFoundation documentation is silent, and sometimes even incorrect, on this. From the documentation one would infer that getting 10-bit HDR sample buffers is impossible, particularly after reading the documentation of AVCaptureVideoDataOutput's videoSettings property:

   On iOS, the only supported key is kCVPixelBufferPixelFormatTypeKey. 
   Supported pixel formats are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, 
   kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_32BGRA

From the documentation, it appears one can never get 10-bit frames. But when probing -[AVCaptureDevice formats], four different formats can be found whose mediaSubtype is "x420", i.e. the 10-bit format kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange. The moment -[AVCaptureDevice activeFormat] is set to one of these four formats, AVCaptureVideoDataOutput changes the sample buffer format to kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange! The AVCaptureDevice's activeColorSpace also changes to AVCaptureColorSpace.HLG_BT2020.
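A minimal sketch of that probing step, assuming the device is already part of a configured AVCaptureSession (untested here, since it needs an iPhone 12-class camera):

```swift
import AVFoundation

// Find the device formats whose media subtype is 'x420'
// (kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange) and activate the first one.
// Returns false if the device exposes no 10-bit format.
func activateFirst10BitFormat(on device: AVCaptureDevice) throws -> Bool {
    let tenBitFormats = device.formats.filter { format in
        CMFormatDescriptionGetMediaSubType(format.formatDescription)
            == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    }
    guard let format = tenBitFormats.first else { return false }

    try device.lockForConfiguration()
    device.activeFormat = format
    // After this, AVCaptureVideoDataOutput delivers x420 sample buffers and
    // activeColorSpace switches to .HLG_BT2020 (observed on an iPhone 12 mini).
    device.unlockForConfiguration()
    return true
}
```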

Answered 2020-11-26T15:25:52.600

Update, Nov 26.

As @Deepak posted in his own answer and in the comments, the "x420"-tagged formats will make the camera work in HLG mode. All the HLG-enabled formats available on the iPhone 12 Pro are dumped below.

Original answer

On iOS 14.2, I can dump all the available formats from the AVCaptureDevice instance, and the log output seems to explain itself well. As described below, setting AVCaptureDevice.activeFormat to one of the HDR + wide color formats will hopefully do the job.

<AVCaptureDeviceFormat: 0x282d8daf0 'vide'/'x420' 1280x 720, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @2.91), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth>
<AVCaptureDeviceFormat: 0x282d8dac0 'vide'/'x420' 1280x 720, { 1- 60 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @2.91), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth>
<AVCaptureDeviceFormat: 0x282d8da50 'vide'/'x420' 1920x1080, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @1.94), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth, supports multicam>
<AVCaptureDeviceFormat: 0x282d8da30 'vide'/'x420' 1920x1080, { 1- 60 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @1.94), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports multicam>
<AVCaptureDeviceFormat: 0x282d8d9e0 'vide'/'x420' 1920x1440, { 1- 30 fps}, HRSI:4032x3024, fov:67.096, max zoom:189.00 (upscales @2.10), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth, supports multicam>
<AVCaptureDeviceFormat: 0x282d8d950 'vide'/'x420' 3840x2160, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:125.25 (upscales @1.00), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports multicam>

As of Nov 23, this is still an ongoing investigation; I think it needs some joint effort, or an Apple engineer could take a look at this.

I believe I have watched all the available WWDC17/18/19/20 sessions on this topic, and with the release of the new iPhone 12, here are some findings.

Capturing HDR from the camera and saving it directly as a 10-bit HLG video only works on the iPhone 12 and newer. That is what was claimed at the product launch, and I got sample videos from a friend's new phone; it works as expected.

The WWDC2020 session Export HDR media in your app with AVFoundation claims:

At this point, I'd like to briefly discuss which Apple platforms can support HDR export.

iOS supports HEVC hardware encoding on devices with an Apple A10 Fusion chip or newer.

Fortunately, A10 devices have been around for some time now, dating back to the iPhone 7, iPads introduced in 2018, and the iPod touch introduced in 2019.

As for Macs, HEVC and Apple ProRes software encoders are available on all Macs.

HEVC hardware encoding is generally available on 2017 and newer Macs running the new macOS.

Hardware encoding will make exports significantly faster.

Also in that video, it is claimed that HDR export only works with 10-bit HEVC encoding, so the A10 and newer SoCs should be capable of 10-bit HEVC encoding. This is a guess: I can edit iPhone 12 HLG videos in the official Photos app on an iPhone 11 and an SE2, and the write performance (4k@60p, HLG) is quite good, which is a good indicator. However, I have had no luck making this work in code; the sample code listed in the video cannot be the whole picture, and I have had a hard time finding a working demo. In theory, older devices should also be able to record 10-bit HLG, unless the camera or the thermal/power budget is the limitation here.

However, the only HDR-related key among them is VideoProfileLevelKey, which must be set to HEVC_Main10_AutoLevel when exporting HDR with the HEVC codec.

Note that 8-bit HEVC HDR is not supported, and this key does not apply to ProRes exports.

OK, now let's take a moment to summarize how to configure the keys I just discussed when outputting to the two common HDR formats: HLG and HDR10. This table shows the relevant HDR settings for exporting an HLG file.
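For HLG, the settings from that table translate roughly into the following AVAssetWriterInput output settings. This is a sketch assembled from the session, not a verified end-to-end export; the dimensions are placeholders, and the constants are the public AVFoundation/VideoToolbox ones:

```swift
import AVFoundation
import VideoToolbox

// Sketch of HLG export settings per the WWDC session:
// 10-bit HEVC plus BT.2020 primaries/matrix and the HLG transfer function.
let hlgOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoWidthKey: 3840,
    AVVideoHeightKey: 2160,
    AVVideoCompressionPropertiesKey: [
        // The one HDR-specific key: 10-bit HEVC profile.
        AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel as String,
    ],
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020,
    ],
]

// let writerInput = AVAssetWriterInput(mediaType: .video,
//                                      outputSettings: hlgOutputSettings)
```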

Another video worth watching repeatedly: Edit and play back HDR video with AVFoundation.

During testing, I did get an HDR-enabled CVPixelBuffer (format: kCVPixelFormatType_420YpCbCr10BiPlanarFullRange), with correctly managed color, from a sample HLG video. Here is a dump from my console log. This works on any device that supports iOS 14, even the rather old iPhone 6s (A9), since only 10-bit HEVC decoding is involved here.

_displayLinkDidRefresh():121 - Optional(<CVPixelBuffer 0x281300500 width=3840 height=2160 pixelFormat=xf20 iosurface=0x282008050 planes=2 poolName=450:decode_1>
<Plane 0 width=3840 height=2160 bytesPerRow=7680>
<Plane 1 width=1920 height=1080 bytesPerRow=7680>
<attributes={
    PixelFormatDescription =     {
        BitsPerComponent = 10;
        CGBitmapContextCompatibility = 0;
        CGImageCompatibility = 0;
        ComponentRange = FullRange;
        ContainsAlpha = 0;
        ContainsGrayscale = 0;
        ContainsRGB = 0;
        ContainsYCbCr = 1;
        FillExtendedPixelsCallback = {length = 24, bytes = 0x0000000000000000b48ab8a1010000000000000000000000};
        IOSurfaceCoreAnimationCompatibility = 1;
        IOSurfaceCoreAnimationCompatibilityHTPCOK = 1;
        IOSurfaceOpenGLESTextureCompatibility = 1;
        OpenGLESCompatibility = 1;
        PixelFormat = 2019963440;
        Planes =         (
                        {
                BitsPerBlock = 16;
                HorizontalSubsampling = 1;
                VerticalSubsampling = 1;
            },
                        {
                BitsPerBlock = 32;
                BlackBlock = {length = 4, bytes = 0x00800080};
                HorizontalSubsampling = 2;
                VerticalSubsampling = 2;
            }
        );
    };
} propagatedAttachments={
    CVFieldCount = 1;
    CVImageBufferChromaLocationBottomField = Left;
    CVImageBufferChromaLocationTopField = Left;
    CVImageBufferColorPrimaries = "ITU_R_2020";
    CVImageBufferTransferFunction = "ITU_R_2100_HLG";
    CVImageBufferYCbCrMatrix = "ITU_R_2020";
    QTMovieTime =     {
        TimeScale = 600;
        TimeValue = 12090;
    };
} nonPropagatedAttachments={
}>)
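To make the same check programmatically in a capture or playback callback, the transfer-function attachment can be read off the pixel buffer. A sketch (assumes iOS 13+, where kCVImageBufferTransferFunction_ITU_R_2100_HLG is available):

```swift
import CoreVideo

// Inspect a decoded pixel buffer's color attachments to confirm HLG,
// i.e. the "ITU_R_2100_HLG" transfer function seen in the dump above.
func isHLG(_ pixelBuffer: CVPixelBuffer) -> Bool {
    guard let transfer = CVBufferGetAttachment(
        pixelBuffer,
        kCVImageBufferTransferFunctionKey,
        nil
    )?.takeUnretainedValue() as? String else {
        return false
    }
    return transfer == (kCVImageBufferTransferFunction_ITU_R_2100_HLG as String)
}
```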
Answered 2020-11-24T05:09:22.187

Hard to say without the device, but I would assume that (some of) the AVCaptureDevices in the iPhone 12 will support formats that support HDR delivery (isVideoHDRSupported).

The corresponding AVCaptureVideoDataOutput's availableVideoPixelFormatTypes will probably list kCVPixelFormatType_420YpCbCr10BiPlanarFullRange and similar types as options.
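On a real device, this guess could be checked with something like the following sketch (the output should be attached to a configured AVCaptureSession first, so that availableVideoPixelFormatTypes reflects the active device):

```swift
import AVFoundation

// Check whether the data output can deliver 10-bit bi-planar buffers.
let videoOutput = AVCaptureVideoDataOutput()
// ... add videoOutput to a configured AVCaptureSession here ...
let supports10Bit = videoOutput.availableVideoPixelFormatTypes
    .contains(kCVPixelFormatType_420YpCbCr10BiPlanarFullRange)
print("10-bit full-range delivery supported: \(supports10Bit)")
```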

Answered 2020-10-15T06:37:01.037

If anyone is looking for more information on this, I found the following list in iOS AVFoundation. Note that it states CoreVideo does not provide support for all of these formats.

/*
CoreVideo pixel format type constants.
CoreVideo does not provide support for all of these formats; this list just defines their names.
*/

public var kCVPixelFormatType_1Monochrome: OSType { get } /* 1 bit indexed */
public var kCVPixelFormatType_2Indexed: OSType { get } /* 2 bit indexed */
public var kCVPixelFormatType_4Indexed: OSType { get } /* 4 bit indexed */
public var kCVPixelFormatType_8Indexed: OSType { get } /* 8 bit indexed */
public var kCVPixelFormatType_1IndexedGray_WhiteIsZero: OSType { get } /* 1 bit indexed gray, white is zero */
public var kCVPixelFormatType_2IndexedGray_WhiteIsZero: OSType { get } /* 2 bit indexed gray, white is zero */
public var kCVPixelFormatType_4IndexedGray_WhiteIsZero: OSType { get } /* 4 bit indexed gray, white is zero */
public var kCVPixelFormatType_8IndexedGray_WhiteIsZero: OSType { get } /* 8 bit indexed gray, white is zero */
public var kCVPixelFormatType_16BE555: OSType { get } /* 16 bit BE RGB 555 */
public var kCVPixelFormatType_16LE555: OSType { get } /* 16 bit LE RGB 555 */
public var kCVPixelFormatType_16LE5551: OSType { get } /* 16 bit LE RGB 5551 */
public var kCVPixelFormatType_16BE565: OSType { get } /* 16 bit BE RGB 565 */
public var kCVPixelFormatType_16LE565: OSType { get } /* 16 bit LE RGB 565 */
public var kCVPixelFormatType_24RGB: OSType { get } /* 24 bit RGB */
public var kCVPixelFormatType_24BGR: OSType { get } /* 24 bit BGR */
public var kCVPixelFormatType_32ARGB: OSType { get } /* 32 bit ARGB */
public var kCVPixelFormatType_32BGRA: OSType { get } /* 32 bit BGRA */
public var kCVPixelFormatType_32ABGR: OSType { get } /* 32 bit ABGR */
public var kCVPixelFormatType_32RGBA: OSType { get } /* 32 bit RGBA */
public var kCVPixelFormatType_64ARGB: OSType { get } /* 64 bit ARGB, 16-bit big-endian samples */
public var kCVPixelFormatType_64RGBALE: OSType { get } /* 64 bit RGBA, 16-bit little-endian full-range (0-65535) samples */
public var kCVPixelFormatType_48RGB: OSType { get } /* 48 bit RGB, 16-bit big-endian samples */
public var kCVPixelFormatType_32AlphaGray: OSType { get } /* 32 bit AlphaGray, 16-bit big-endian samples, black is zero */
public var kCVPixelFormatType_16Gray: OSType { get } /* 16 bit Grayscale, 16-bit big-endian samples, black is zero */
public var kCVPixelFormatType_30RGB: OSType { get } /* 30 bit RGB, 10-bit big-endian samples, 2 unused padding bits (at least significant end). */
public var kCVPixelFormatType_422YpCbCr8: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, ordered Cb Y'0 Cr Y'1 */
public var kCVPixelFormatType_4444YpCbCrA8: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, ordered Cb Y' Cr A */
public var kCVPixelFormatType_4444YpCbCrA8R: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, rendering format. full range alpha, zero biased YUV, ordered A Y' Cb Cr */
public var kCVPixelFormatType_4444AYpCbCr8: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, ordered A Y' Cb Cr, full range alpha, video range Y'CbCr. */
public var kCVPixelFormatType_4444AYpCbCr16: OSType { get } /* Component Y'CbCrA 16-bit 4:4:4:4, ordered A Y' Cb Cr, full range alpha, video range Y'CbCr, 16-bit little-endian samples. */
public var kCVPixelFormatType_444YpCbCr8: OSType { get } /* Component Y'CbCr 8-bit 4:4:4 */
public var kCVPixelFormatType_422YpCbCr16: OSType { get } /* Component Y'CbCr 10,12,14,16-bit 4:2:2 */
public var kCVPixelFormatType_422YpCbCr10: OSType { get } /* Component Y'CbCr 10-bit 4:2:2 */
public var kCVPixelFormatType_444YpCbCr10: OSType { get } /* Component Y'CbCr 10-bit 4:4:4 */
public var kCVPixelFormatType_420YpCbCr8Planar: OSType { get } /* Planar Component Y'CbCr 8-bit 4:2:0.  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrPlanar struct */
public var kCVPixelFormatType_420YpCbCr8PlanarFullRange: OSType { get } /* Planar Component Y'CbCr 8-bit 4:2:0, full range.  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrPlanar struct */
public var kCVPixelFormatType_422YpCbCr_4A_8BiPlanar: OSType { get } /* First plane: Video-range Component Y'CbCr 8-bit 4:2:2, ordered Cb Y'0 Cr Y'1; second plane: alpha 8-bit 0-255 */
public var kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_420YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:2, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:2, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_444YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:4:4, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_444YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:4:4, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8_yuvs: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, ordered Y'0 Cb Y'1 Cr */
public var kCVPixelFormatType_422YpCbCr8FullRange: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, full range, ordered Y'0 Cb Y'1 Cr */
public var kCVPixelFormatType_OneComponent8: OSType { get } /* 8 bit one component, black is zero */
public var kCVPixelFormatType_TwoComponent8: OSType { get } /* 8 bit two component, black is zero */
public var kCVPixelFormatType_30RGBLEPackedWideGamut: OSType { get } /* little-endian RGB101010, 2 MSB are zero, wide-gamut (384-895) */
public var kCVPixelFormatType_ARGB2101010LEPacked: OSType { get } /* little-endian ARGB2101010 full-range ARGB */
public var kCVPixelFormatType_OneComponent10: OSType { get } /* 10 bit little-endian one component, stored as 10 MSBs of 16 bits, black is zero */
public var kCVPixelFormatType_OneComponent12: OSType { get } /* 12 bit little-endian one component, stored as 12 MSBs of 16 bits, black is zero */
public var kCVPixelFormatType_OneComponent16: OSType { get } /* 16 bit little-endian one component, black is zero */
public var kCVPixelFormatType_TwoComponent16: OSType { get } /* 16 bit little-endian two component, black is zero */
public var kCVPixelFormatType_OneComponent16Half: OSType { get } /* 16 bit one component IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_OneComponent32Float: OSType { get } /* 32 bit one component IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_TwoComponent16Half: OSType { get } /* 16 bit two component IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_TwoComponent32Float: OSType { get } /* 32 bit two component IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_64RGBAHalf: OSType { get } /* 64 bit RGBA IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_128RGBAFloat: OSType { get } /* 128 bit RGBA IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_14Bayer_GRBG: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered G R G R... alternating with B G B G... */
public var kCVPixelFormatType_14Bayer_RGGB: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered R G R G... alternating with G B G B... */
public var kCVPixelFormatType_14Bayer_BGGR: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered B G B G... alternating with G R G R... */
public var kCVPixelFormatType_14Bayer_GBRG: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered G B G B... alternating with R G R G... */
public var kCVPixelFormatType_DisparityFloat16: OSType { get } /* IEEE754-2008 binary16 (half float), describing the normalized shift when comparing two images. Units are 1/meters: ( pixelShift / (pixelFocalLength * baselineInMeters) ) */
public var kCVPixelFormatType_DisparityFloat32: OSType { get } /* IEEE754-2008 binary32 float, describing the normalized shift when comparing two images. Units are 1/meters: ( pixelShift / (pixelFocalLength * baselineInMeters) ) */
public var kCVPixelFormatType_DepthFloat16: OSType { get } /* IEEE754-2008 binary16 (half float), describing the depth (distance to an object) in meters */
public var kCVPixelFormatType_DepthFloat32: OSType { get } /* IEEE754-2008 binary32 float, describing the depth (distance to an object) in meters */
public var kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:2:2, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_444YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:4:4, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_420YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_422YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:2:2, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_444YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:4:4, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_420YpCbCr8VideoRange_8A_TriPlanar: OSType { get } /* first and second planes as per 420YpCbCr8BiPlanarVideoRange (420v), alpha 8 bits in third plane full-range.  No CVPlanarPixelBufferInfo struct. */
public var kCVPixelFormatType_16VersatileBayer: OSType { get } /* Single plane Bayer 16-bit little-endian sensor element ("sensel") samples from full-size decoding of ProRes RAW images; Bayer pattern (sensel ordering) and other raw conversion information is described via buffer attachments */
public var kCVPixelFormatType_64RGBA_DownscaledProResRAW: OSType { get } /* Single plane 64-bit RGBA (16-bit little-endian samples) from downscaled decoding of ProRes RAW images; components--which may not be co-sited with one another--are sensel values and require raw conversion, information for which is described via buffer attachments */
Answered 2020-12-10T09:49:06.573

Forcing a BT2020 format is the right way to make sure you are shooting in Dolby Vision. Doing this requires iOS 14.1 or later. Here is a short snippet showing how I do it:

// Set up your session
session.beginConfiguration()
session.sessionPreset = .hd1280x720

// Add your camera to the session
guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else {
    fatalError("No front wide-angle camera available")
}
let cameraInput = try AVCaptureDeviceInput(device: camera)
session.addInput(cameraInput)

// Important! Commit the session configuration before configuring your camera
session.commitConfiguration()

// Configure the camera
try camera.lockForConfiguration()

// Force HDR on
camera.automaticallyAdjustsVideoHDREnabled = false
camera.isVideoHDREnabled = true

// Find the first 720p format that supports the desired color space
let desiredColorSpace = AVCaptureColorSpace.HLG_BT2020
let desiredFormat = camera.formats.first { format in
    // You could of course choose a different resolution if desired
    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
    return dimensions.width == 1280 && dimensions.height == 720
        && format.supportedColorSpaces.contains(desiredColorSpace)
}

// Set the HDR format
if let format = desiredFormat {
    camera.activeFormat = format
    camera.activeColorSpace = desiredColorSpace
} else {
    assertionFailure("Couldn't find an HDR camera format")
}

camera.unlockForConfiguration()
于 2021-06-21T21:47:20.043 回答