
I'm currently rendering a video track that is smaller than the output size. I'd like to draw a UIImage into the background so that the video sits on top and the image shows through wherever the video doesn't cover. I tried using Core Animation layers with videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:inLayer:, but layers placed below the video layer don't seem to show up (layers above it display fine); I just get black, or whatever background color I set on the AVMutableVideoCompositionInstruction object. I also tried setting the background color to [UIColor clearColor].CGColor, but it still renders as black.

Has anyone done something similar and can offer a suggestion?

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
CALayer *backgroundLayer = [CALayer layer];
backgroundLayer.frame = rect;
parentLayer.frame = rect;
videoLayer.frame = rect;
videoLayer.backgroundColor = [UIColor clearColor].CGColor;
backgroundLayer.backgroundColor = [UIColor purpleColor].CGColor;
[parentLayer addSublayer:backgroundLayer];
[parentLayer addSublayer:videoLayer];

mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

1 Answer


After trying several things I finally found an approach that works. It requires a blank video/audio track. The background image is overlaid onto that blank video, and the result is exported. The original asset (the video) and the exported asset (the background) are then combined and exported again to produce the final video. Hope this helps.

Adding the overlay

- (void)addOverlayImage:(UIImage *)overlayImage ToVideo:(AVMutableVideoComposition *)composition inSize:(CGSize)size {
    // 1 - set up the overlay
    CALayer *overlayLayer = [CALayer layer];

    [overlayLayer setContents:(id)[overlayImage CGImage]];
    overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [overlayLayer setMasksToBounds:YES];

    // 2 - set up the parent layer
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
    videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];

    // 3 - apply magic
    composition.animationTool = [AVVideoCompositionCoreAnimationTool
                                 videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}

- (void)getBackgroundVideoAssetWithcompletion:(void (^)(AVAsset *bgAsset))completionBlock {

    NSString *path = [[NSBundle mainBundle] pathForResource:@"blank_video" ofType:@"mp4"];
    NSURL *trackUrl = [NSURL fileURLWithPath:path];
    AVAsset *asset = [AVAsset assetWithURL:trackUrl];
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // copy the blank clip's video track into a new composition
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:range ofTrack:track atTime:kCMTimeZero error:nil];


    // derive the display size of the track, taking its preferred transform into account
    CGAffineTransform videoTransform = track.preferredTransform;
    CGSize naturalSize = CGSizeApplyAffineTransform(track.naturalSize, videoTransform);
    naturalSize = CGSizeMake(fabs(naturalSize.width), fabs(naturalSize.height));


    // build a video composition for the blank clip and overlay the background image on it
    AVMutableVideoComposition *composition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
    UIImage *img = [self imageWithImage:[UIImage imageNamed:@"white_image"] convertToSize:naturalSize];
    [self addOverlayImage:img ToVideo:composition inSize:naturalSize];


    // a single pass-through instruction covering the whole clip
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = range;
    composition.instructions = @[instruction];


    // export the blank composition with the overlay applied
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.videoComposition = composition;



    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"exported-%d.mov", arc4random() % 100000]];
    unlink([exportPath UTF8String]);
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{

        switch (_assetExport.status) {

            case AVAssetExportSessionStatusFailed:
                // the export failed; check _assetExport.error for the reason
                break;

            case AVAssetExportSessionStatusExporting:
                break;

            case AVAssetExportSessionStatusCompleted:{

                dispatch_async(dispatch_get_main_queue(), ^{
                    NSLog(@"Successful!!!");
                    AVAsset *finalAsset = [AVAsset assetWithURL:_assetExport.outputURL];
                    completionBlock(finalAsset);
                });
            }
                break;

            default:
                break;
        }
    }];
}
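
The helper imageWithImage:convertToSize: called above isn't shown in the answer. As a rough sketch, under the assumption that it simply scales a UIImage to the requested size, it could look like this:

// Hypothetical helper assumed by the code above: scales an image to a target size.
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
    // Draw the source image into a bitmap context of the target size.
    UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}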

You now have a video asset with the overlay image. The only thing left to do is combine the original video asset with the exported one; the exported asset should be the bottom layer and the original asset the top layer.
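
That final merge isn't shown in the answer. Assuming both assets have a single video track, should start at time zero, and share the same render size, a minimal sketch of such a combining step might look like the following (combineOriginalAsset:withBackgroundAsset:videoComposition: is a hypothetical helper, not part of the original answer):

// Sketch only: layers the original asset on top of the exported background asset.
- (AVMutableComposition *)combineOriginalAsset:(AVAsset *)originalAsset
                           withBackgroundAsset:(AVAsset *)backgroundAsset
                              videoComposition:(AVMutableVideoComposition **)outVideoComposition {
    AVMutableComposition *mix = [AVMutableComposition composition];

    AVAssetTrack *bgTrack = [[backgroundAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVAssetTrack *fgTrack = [[originalAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    AVMutableCompositionTrack *bgCompTrack = [mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *fgCompTrack = [mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    [bgCompTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backgroundAsset.duration) ofTrack:bgTrack atTime:kCMTimeZero error:nil];
    [fgCompTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, originalAsset.duration) ofTrack:fgTrack atTime:kCMTimeZero error:nil];

    // The first layer instruction in the array is rendered on top,
    // so the original (foreground) track comes first.
    AVMutableVideoCompositionLayerInstruction *fgLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:fgCompTrack];
    AVMutableVideoCompositionLayerInstruction *bgLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:bgCompTrack];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, backgroundAsset.duration);
    instruction.layerInstructions = @[fgLayer, bgLayer];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = @[instruction];
    videoComposition.renderSize = bgTrack.naturalSize;
    videoComposition.frameDuration = CMTimeMake(1, 30);
    *outVideoComposition = videoComposition;

    return mix;
}

The resulting composition and video composition can then be handed to an AVAssetExportSession in the same way as in getBackgroundVideoAssetWithcompletion: above.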

Answered 2017-12-17T07:46:49.620