
AVFoundation lets you "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example.

In theory I should be able to stack 2 videos on top of each other, with alpha, and see both at the same time.

Either I'm doing something wrong, or there's a bug somewhere, because the following test code, although a little messy, should clearly show 2 videos, yet I only see one, as shown here: http://lockerz.com/s/172403384 -- the "blue" square is IMG_1388.m4v.

For whatever reason, IMG_1383.MOV is never shown.

NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1388.m4v"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];

AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1383.MOV"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition,  nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(480, 360);


// Export
NSURL *outputURL = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/export.MOV"];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[[composition copy] autorelease] presetName:AVAssetExportPresetHighestQuality];
[exportSession setOutputFileType:@"com.apple.quicktime-movie"];
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:nil];
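// NB: passing nil here discards the outcome; a completion handler that checks
// exportSession.status and exportSession.error would surface any export failure.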

// Player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
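// Note: for anything to be visible, the layer still has to be added to a view
// hierarchy and playback started, e.g. (assuming this runs in a view controller):
// playerLayer.frame = self.view.bounds;
// [self.view.layer addSublayer:playerLayer];
// [player play];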

Do you see anything wrong?

The "goal" of this code is to "record" the camera input (video 1) and the OpenGL output (video 2). I also tried to "compose" them "directly" using buffers and the like, but I had no success there either :( It turns out AVFoundation isn't as simple as I thought.
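For what it's worth, the "direct" buffer route I mean looks roughly like this: a minimal sketch with AVAssetWriter, where blendedBuffer and frameTime are placeholders for a frame I would have to merge myself from the camera and OpenGL outputs (outputURL as above):

NSError *writerError = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&writerError];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                          AVVideoCodecH264, AVVideoCodecKey,
                          [NSNumber numberWithInt:480], AVVideoWidthKey,
                          [NSNumber numberWithInt:360], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                     sourcePixelBufferAttributes:nil];
[writer addInput:writerInput];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
// ...then per frame: blend the camera buffer and the OpenGL readback into one
// CVPixelBufferRef yourself (there is no built-in blend here), and append it:
[adaptor appendPixelBuffer:blendedBuffer withPresentationTime:frameTime];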


2 Answers


Looks fine, except for this part:

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];

You need to build the layer instructions with videoTrack and videoTrack2 (the tracks added to the composition), not with the original asset tracks videoAssetTrack and videoAssetTrack2.

Also, adding a transform to rotate the video is a bit tricky (like anything beyond the basics in AVFoundation), so I've simply commented that line out to get it playing the 2 videos; a sketch of one way to reinstate the scale follows the code below.

Here is your modified code:

NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [[NSBundle mainBundle] URLForResource:@"b" withExtension:@"mov"];        
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];

AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
//[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"mov"];        
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition,  nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = composition.naturalSize; // CGSizeMake(480, 360);
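
If you want the half-size effect back without fighting the rotation, one thing to try is a plain scale set on the same instruction; a sketch (whether preferredTransform still needs to be folded in depends on how the source clip was recorded):

CGAffineTransform half = CGAffineTransformMakeScale(.5, .5);  // scale only, no rotation folded in
[to setTransform:half atTime:kCMTimeZero];                    // 'to' now wraps videoTrack2, the composition track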
answered 2012-01-10T00:04:34.547

I think you've misunderstood something.

A video file can contain several data streams. For example, a video with sound has 2 streams, an audio stream and a video stream. Another example is a surround-sound video file, which may contain 5 or more audio streams plus 1 video stream.

As with audio, most video container formats (mov, mp4, etc.) allow multiple video streams in 1 file, but in practice this doesn't mean the streams bear any relation to one another; they are merely stored in the same container. If you open such a file in QuickTime, for instance, you get as many windows as the file has video streams.
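You can see this by enumerating an asset's tracks; a quick sketch, where someFileURL is a placeholder:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someFileURL options:nil];
for (AVAssetTrack *track in asset.tracks) {
    // prints one line per stream stored in the container, e.g. "vide", "soun"
    NSLog(@"track %d: %@", track.trackID, track.mediaType);
}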

In any case, video streams don't get "mixed" this way. What you're trying to achieve falls under signal processing of video streams, and I really recommend reading more about it.

If you don't actually need to "mix" the video data into one file, you might want to display the two video files on top of each other using MPMediaPlayers. Keep in mind that processing video data is usually CPU-intensive, and you may (sometimes) not be able to pull it off on today's iOS devices.
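For instance, a sketch of the overlay idea with two players (using AVPlayerLayer, since that's what the question already uses; assumes this runs inside a view controller, with urlVideo and urlVideo2 being the URLs from the question):

AVPlayer *playerA = [AVPlayer playerWithURL:urlVideo];   // bottom video
AVPlayer *playerB = [AVPlayer playerWithURL:urlVideo2];  // top video
AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = self.view.bounds;
layerB.frame = self.view.bounds;
layerB.opacity = .5;  // let the bottom video show through
[self.view.layer addSublayer:layerA];
[self.view.layer addSublayer:layerB];
[playerA play];
[playerB play];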

answered 2012-01-07T16:47:32.290