
Is it possible to create an AVAsset-like object from two URLs, one for the audio track and the other for the video track?

I have already tried AVMutableComposition, but it seems to load the entire content first and buffer it somewhere before video + audio playback starts. The documentation for AVComposition says it can combine file-based assets, but I need a way to combine URL-based assets.

Or is there an option I can set on AVComposition so that playback starts before the whole content has been loaded?

Edit

This is how I tried it:

// Don't require precise duration/timing, so the assets can be loaded lazily.
NSDictionary *urlAssetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @NO};

AVMutableComposition *composition = [AVMutableComposition composition];

// Audio track from a remote URL.
NSURL *audioUrl = [NSURL URLWithString:@"http://..."];
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioUrl options:urlAssetOptions];

AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];

// Video track from a remote URL.
NSURL *videoUrl = [NSURL URLWithString:@"http://..."];
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:urlAssetOptions];

AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];

1 Answer


The approach you are using does not require loading the entire content in order to start building the mutable composition, nor in order to start playing the composition you created. It does, however, need to load a portion of each media file to determine its duration and tracks.
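As a side note, reading duration and tracks synchronously on a remote asset blocks the calling thread until that metadata has been fetched. A minimal sketch of loading those keys asynchronously first, assuming audioAsset and videoAsset are the AVAssets created from the remote URLs as in the code further down; the buildCompositionWithAudioAsset:videoAsset: helper is hypothetical and stands in for the composition/player code below:

// Load the keys the composition code needs without blocking the main thread.
NSArray<NSString *> *keys = @[@"duration", @"tracks"];
[audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        NSError *error = nil;
        if ([audioAsset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded &&
            [videoAsset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                // Hypothetical helper: runs the composition/player code shown below.
                [self buildCompositionWithAudioAsset:audioAsset videoAsset:videoAsset];
            });
        } else {
            NSLog(@"Failed to load asset metadata: %@", error);
        }
    }];
}];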

Below is working code that uses URLs to an mp3 and an mp4 file found via Google to create the mutable composition and hand it to an AVPlayerViewController. If you run it you will notice that playback starts very quickly, but if you seek around the video timeline you will see that loading data for the requested time takes a long time.

// Requires AVFoundation; AVPlayerViewController additionally requires AVKit.

// Remote audio (mp3) and video (mp4) assets.
NSURL *audioURL = [NSURL URLWithString:@"http://www.mfiles.co.uk/mp3-downloads/Toccata-and-Fugue-Dm.mp3"];
AVAsset *audioAsset = [AVAsset assetWithURL:audioURL];

NSURL *videoURL = [NSURL URLWithString:@"http://thv1.uloz.to/6/c/4/6c4b50308843dd29c9176cc2c4961155.360.mp4?fileId=20389770"];
AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];

// Use the shorter of the two durations so both tracks cover the whole composition.
CMTime duration;
if (CMTimeGetSeconds(audioAsset.duration) < CMTimeGetSeconds(videoAsset.duration)) {
    duration = audioAsset.duration;
} else {
    duration = videoAsset.duration;
}

NSError *error;

AVMutableComposition *mixAsset = [[AVMutableComposition alloc] init];

// Audio track of the composition, filled from the mp3's first audio track.
AVMutableCompositionTrack *audioTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];

// Video track of the composition, filled from the mp4's first video track.
AVMutableCompositionTrack *videoTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

// Wrap the composition in a player item and present it with AVPlayerViewController.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mixAsset];

AVPlayerViewController *playerController = [AVPlayerViewController new];
playerController.player = [AVPlayer playerWithPlayerItem:playerItem];

[self presentViewController:playerController animated:YES completion:nil];
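If you want to know exactly when the composition has buffered enough to begin, here is a minimal sketch of watching the player item's status via key-value observing, assuming the presenting view controller keeps a reference to the item and removes the observer when it is done:

// Observe the item's status so we know when playback can actually begin.
[playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

// Elsewhere in the same class:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            NSLog(@"Item is ready to play");
        } else if (item.status == AVPlayerItemStatusFailed) {
            NSLog(@"Item failed: %@", item.error);
        }
    }
}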
answered 2016-10-20T19:41:10.753