Problem:
On every playback, the audio lags behind the video by roughly 1 to 2 seconds.
Setup:
The assets are loaded as AVURLAssets from a media stream.
To build the composition I use AVMutableComposition and AVMutableCompositionTrack objects with asymmetric timescales. Both the audio and the video are streamed to the device. The audio track's timescale is 44100; the video track's timescale is 600.
Playback is done with AVPlayer.
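For reference, the timescale values above were read off the source tracks. A minimal diagnostic sketch along these lines (audioAsset and videoAsset stand for the streamed AVURLAssets) prints each track's timescale and time range in seconds:

// Diagnostic sketch (assumed variable names): compare the 44100 and 600
// timescales on a common footing by converting the time ranges to seconds.
AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

NSLog(@"audio timescale %d, start %.3f s, duration %.3f s",
      audioTrack.naturalTimeScale,
      CMTimeGetSeconds(audioTrack.timeRange.start),
      CMTimeGetSeconds(audioTrack.timeRange.duration));
NSLog(@"video timescale %d, start %.3f s, duration %.3f s",
      videoTrack.naturalTimeScale,
      CMTimeGetSeconds(videoTrack.timeRange.start),
      CMTimeGetSeconds(videoTrack.timeRange.duration));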
Attempted solutions:
- Using videoAssetTrack.timeRange for [composition insertTimeRange:] (see the sketch after this list)
- Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration)
- Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration)
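Each of these ranges was fed into the same insertion call on the composition track. Roughly, as a sketch with assumed names (compositionTrack being the AVMutableCompositionTrack, videoAssetTrack the source video track):

// The attempts above only vary the time range passed as the first argument.
NSError *error = nil;
BOOL inserted = [compositionTrack insertTimeRange:videoAssetTrack.timeRange
                                           ofTrack:videoAssetTrack
                                            atTime:kCMTimeZero
                                             error:&error];
if (!inserted) {
    NSLog(@"insertTimeRange failed: %@", error);
}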
Code:
+ (AVMutableComposition *)overlayAudio:(AVURLAsset *)audioAsset
                             withVideo:(AVURLAsset *)videoAsset
{
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    // First audio track and first video track of the streamed source assets.
    AVAssetTrack *audioTrack = [self getTrackFromAsset:audioAsset withMediaType:AVMediaTypeAudio];
    AVAssetTrack *videoTrack = [self getTrackFromAsset:videoAsset withMediaType:AVMediaTypeVideo];

    // The video track's duration is used for both insertions.
    CMTime duration = videoTrack.timeRange.duration;

    AVMutableCompositionTrack *audioComposition = [self composeTrack:audioTrack
                                                      withComposition:mixComposition
                                                          andDuration:duration
                                                             andMedia:AVMediaTypeAudio];
    AVMutableCompositionTrack *videoComposition = [self composeTrack:videoTrack
                                                      withComposition:mixComposition
                                                          andDuration:duration
                                                             andMedia:AVMediaTypeVideo];

    [self makeAssertionAgainstAudio:audioComposition andVideo:videoComposition];
    return mixComposition;
}

+ (AVAssetTrack *)getTrackFromAsset:(AVURLAsset *)asset withMediaType:(NSString *)mediaType
{
    return [[asset tracksWithMediaType:mediaType] objectAtIndex:0];
}

+ (AVAssetExportSession *)configureExportSessionWithAsset:(AVMutableComposition *)composition toUrl:(NSURL *)url
{
    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie; // "com.apple.quicktime-movie"
    exportSession.outputURL = url;
    exportSession.shouldOptimizeForNetworkUse = YES;
    return exportSession;
}

- (IBAction)playVideo
{
    [avPlayer pause];

    // Rebuild the player stack from the composition and restart playback from zero.
    avPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
    avPlayer = [[AVPlayer alloc] initWithPlayerItem:avPlayerItem];

    avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    [avPlayerLayer setFrame:CGRectMake(0, 0, 305, 283)];
    [avPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [playerView.layer addSublayer:avPlayerLayer];

    [avPlayer seekToTime:kCMTimeZero];
    [avPlayer play];
}
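For completeness, the export session returned by configureExportSessionWithAsset:toUrl: is started roughly as follows. This is a sketch of the call site, not the exact code; outputUrl is an assumed variable and Composer stands in for whatever class declares these methods:

// Sketch (assumed names): kicking off the export session built above
// and checking its status when it completes.
AVAssetExportSession *session = [Composer configureExportSessionWithAsset:mixComposition toUrl:outputUrl];
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished");
    } else {
        NSLog(@"Export failed: %@", session.error);
    }
}];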
Notes:
I am not very familiar with the AVFoundation framework, so it is entirely possible that I am simply misusing the snippets I have pieced together. (For example, why is insertTimeRange used for compositions at all?)
I can provide any additional information needed to work this out, including debug values of the asset track properties, network telemetry, stream information, and so on.