I want to play a video (from a local file or a remote URL) together with its audio track, and retrieve the pixel buffer of every frame of the video so I can draw it into an OpenGL texture.
Here is the code I use on iOS 6 (it works fine):
Start the video
- (void) readMovie:(NSURL *)url {
    NSLog(@"Playing video %@", url);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:
     ^{
         dispatch_async(dispatch_get_main_queue(),
         ^{
             NSError *error = nil;
             AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
             if (status == AVKeyValueStatusLoaded) {
                 // Ask the output for BGRA pixel buffers so they can be uploaded to GL directly
                 NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
                 AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
                 AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                 [playerItem addOutput:output];
                 AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                 [self setPlayer:player];
                 [self setPlayerItem:playerItem];
                 [self setOutput:output];
                 [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(bufferingVideo:) name:AVPlayerItemPlaybackStalledNotification object:nil];
                 [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoEnded:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
                 [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoFailed:) name:AVPlayerItemFailedToPlayToEndTimeNotification object:nil];
                 [[self player] addObserver:self forKeyPath:@"rate" options:0 context:NULL];
                 [[self player] addObserver:self forKeyPath:@"status" options:0 context:NULL];
                 [player play];
             } else {
                 NSLog(@"%@ Failed to load the tracks.", self);
             }
         });
     }];
}
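(The observeValueForKeyPath: handler for the "rate" and "status" observers added above is not shown; mine looks roughly like this, just logging state changes and failures:)

- (void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == [self player] && [keyPath isEqualToString:@"status"]) {
        // Report a playback failure through the player's error property
        if ([self player].status == AVPlayerStatusFailed) {
            NSLog(@"Player failed : %@", [self player].error);
        }
    } else if (object == [self player] && [keyPath isEqualToString:@"rate"]) {
        NSLog(@"Playback rate changed to %f", [self player].rate);
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}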
Read the video buffer (in an update function called every frame)
- (void) readNextMovieFrame {
    CMTime outputItemTime = [[self playerItem] currentTime];
    float interval = [self maxTimeLoaded];
    CMTime t = [[self playerItem] currentTime];
    CMTime d = [[self playerItem] duration];
    NSLog(@"Video : %f/%f (loaded : %f) - speed : %f", (float)t.value / (float)t.timescale, (float)d.value / (float)d.timescale, interval, [self player].rate);
    [videoBar updateProgress:(interval / CMTimeGetSeconds(d))];
    [videoBar updateSlider:(CMTimeGetSeconds(t) / CMTimeGetSeconds(d))];

    if ([[self output] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:nil];

        // Lock the image buffer before touching its memory
        CVPixelBufferLockBaseAddress(buffer, 0);

        // Get the raw pixel data and its dimensions
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(buffer);
        size_t width = CVPixelBufferGetWidth(buffer);
        size_t height = CVPixelBufferGetHeight(buffer);

        // Fill the texture (this assumes the buffer has no row padding, i.e. bytes-per-row == width * 4)
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, baseAddress);

        // Unlock and release the buffer (copyPixelBufferForItemTime returns a retained buffer)
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        CVBufferRelease(buffer);
    }
}
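(For context, readNextMovieFrame is called once per display refresh; in my case it is driven by something like the following, though a GLKViewController update method would work just as well:)

- (void) startFrameUpdates {
    // Fire update: once per screen refresh
    CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void) update:(CADisplayLink *)displayLink {
    [self readNextMovieFrame];
    // ... then render the GL scene with the updated texture ...
}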
So this code works fine on iOS 6, and I would like it to work on iOS 5 as well, but AVPlayerItemVideoOutput is not part of iOS 5. I can still play the video, but I don't know how to retrieve the pixel buffer of each frame.
Do you know what I could use instead of AVPlayerItemVideoOutput to retrieve the pixel buffer of every frame of the video? (It has to work for both local and remote videos, and I also want to play the audio track.)
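The closest thing I have found so far is AVAssetReader (available since iOS 4.1), but as far as I can tell it only reads local files, it decodes frames as fast as you pull them rather than in sync with an AVPlayer, and it does not play the audio track, so it only covers part of my use case. For reference, this is roughly what I tried (assetReader and trackOutput are properties of my own class):

- (void) startReadingAsset:(AVURLAsset *)asset {
    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (reader == nil || videoTracks.count == 0) {
        NSLog(@"Could not create a reader : %@", error);
        return;
    }
    // Same BGRA settings as for AVPlayerItemVideoOutput above
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
    AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[videoTracks objectAtIndex:0] outputSettings:settings];
    [reader addOutput:trackOutput];
    [reader startReading];
    [self setAssetReader:reader];
    [self setTrackOutput:trackOutput];
}

- (void) readNextAssetFrame {
    // Pulls the next decoded frame; returns NULL at the end of the file or on failure
    CMSampleBufferRef sampleBuffer = [[self trackOutput] copyNextSampleBuffer];
    if (sampleBuffer == NULL) {
        return;
    }
    CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... same lock / glTexImage2D / unlock code as in readNextMovieFrame, using this buffer ...
    CFRelease(sampleBuffer);
}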
Thanks a lot for your help!