
The problem I'm having is that when I compile my app with the iOS 9 SDK, I occasionally get a null value when the app tries to obtain a CVPixelBufferRef from an AVPlayerItemVideoOutput via -copyPixelBufferForItemTime:itemTimeForDisplay:, even after the video has loaded and all the instances have been created.

With iOS 8 my app works fine, but with iOS 9 the problem appears. Even the version of the app available for download in the App Store, compiled with the iOS 8 SDK, shows the same problem when installed on iOS 9.

When the problem occurs and copyPixelBufferForItemTime: gives me null, if I press the home button so the app goes to the background, and then open the app again so it becomes active, the AVPlayerItemVideoOutput instance that was returning a null CVPixelBufferRef starts working normally and the problem is resolved.

Here is a YouTube video in which I reproduce the problem:

https://www.youtube.com/watch?v=997zG08_DMM&feature=youtu.be

Here is the sample code that creates all the relevant instances:

NSURL *url = [[NSURL alloc] initFileURLWithPath:[_mainVideo objectForKey:@"file"]];

NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
_myVideoOutputQueue = dispatch_queue_create("myVideoOutputQueue", DISPATCH_QUEUE_SERIAL);
[_videoOutput setDelegate:self queue:_myVideoOutputQueue];

_player = [[AVPlayer alloc] init];


// Do not take mute button into account
NSError *error = nil;
BOOL success = [[AVAudioSession sharedInstance]
                setCategory:AVAudioSessionCategoryPlayback
                error:&error];
if (!success) {
   // NSLog(@"Could not use AVAudioSessionCategoryPlayback", nil);
}

asset = [AVURLAsset URLAssetWithURL:url options:nil];


if(![[NSFileManager defaultManager] fileExistsAtPath:[[asset URL] path]]) {
   // NSLog(@"file does not exist");
}

NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];

[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:^{

    dispatch_async( dispatch_get_main_queue(),
                   ^{
                       /* Make sure that the value of each key has loaded successfully. */
                       for (NSString *thisKey in requestedKeys)
                       {
                           NSError *error = nil;
                           AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
                           if (keyStatus == AVKeyValueStatusFailed)
                           {
                               [self assetFailedToPrepareForPlayback:error];
                               return;
                           }
                       }

                       NSError* error = nil;
                       AVKeyValueStatus status = [asset statusOfValueForKey:kTracksKey error:&error];
                       if (status == AVKeyValueStatusLoaded)
                       {
                           //_playerItem = [AVPlayerItem playerItemWithAsset:asset];


                           [_playerItem addOutput:_videoOutput];
                           [_player replaceCurrentItemWithPlayerItem:_playerItem];
                           [_videoOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:ONE_FRAME_DURATION];

                           /* When the player item has played to its end time we'll toggle
                            the movie controller Pause button to be the Play button */
                           [[NSNotificationCenter defaultCenter] addObserver:self
                                                                    selector:@selector(playerItemDidReachEnd:)
                                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                                      object:_playerItem];

                           seekToZeroBeforePlay = NO;

                           [_playerItem addObserver:self
                                         forKeyPath:kStatusKey
                                            options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                            context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kCurrentItemKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kRateKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerRateObservationContext];


                           [self initScrubberTimer];

                           [self syncScrubber];


                       }
                       else
                       {
                         //  NSLog(@"%@ Failed to load the tracks.", self);
                       }
                   });
}];

And here is the sample code that gives me the null pixel buffer:

CVPixelBufferRef pixelBuffer =
    [_videoOutput copyPixelBufferForItemTime:[_playerItem currentTime]
                          itemTimeForDisplay:nil];

NSLog(@"the pixel buffer is %@", pixelBuffer);
NSLog(@"the _videoOutput is %@", _videoOutput.description);
CMTime dataTime = [_playerItem currentTime];
//NSLog(@"the current time is %f", CMTimeGetSeconds(dataTime));
return pixelBuffer;

4 Answers


I had the same problem and found the answer in this thread: https://forums.developer.apple.com/thread/27589#128476

You have to wait until the video is ready to play before adding the output; otherwise it will fail and return nil. My Swift code looks like this:

func retrievePixelBufferToDraw() -> CVPixelBuffer? {
  guard let videoItem = player.currentItem else { return nil }
  if videoOutput == nil || self.videoItem !== videoItem {
    videoItem.outputs.flatMap({ return $0 as? AVPlayerItemVideoOutput }).forEach {
      videoItem.remove($0)
    }
    if videoItem.status != AVPlayerItemStatus.readyToPlay {
      // see https://forums.developer.apple.com/thread/27589#128476
      return nil
    }

    let pixelBuffAttributes = [
      kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
      ] as [String: Any]

    let videoOutput = AVPlayerItemVideoOutput.init(pixelBufferAttributes: pixelBuffAttributes)
    videoItem.add(videoOutput)
    self.videoOutput = videoOutput
    self.videoItem = videoItem
  }
  guard let videoOutput = videoOutput else { return nil }

  let time = videoItem.currentTime()
  if !videoOutput.hasNewPixelBuffer(forItemTime: time) { return nil }
  return videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
}
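For context, here is a sketch of how such a polling method is typically driven. The CADisplayLink wiring and the draw(_:) helper are my assumptions, not part of the answer; retrievePixelBufferToDraw() is the method defined above.

```swift
// Sketch only: poll once per display frame; the method returns nil
// until the item is ready to play and a new buffer is available.
func startRendering() {
  let link = CADisplayLink(target: self, selector: #selector(render))
  link.add(to: .main, forMode: .commonModes)
}

@objc func render(_ link: CADisplayLink) {
  guard let buffer = retrievePixelBufferToDraw() else { return }
  draw(buffer) // hypothetical helper: hand the CVPixelBuffer to your renderer
}
```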
Answered 2017-02-17T04:38:20.060

While this is not a real solution to the problem (judging by your comments, it appears to be a bug in AVFoundation), I found a better workaround than waiting one second: recreate the AVPlayer whenever it fails to deliver a pixel buffer. Depending on what your actual restart routine does, this can be noticeably faster and less irritating to the user. It also introduces a (slight) delay only when the AVPlayer actually misbehaves, rather than every time you start it.

However, after the player has finished playing, the AVPlayerItemVideoOutput will not deliver any more pixel buffers. You should therefore guard against this case by remembering whether you have already received any pixel buffers; otherwise your player will loop involuntarily.

In the class interface:

@property (nonatomic) BOOL videoOutputHadPixelBuffer;

Then, before you try to copy a pixel buffer:

if (![self.videoOutput hasNewPixelBufferForItemTime:self.player.currentTime] && !self.videoOutputHadPixelBuffer)
{
    [self restartPlayer]; // call your custom restart routine where you create a new AVPlayer object
}

self.videoOutputHadPixelBuffer = YES; // guard against missing pixel buffers after playback finished
Answered 2015-11-27T17:30:11.797

I ran into a similar problem today and found that it only occurred on 64-bit devices running iOS 9.0 or later when the project was not built for the arm64 architecture.

Changing the build settings to build for the arm64 architecture fixed the problem for me.
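As a sketch, the corresponding entries in an .xcconfig file (or the target's Build Settings) would look something like the following; the exact values are the standard Xcode-era settings, not taken from the answer:

```xcconfig
// Build for all standard architectures, which includes arm64 on 64-bit devices
ARCHS = $(ARCHS_STANDARD)
// Make sure arm64 is not excluded from the valid architectures (older Xcode)
VALID_ARCHS = armv7 armv7s arm64
```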

Answered 2015-10-27T09:21:52.533

*Since you said it worked for you too, I decided to post this as an answer rather than a comment, to give it as much visibility as possible.

Answer: I am still looking for a more elegant approach. I found that how long the AVPlayerItemVideoOutput allocation takes depends on the format settings you pass to it, and that time is not fixed. A forced wait between allocating the output and loading/playing fixed it for me. Also, I create only one AVPlayerItemVideoOutput and reuse it, so I only need the delay once.
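A minimal sketch of that forced-wait workaround (the one-second delay and the property names here are my assumptions, not the answerer's exact code): allocate the output once up front, then attach it to the player item only after a short delay.

```objectivec
// Allocate the video output once, early, and reuse it for every item.
NSDictionary *attrs = @{(id)kCVPixelBufferPixelFormatTypeKey:
                            @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];

// Forced wait: give the allocation time to settle before the output is used,
// then attach it to the player item and start playback.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [_playerItem addOutput:_videoOutput];
    [_player play];
});
```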

Also:

Below is a small example using hasNewPixelBufferForItemTime, taken from a Unity plugin I made that simply uploads the contents of the pixel buffer to a texture.

if (g_TexturePointer)
{
    if ([plug.playerOutput hasNewPixelBufferForItemTime:[plug.player currentTime]])
    {
        pbuffer = [plug.playerOutput copyPixelBufferForItemTime:plug.player.currentItem.currentTime
                                             itemTimeForDisplay:nil];
    }
    // ... (no need to show the rest)

Happy coding!

Answered 2015-11-13T11:41:35.813