I have implemented screen recording with RPScreenRecorder, which records the screen along with microphone audio. After finishing several recordings I stop the capture, merge each recording's audio and video with an AVMutableComposition, and then concatenate all of the videos into a single video.
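The merge step itself is roughly like the sketch below (simplified, not my exact code: the method name and the three URL parameters are placeholders, and error/status handling is omitted):

// Illustrative helper only: merges one recording's video file and its
// mic-audio file into a single movie at outputURL via AVMutableComposition.
- (void)mergeVideoAtURL:(NSURL *)videoURL audioAtURL:(NSURL *)audioURL toURL:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject
                         atTime:kCMTimeZero
                          error:&error];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                        ofTrack:[audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject
                         atTime:kCMTimeZero
                          error:&error];

    AVAssetExportSession *exporter =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        // Check exporter.status / exporter.error before using the merged file.
    }];
}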
For screen recording and for getting the video and audio files, I am using
- (void)startCaptureWithHandler:(nullable void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error))captureHandler completionHandler:(nullable void (^)(NSError * _Nullable error))completionHandler;
and to stop recording I call:
- (void)stopCaptureWithHandler:(void (^)(NSError *error))handler;
Both of these are straightforward.
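Roughly, the start/stop calls are used like this (minimal sketch; the real capture handler is shown further down, and error handling is omitted):

RPScreenRecorder *recorder = RPScreenRecorder.sharedRecorder;
recorder.microphoneEnabled = YES;   // mic audio is recorded as well

[recorder startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError *error) {
    // video / app-audio / mic-audio sample buffers arrive here
} completionHandler:^(NSError *error) {
    // error != nil means the capture could not be started
}];

// later, when the user finishes a recording:
[recorder stopCaptureWithHandler:^(NSError *error) {
    // finish the asset writers and merge the files here
}];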
Most of the time this works fine and I receive both video and audio CMSampleBuffers. But sometimes startCaptureWithHandler only delivers audio buffers to me and no video buffers at all.
Once I run into this issue it does not go away until I restart the device and reinstall the app, which makes my app very unreliable for users. I think this is a ReplayKit problem, but I have not been able to find related issues from other developers. Please let me know if any of you has faced this issue and found a solution.
I have checked my configuration multiple times and could not find anything wrong with it, but here it is anyway:
// Asset writers: one for the video file (QuickTime movie), one for the mic audio file (M4A)
NSError *videoWriterError;
videoWriter = [[AVAssetWriter alloc] initWithURL:fileString
                                        fileType:AVFileTypeQuickTimeMovie
                                           error:&videoWriterError];

NSError *audioWriterError;
audioWriter = [[AVAssetWriter alloc] initWithURL:audioFileString
                                        fileType:AVFileTypeAppleM4A
                                           error:&audioWriterError];
CGFloat width = UIScreen.mainScreen.bounds.size.width;
NSString *widthString = [NSString stringWithFormat:@"%f", width];
CGFloat height = UIScreen.mainScreen.bounds.size.height;
NSString *heightString = [NSString stringWithFormat:@"%f", height];

NSDictionary *videoOutputSettings = @{AVVideoCodecKey  : AVVideoCodecTypeH264,
                                      AVVideoWidthKey  : widthString,
                                      AVVideoHeightKey : heightString};
videoInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoOutputSettings];
videoInput.expectsMediaDataInRealTime = true;
// Microphone audio settings: mono, 16-bit, 44.1 kHz Apple Lossless
AudioChannelLayout acl;
bzero(&acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                                     [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                                     [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                     [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                     [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                                     nil];
audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
[audioInput setExpectsMediaDataInRealTime:YES];
[videoWriter addInput:videoInput];
[audioWriter addInput:audioInput];
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:nil];
[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {
    // capture handler, shown in full below
} completionHandler:^(NSError * _Nullable error) {
    // handle a failed start here
}];
The capture handler passed to startCaptureWithHandler is also very simple:
[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {
    dispatch_sync(dispatch_get_main_queue(), ^{
        if (CMSampleBufferDataIsReady(sampleBuffer))
        {
            // Start both writers on the first buffer that arrives
            if (self->videoWriter.status == AVAssetWriterStatusUnknown)
            {
                self->writingStarted = true;
                [self->videoWriter startWriting];
                [self->videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                [self->audioWriter startWriting];
                [self->audioWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            }
            if (self->videoWriter.status == AVAssetWriterStatusFailed) {
                return;
            }
            if (bufferType == RPSampleBufferTypeVideo)
            {
                if (self->videoInput.isReadyForMoreMediaData)
                {
                    [self->videoInput appendSampleBuffer:sampleBuffer];
                }
            }
            else if (bufferType == RPSampleBufferTypeAudioMic)
            {
                // printf("\n+++ bufferAudio received %d \n", arc4random_uniform(100));
                if (self->writingStarted)
                {
                    if (self->audioInput.isReadyForMoreMediaData)
                    {
                        [self->audioInput appendSampleBuffer:sampleBuffer];
                    }
                }
            }
        }
    });
} completionHandler:^(NSError * _Nullable error) {
    // handle a failed start here
}];
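The stop path is not shown above; roughly, it just finishes both writers inside stopCaptureWithHandler: before the merge starts (simplified sketch, not my exact code, error handling omitted):

[RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
    [self->videoInput markAsFinished];
    [self->audioInput markAsFinished];
    [self->videoWriter finishWritingWithCompletionHandler:^{
        [self->audioWriter finishWritingWithCompletionHandler:^{
            // Both files are on disk now; this is where the
            // AVMutableComposition merge described above begins.
        }];
    }];
}];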
On top of that, when this happens the system screen recorder breaks as well. Tapping the system recorder shows this error:
"Screen Recording has stopped due to: Recording failed due to Mediaservices error."
There should be two possible reasons:
- iOS ReplayKit is in beta, which is why it sometimes breaks after some use.
- I have implemented some faulty logic on my side that causes ReplayKit to break.
If it is reason no. 1, then fine. If it is reason no. 2, then I need to know where I might be going wrong.
Opinions and help will be greatly appreciated.