I'm trying to implement video capture in my app using AVFoundation. I have the following code under viewDidLoad:
    session = [[AVCaptureSession alloc] init];
    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    videoInputDevice = [[AVCaptureDeviceInput alloc] init];

    AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
    if (videoDevice)
    {
        NSError *error;
        videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error)
        {
            if ([session canAddInput:videoInputDevice])
                [session addInput:videoInputDevice];
            else
                NSLog(@"Couldn't add input.");
        }
    }

    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *audioError = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&audioError];
    if (audioInput)
    {
        [session addInput:audioInput];
    }

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    Float64 TotalSeconds = 35;       //Total seconds
    int32_t preferredTimeScale = 30; //Frames per second
    CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);
    movieFileOutput.maxRecordedDuration = maxDuration;
    movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;

    if ([session canAddOutput:movieFileOutput])
        [session addOutput:movieFileOutput];

    [session setSessionPreset:AVCaptureSessionPresetMedium];
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size-based configs are supported before setting them
        [session setSessionPreset:AVCaptureSessionPreset640x480];

    [self cameraSetOutputProperties];

    [session startRunning];
This code is in the implementation of the button used to start capturing, among other things:
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath])
    {
        NSError *error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
        {
            //Error - handle if required
        }
    }
    [outputPath release];

    //Start recording
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    [outputURL release];
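Since I pass recordingDelegate:self above, the class also adopts AVCaptureFileOutputRecordingDelegate; the required callback I have looks roughly like this (a minimal sketch, the error handling is just illustrative):

    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
        didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
        fromConnections:(NSArray *)connections
        error:(NSError *)error
    {
        // Called when recording stops: maxRecordedDuration reached,
        // stopRecording called, or a failure occurred.
        if (error)
        {
            NSLog(@"Recording finished with error: %@", error);
        }
    }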
When I try to run it on a device, it crashes as soon as I try to load the view where all of this is supposed to happen. Xcode gives me a "Thread 1: EXC_BAD_ACCESS (code=1, address=0x4)" at:

    AVFoundation`-[AVCaptureDeviceInput _setDevice:]:
    (stuff)
    0x3793f608: ldr r0, [r1, r0]

The error is flagged on that last line. I assume it has something to do with an AVCaptureDeviceInput somewhere, but I have no idea what it could be. Does anyone know what I'm missing here? Thanks.
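For reference, the only way I've seen an AVCaptureDeviceInput created in the documentation is through its class constructor rather than plain alloc/init; a minimal sketch of that pattern:

    NSError *inputError = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&inputError];
    if (!input)
    {
        // input is nil on failure; inputError describes what went wrong.
        NSLog(@"Couldn't create device input: %@", inputError);
    }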
EDIT: After fiddling with breakpoints, I've found that the crash happens on this line:

    AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
So it has something to do with that method? Here's my implementation of it, maybe there's a problem there.
    - (AVCaptureDevice *)frontFacingCameraIfAvailable
    {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        AVCaptureDevice *captureDevice = nil;
        for (AVCaptureDevice *device in videoDevices)
        {
            if (device.position == AVCaptureDevicePositionFront)
            {
                captureDevice = device;
                break;
            }
        }

        // couldn't find one on the front, so just get the default video device.
        if ( ! captureDevice)
        {
            captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }

        return captureDevice;
    }
EDIT 2: Could it be that my use of

    AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];

and 'self' is somehow throwing a wrench into it? I know that when creating a CALayer you can do

    CALayer *aLayer = [CALayer layer];

but I don't know of an AVCaptureDevice equivalent of that, if there is one. I'm not sure what else it could be; as far as I can tell my code looks fine, and I've tried cleaning the project, restarting Xcode, restarting my computer, and so on.
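For the record, the closest AVCaptureDevice equivalents to [CALayer layer] that I can find are its class methods (assuming the standard AVFoundation API, not alloc/init):

    // Devices are obtained through class methods rather than being allocated directly:
    AVCaptureDevice *defaultVideo = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSArray *allVideoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];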