
In my app, which records and plays audio using AVAudioRecorder and AVAudioPlayer, I ran into the incoming-call scenario. If a phone call comes in while recording, only the audio recorded after the call is kept. I want the recording made after the call to be a continuation of the audio recorded before the call.

I use the AVAudioRecorderDelegate methods to track interruptions that occur in the recorder:

  • - (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder and
  • - (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder.

In my EndInterruption method, I activate the audioSession.
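
Roughly, those two callbacks look like this in my code (a simplified sketch; recorder is the AVAudioRecorder created in startRecordingProcess below):

- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
{
    // The system has already stopped the recorder when this is called.
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder
{
    NSError *err = nil;
    // Reactivate the audio session once the interruption (the phone call) ends
    [[AVAudioSession sharedInstance] setActive:YES error:&err];
    if (err)
    {
        DEBUG_LOG(@"audioSession: %@", [err localizedDescription]);
        return;
    }
    // Resume recording; I expect this to continue the same file,
    // but only the audio recorded after the call ends up in the file.
    [recorder record];
}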

Here is the recording code I use:

- (void)startRecordingProcess
{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    err = nil;
    [audioSession setActive:YES error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    // Record settings for recording the audio
    recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
                     [NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
                     [NSNumber numberWithInt:44100],AVSampleRateKey,
                     [NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
                     [NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                     nil];
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath];
    if (fileExists) 
    {        
        BOOL appendingFileExists = 
            [[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath];
        if (appendingFileExists)
        {
            [[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil];
        }
        if (appendingFilePath) 
        {
            [appendingFilePath release];
            appendingFilePath = nil;
        }
        appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:appendingFilePath]; 
    }
    else 
    {
        isFirstTime = YES;
        if (recorderFilePath) 
        {
            DEBUG_LOG(@"Testing 2");
            [recorderFilePath release];
            recorderFilePath = nil;
        }
        DEBUG_LOG(@"Testing 3");
        recorderFilePath = [[NSString alloc]initWithFormat:@"%@/RecordedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:recorderFilePath];
    }
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err];
    if(!recorder)
    {
        DEBUG_LOG(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        [[AlertFunctions sharedInstance] showMessageWithTitle:kAppName 
                                                      message:[err localizedDescription] 
                                                     delegate:nil
                                            cancelButtonTitle:@"Ok"];
        return;
    }
    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];

}

While searching for a solution to this problem, I came across other links, "How to resume recording after interruption occurred in iPhone?" and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html, which talk about the same issue. I tried the suggestions given in those links, but without success. I would like to get this working with AVAudioRecorder itself. Is there any way to solve this problem? All valuable suggestions are appreciated.


2 Answers


After some investigation, Apple informed me that this is an issue with the current API. So I managed to work around the problem by saving the file recorded before the interruption and concatenating it with the file recorded after recording resumed. Hope this helps anyone who may face the same issue.
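
Roughly, the concatenation step looks like this (a simplified sketch using AVMutableComposition and AVAssetExportSession; the method name and the firstPartURL / secondPartURL / outputURL parameters are only illustrative, not taken from my actual code):

// Sketch: join the pre-interruption recording and the resumed recording into one m4a file.
// Assumes <AVFoundation/AVFoundation.h> is imported.
- (void)concatenateAudioAtURL:(NSURL *)firstPartURL
               withAudioAtURL:(NSURL *)secondPartURL
                        toURL:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:firstPartURL options:nil];
    AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:secondPartURL options:nil];
    NSError *error = nil;

    // Insert the first recording at time zero, then the second one right after it
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                   ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                    atTime:kCMTimeZero
                     error:&error];
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                   ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                    atTime:firstAsset.duration
                     error:&error];

    // Export the combined track as a new m4a file
    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // On AVAssetExportSessionStatusCompleted, replace the original recording with the merged file
    }];
}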

Answered on 2012-01-31T06:01:49.053

I was also facing a similar issue where AVAudioRecorder was recording only after the interruption,
so I fixed it by maintaining an array of recordings, keeping them in NSTemporaryDirectory and merging them at the end.

Here are the key steps:

  1. Make your class listen for AVAudioSessionInterruptionNotification.
  2. On interruption begin (AVAudioSessionInterruptionTypeBegan), save your recording.
  3. On interruption end (AVAudioSessionInterruptionTypeEnded), start a new recording if the interruption option is AVAudioSessionInterruptionOptionShouldResume.
  4. Append all the recordings when the "Save" button is tapped.

Code snippets for the steps above:

// 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad
- (void)viewDidLoad
{
    [super viewDidLoad];

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleAudioSessionInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:[AVAudioSession sharedInstance]];

    // other coding stuff
}

// observe the interruption begin / end 
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];

    switch (interruptionType) {
        // 2. save recording on interruption begin
        case AVAudioSessionInterruptionTypeBegan:{
            // stop recording
            // Update the UI accordingly
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:{
            if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                // create a new recording
                // Update the UI accordingly
            }
            break;
        }

        default:
            break;
    }
}  

// 4. append all recordings
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
    // append all recordings one after other
}

Here is a working example:

//
//  XDRecordViewController.m
//
//  Created by S1LENT WARRIOR
//

#import "XDRecordViewController.h"

@interface XDRecordViewController ()
{
    AVAudioRecorder *recorder;

    __weak IBOutlet UIButton* btnRecord;
    __weak IBOutlet UIButton* btnSave;
    __weak IBOutlet UIButton* btnDiscard;
    __weak IBOutlet UILabel*  lblTimer; // a UILabel to display the recording time

    // some variables to display the timer on a lblTimer
    NSTimer* timer;
    NSTimeInterval intervalTimeElapsed;
    NSDate* pauseStart;
    NSDate* previousFireDate;
    NSDate* recordingStartDate;

    // interruption handling variables
    BOOL isInterrupted;
    NSInteger preInterruptionDuration;

    NSMutableArray* recordings; // an array of recordings to be merged in the end
}
@end

@implementation XDRecordViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Make this class listen to the AVAudioSessionInterruptionNotification
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleAudioSessionInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:[AVAudioSession sharedInstance]];

    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory()

    recordings = [NSMutableArray new]; // initialize recordings

    [self setupAudioSession]; // setup the audio session. you may customize it according to your requirements
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    [self initRecording];   // start recording as soon as the view appears
}

- (void)dealloc
{
    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory

    [[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter
}

#pragma mark - Event Listeners

// called when recording button is tapped
- (IBAction) btnRecordingTapped:(UIButton*)sender
{
    sender.selected = !sender.selected; // toggle the button

    if (sender.selected) { // resume recording
        [recorder record];
        [self resumeTimer];
    } else { // pause recording
        [recorder pause];
        [self pauseTimer];
    }
}

// called when save button is tapped
- (IBAction) btnSaveTapped:(UIButton*)sender
{
    [self pauseTimer]; // pause the timer

    // disable the UI while the recording is saving so that user may not press the save, record or discard button again
    btnSave.enabled = NO;
    btnRecord.enabled = NO;
    btnDiscard.enabled = NO;

    [recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called

    // Deactivate the AVAudioSession
    NSError* error;
    [[AVAudioSession sharedInstance] setActive:NO error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
}

// called when discard button is tapped
- (IBAction) btnDiscardTapped:(id)sender
{
    [self stopTimer]; // stop the timer

    recorder.delegate = Nil; // set delegate to Nil so that audioRecorderDidFinishRecording delegate function may not get called
    [recorder stop];  // stop the recorder

    // Deactivate the AVAudioSession
    NSError* error;
    [[AVAudioSession sharedInstance] setActive:NO error:&error];
    if (error) {
        NSLog(@"%@", error);
    }

    [self.navigationController popViewControllerAnimated:YES];
}

#pragma mark - Notification Listeners
// called when an AVAudioSessionInterruption occurs
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];

    switch (interruptionType) {
        case AVAudioSessionInterruptionTypeBegan:{
            // • Recording has stopped, already inactive
            // • Change state of UI, etc., to reflect non-recording state
            preInterruptionDuration += recorder.currentTime; // time elapsed
            if(btnRecord.selected) {    // timer is already running
                [self btnRecordingTapped:btnRecord];  // pause the recording and pause the timer
            }

            recorder.delegate = Nil; // Set delegate to nil so that audioRecorderDidFinishRecording may not get called
            [recorder stop];    // stop recording
            isInterrupted = YES;
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:{
            // • Make session active
            // • Update user interface
            // • AVAudioSessionInterruptionOptionShouldResume option
            if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                // Here you should create a new recording
                [self initRecording];   // create a new recording
                [self btnRecordingTapped:btnRecord];
            }
            break;
        }

        default:
            break;
    }
}

#pragma mark - AVAudioRecorderDelegate
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
    [self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) {
        // do whatever you want with the new audio file :)
    }];
}

#pragma mark - Timer
- (void)timerFired:(NSTimer*)timer
{
    intervalTimeElapsed++;
    [self updateDisplay];
}

// formats a time interval as an mm:ss string
- (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval
{
    NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"mm:ss"];
    [dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
    return [dateFormatter stringFromDate:timerDate];
}

// called when recording pauses
- (void) pauseTimer
{
    pauseStart = [NSDate dateWithTimeIntervalSinceNow:0];

    previousFireDate = [timer fireDate];

    [timer setFireDate:[NSDate distantFuture]];
}

- (void) resumeTimer
{
    if (!timer) {
        timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                 target:self
                                               selector:@selector(timerFired:)
                                               userInfo:Nil
                                                repeats:YES];
        return;
    }

    NSTimeInterval pauseTime = -[pauseStart timeIntervalSinceNow];

    [timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]];
}

- (void)stopTimer
{
    [self updateDisplay];
    [timer invalidate];
    timer = nil;
}

- (void)updateDisplay
{
    lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed];
}

#pragma mark - Helper Functions
- (void) initRecording
{

    // Set the audio file
    NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file
    NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]];

    [recordings addObject:outputFileURL];

    // Define the recorder settings
    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];

    [recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey];
    [recordSetting setValue:@(44100.0) forKey:AVSampleRateKey];
    [recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey];

    NSError* error;
    // Initiate and prepare the recorder
    recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
    recorder.delegate = self;
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];

    if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable
        NSLog(@"Error: Audio input device not available!");
        return;
    }

    intervalTimeElapsed = 0;
    recordingStartDate = [NSDate date];

    if (isInterrupted) {
        intervalTimeElapsed = preInterruptionDuration;
        isInterrupted = NO;
    }

    // Activate the AVAudioSession
    [[AVAudioSession sharedInstance] setActive:YES error:&error];
    if (error) {
        NSLog(@"%@", error);
    }

    recordingStartDate = [NSDate date];  // Set the recording start date
    [self btnRecordingTapped:btnRecord];
}

- (void)setupAudioSession
{

    static BOOL audioSessionSetup = NO;
    if (audioSessionSetup) {
        return;
    }

    AVAudioSession* session = [AVAudioSession sharedInstance];

    [session setCategory:AVAudioSessionCategoryPlayAndRecord
             withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                   error:Nil];

    [session setMode:AVAudioSessionModeSpokenAudio error:nil];

    audioSessionSetup = YES;
}

// takes an array of audio file URLs and appends them one after another
// the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958
// I modified this logic to append multiple files
- (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler
{
    // Create a new audio track we can append to
    AVMutableComposition* composition = [AVMutableComposition composition];
    AVMutableCompositionTrack* appendedAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

    // Grab the first audio track that need to be appended
    AVURLAsset* originalAsset = [[AVURLAsset alloc]
                                 initWithURL:urls.firstObject options:nil];
    [urls removeObjectAtIndex:0];

    NSError* error = nil;

    // Grab the first audio track and insert it into our appendedAudioTrack
    AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
    [appendedAudioTrack insertTimeRange:timeRange
                                ofTrack:originalTrack
                                 atTime:kCMTimeZero
                                  error:&error];
    CMTime duration = originalAsset.duration;

    if (error) {
        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(NO, Nil);
            });
        }
        return; // do not keep appending after a failed insert
    }

    for (NSURL* audioUrl in urls) {
        AVURLAsset* newAsset = [[AVURLAsset alloc]
                                initWithURL:audioUrl options:nil];

        // Grab the rest of the audio tracks and insert them at the end of each other
        AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
        [appendedAudioTrack insertTimeRange:timeRange
                                    ofTrack:newTrack
                                     atTime:duration
                                      error:&error];

        duration = appendedAudioTrack.timeRange.duration;

        if (error) {
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(NO, Nil);
                });
            }
            return; // do not keep appending after a failed insert
        }
    }

    // Create a new audio file using the appendedAudioTrack
    AVAssetExportSession* exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (!exportSession) {
        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(NO, Nil);
            });
        }
        return; // nothing to export
    }

    NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file
    exportSession.outputURL = [NSURL fileURLWithPathComponents:appendedAudioPath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{

        BOOL success = NO;
        // exported successfully?
        switch (exportSession.status) {
            case AVAssetExportSessionStatusFailed:
                break;
            case AVAssetExportSessionStatusCompleted: {
                success = YES;

                break;
            }
            case AVAssetExportSessionStatusWaiting:
                break;
            default:
                break;
        }

        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(success, exportSession.outputURL);
            });
        }
    }];
}

- (void) clearContentsOfDirectory:(NSString*)directory
{
    NSFileManager *fm = [NSFileManager defaultManager];
    NSError *error = nil;
    for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) {
        [fm removeItemAtURL:[NSURL fileURLWithPathComponents:@[directory, file]] error:&error];
    }
}

@end

I know it is too late to answer this question, but hope this helps someone else!

Answered on 2015-12-10T04:26:31.250