
I have a sample app that uses AudioKit to record audio and display a waveform of that audio data. The sample app has two view controllers; the root VC is a blank page with a single button that takes the user to the audio recording page.

For some reason, only on the iPhone X (iOS 11.4.1), while recording audio, if I tap the back button on the navigation bar (top left) and then try to record again, the app crashes.

Specifically, the crash happens when the recorder's method appendDataFromBufferList:withBufferSize: calls ExtAudioFileWrite(self.info->extAudioFileRef, bufferSize, bufferList). The error message printed in the console is:

testAudioCrash(1312,0x16e203000) malloc: *** error for object 0x109803a00: incorrect checksum for freed object - object was probably modified after being freed.
*** set a breakpoint in malloc_error_break to debug
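
For anyone who wants to follow the message's suggestion while reproducing this: the breakpoint can be added as an Xcode symbolic breakpoint on malloc_error_break, or directly from the lldb console:

(lldb) breakpoint set --name malloc_error_break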

I've gone through zombie profiling, leak profiling, the logic, and the stack, but I can't seem to figure out why this is happening.
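
If it would help, I can also re-run with Malloc Scribble enabled (Edit Scheme > Run > Diagnostics > Malloc Scribble), which I understand overwrites freed memory with a known fill pattern and can make this kind of use-after-free fail more deterministically, and post the results here.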

Below I've provided the code for the test app, along with screenshots of the stack and the console output. Any help in figuring out why this crash happens would be greatly appreciated. Unfortunately, the crash is also not 100% reproducible, which makes it feel a bit nebulous to me.

Notes on the code below: there is no custom code in the .h files, so I haven't included them. Each view controller has a xib file for its UI components. They're very simple, so I haven't included details on those either, though I'm happy to provide any information about them that anyone asks for. I can also zip up the project and share it if anyone feels that's necessary.

Steps to reproduce:

1) Launch the app
2) Tap the record audio button
3) Tap the record button
4) Tap the back button on the navigation bar
5) Repeat steps 2-4 until the crash occurs

AppDelegate.m code:

#import "AppDelegate.h"
#import "testViewController.h"

@interface AppDelegate ()
@end

@implementation AppDelegate


- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Override point for customization after application launch.

    testViewController* rootVC = [[testViewController alloc] initWithNibName: @"testViewController" bundle: NSBundle.mainBundle];
    UINavigationController* nav = [[UINavigationController alloc] initWithRootViewController: rootVC];
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = nav;
    [self.window makeKeyAndVisible];
    return YES;
}


- (void)applicationWillResignActive:(UIApplication *)application {
    // Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
    // Use this method to pause ongoing tasks, disable timers, and invalidate graphics rendering callbacks. Games should use this method to pause the game.
}


- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
    // If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
}


- (void)applicationWillEnterForeground:(UIApplication *)application {
    // Called as part of the transition from the background to the active state; here you can undo many of the changes made on entering the background.
}


- (void)applicationDidBecomeActive:(UIApplication *)application {
    // Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
}


- (void)applicationWillTerminate:(UIApplication *)application {
    // Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:.
}


@end

testViewController.m code:

#import "testViewController.h"
#import "testSecondViewController.h"

@interface testViewController ()

@end

@implementation testViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (IBAction)AudioRecording:(id)sender
{
    testSecondViewController* sVC = [[testSecondViewController alloc] initWithNibName: @"testSecondViewController" bundle: NSBundle.mainBundle];
    [self.navigationController pushViewController: sVC animated: YES];
}

@end

testSecondViewController.m code:

#import "testSecondViewController.h"
@import AudioKit;
@import AudioKitUI;

@interface testSecondViewController () <EZMicrophoneDelegate, EZRecorderDelegate>
@property (nonatomic, strong) EZRecorder* recorder;
@property (nonatomic, strong) EZMicrophone* mic;
@property (nonatomic, strong) EZAudioPlayer* player;
@property (strong, nonatomic) IBOutlet EZAudioPlot *audioPlot;
@property (nonatomic, strong) NSURL *finishedRecordingURL;
@property (atomic, assign) BOOL isRecording;

@end

@implementation testSecondViewController

- (void)dealloc
{
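    // If the user backs out of this screen mid-recording, make sure the mic
    // stops pulling audio and the file gets closed before teardown.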
    if(_isRecording) [self pauseRecording: _mic];
    if(_recorder) [self finalizeAudioFile: _recorder];
    _recorder.delegate = nil;
    _mic.delegate = nil;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [EZAudioUtilities setShouldExitOnCheckResultFail: NO];

    [self setupUI];
    [self setupConfig];
    [self audioKitSetup];
}


- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

#pragma mark UI Methods
-(void)setupUI
{
    self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithTitle: @"Cancel" style: UIBarButtonItemStylePlain target: self action: @selector(cancelButtonClicked:)];
    [self configureWaveFormViewForAudioInput];
}

-(void)setupConfig
{
    [self initializeMic];
    [self initializeRecorder];
}

-(void)initializeMic
{
    self.mic = [[EZMicrophone alloc] initWithMicrophoneDelegate: self];
    self.isRecording = NO;
}

-(void)initializeRecorder
{
    NSURL *fileUrl = [self testFilePathURL];
    self.finishedRecordingURL = fileUrl;

    self.recorder = [[EZRecorder alloc] initWithURL: fileUrl clientFormat: [self.mic audioStreamBasicDescription] fileType: EZRecorderFileTypeM4A delegate: self];
}

#pragma mark - Utils
- (NSArray *)applicationDocuments
{
  return NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
}


- (NSString *)applicationDocumentsDirectory
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
    return basePath;
}

- (NSURL *)testFilePathURL
{
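    // Build the destination URL in Documents, deleting any file left over
    // from a previous recording so each run starts fresh.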
    self.finishedRecordingURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@",
                               [self applicationDocumentsDirectory],
                               @"test2.m4a"]];

    if (self.finishedRecordingURL && [[NSFileManager defaultManager] fileExistsAtPath:self.finishedRecordingURL.path])
    {
        NSError *error;
        [[NSFileManager defaultManager] removeItemAtURL:self.finishedRecordingURL error:&error];
        if(error){
            printf("%s", error.description.UTF8String);
        }
    }

    return self.finishedRecordingURL;
}

#pragma mark AudioKit Util methods
- (void) audioKitSetup
{
    [AKSettings setDefaultToSpeaker: YES];
    [AKSettings setAudioInputEnabled: YES];
    [AKSettings setPlaybackWhileMuted: YES];
    [AKSettings setSampleRate: 44100];
    [AKSettings setChannelCount: 1];
}

- (void) configureWaveFormViewForAudioInput
{
//    self.audioPlot.gain = 6;
//    self.audioPlot.color = [UIColor blueColor];
    self.audioPlot.plotType = EZPlotTypeRolling;
//    self.audioPlot.shouldFill = YES;
//    self.audioPlot.shouldMirror = YES;
    [self.view addSubview: self.audioPlot];
    self.audioPlot.clipsToBounds = YES;
}

- (IBAction)startRecording:(id)sender
{
    if (!self.mic)
    {
        self.mic = [EZMicrophone microphoneWithDelegate: self];
    }

    if (!self.recorder)
    {
        if (self.finishedRecordingURL && [[NSFileManager defaultManager] fileExistsAtPath:self.finishedRecordingURL.path])
        {
            self.recorder = [EZRecorder recorderWithURL: self.finishedRecordingURL clientFormat: [self.mic audioStreamBasicDescription] fileType: EZRecorderFileTypeM4A delegate: self];
        }
        else
        {
            self.recorder = [EZRecorder recorderWithURL: [self testFilePathURL] clientFormat: [self.mic audioStreamBasicDescription] fileType: EZRecorderFileTypeM4A delegate: self];
            self.finishedRecordingURL = self.recorder.url;
        }
    }

    [self.mic startFetchingAudio];
    self.isRecording = YES;
}

- (IBAction)pauseRecording:(id)sender
{
    [self.mic stopFetchingAudio];
    self.isRecording = NO;
}

- (void) finalizeAudioFile: (EZRecorder*) recorder
{
    if (self.isRecording)
    {
        [self.mic stopFetchingAudio];
    }

    [recorder closeAudioFile];
}

- (IBAction)cancelButtonClicked:(id)sender
{
    if (self.isRecording)
    {
        [self pauseRecording: self.mic];
    }

    UIAlertController *alert = [UIAlertController alertControllerWithTitle: @"Delete recording?" message:@"Would you like to delete your audio recording and stop recording?" preferredStyle: UIAlertControllerStyleAlert];

    UIAlertAction* yesButton = [UIAlertAction
                                actionWithTitle:@"Discard"
                                style:UIAlertActionStyleDefault
                                handler:^(UIAlertAction * action) {

                                    [self finalizeAudioFile: self.recorder];

                                    NSError *error;
                                    [[NSFileManager defaultManager] removeItemAtURL:self.finishedRecordingURL error:&error];
                                    if (error) {
                                        printf("%s", error.description.UTF8String);
                                    }

                                    [self dismissViewControllerAnimated:YES completion:NULL];
                                }];

    UIAlertAction* noButton = [UIAlertAction
                               actionWithTitle:@"Cancel"
                               style:UIAlertActionStyleDefault
                               handler:^(UIAlertAction * action) {
                                   [alert dismissViewControllerAnimated:YES completion: nil];
                               }];

    [alert addAction:yesButton];
    [alert addAction:noButton];

    [self presentViewController:alert animated:YES completion:nil];
}

#pragma mark - EZMicrophone Delegate methods
- (void)  microphone:(EZMicrophone *)microphone
    hasAudioReceived:(float **)buffer
      withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
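    // UI updates must happen on the main queue; self is captured weakly so a
    // queued block doesn't keep this controller alive after it's popped.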
    __weak typeof (self) weakling = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakling.audioPlot updateBuffer:buffer[0]
                          withBufferSize:bufferSize];
    });
}

- (void)  microphone:(EZMicrophone *)microphone
       hasBufferList:(AudioBufferList *)bufferList
      withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
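    // (As I understand it, this callback fires on a background audio I/O
    // thread rather than the main thread - hence the atomic isRecording.)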
    if (self.isRecording)
    {
        [self.recorder appendDataFromBufferList:bufferList
                                 withBufferSize:bufferSize];
    }
}

- (void)microphone:(EZMicrophone *)microphone changedPlayingState:(BOOL)isPlaying
{
    self.isRecording = isPlaying;
}

@end

Images: [screenshot of the crash stack trace]

[screenshot of the console output]
