
I'm building a Swift iOS app that plays audio files, and I'm using The Amazing Audio Engine 2 library for that.

I decided to use the singleton pattern to manage this task. In his Swift sample project, the library's creator, Michael Tyson, wrote the audio part in Objective-C (as far as I understand, for the memory-management work that involves C functions). You can find his explanatory video here: https://www.youtube.com/watch?v=OZQT4IGS8mA So I followed in his footsteps and wrote my singleton in Objective-C.

Here is my code:

AudioManager.h

#import <Foundation/Foundation.h>
#import "TheAmazingAudioEngine/TheAmazingAudioEngine.h"

@class Track;

@interface AudioManager : NSObject {
    NSString *test;
}

@property (nonatomic, strong) AEAudioUnitOutput  *_Nonnull output;
@property (nonatomic, strong) AERenderer *_Nonnull renderer;
@property (nonatomic, strong) AEAudioFilePlayerModule *_Nullable currentTrackModule;
@property (nonatomic) BOOL recording;
@property (nonatomic) Track *_Nullable currentTrack;

+ (_Nonnull id)defaultAudioManager;
- (BOOL)startAudioController:(NSError *_Nullable *_Nullable)error;
- (void)stopAudioController;
- (void)prepareTrackModule;
- (void)playCurrentTrack:(BOOL)loop;

@end

AudioManager.m

#import "AudioManager.h"
#import "Project-Swift.h"
@import AVFoundation;

@implementation AudioManager

//@synthesize someProperty;

#pragma mark Singleton Methods

+ (id)defaultAudioManager {
    static AudioManager *defaultAudioManager = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        defaultAudioManager = [[self alloc] init];
    });
    return defaultAudioManager;
}

- (id)init {
    if ( !(self = [super init]) ) return nil;
    //someProperty = @"Default Property Value";
    self.renderer = [AERenderer new];
    self.output = [[AEAudioUnitOutput alloc] initWithRenderer:self.renderer];

    return self;
}

- (void)dealloc {
    // Should never be called, but just here for clarity really.
}

- (BOOL)startAudioController:(NSError * _Nullable * _Nullable)error {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];

    return [self.output start:error];
}

- (void)stopAudioController {
    return [self.output stop];
}

- (void)prepareTrackModule {
    self.currentTrackModule = [[AEAudioFilePlayerModule alloc] initWithRenderer:self.renderer
                                                  URL:[[NSBundle mainBundle]
                                                       URLForResource:@"amen"//self.currentTrack.name
                                                       withExtension:@"m4a"]
                                                error:NULL];
}

- (void)playCurrentTrack:(BOOL)loop {
    self.currentTrackModule.loop = loop;
    [self.currentTrackModule playAtTime:0];
    __unsafe_unretained typeof(self) weakSelf = self;
    self.renderer.block = ^(const AERenderContext *context) {
        AEModuleProcess(weakSelf.currentTrackModule, context);

        AEBufferStackMixToBufferList(context->stack, 1, 0, YES, context->output);
    };
}

@end

AppDelegate.swift

import UIKit
import CoreData

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
        // Override point for customization after application launch.

        do {
            try AudioManager.defaultAudioManager().startAudioController()
            //AudioManager.defaultAudioManager().prepareTrackModule()
            //AudioManager.defaultAudioManager().playCurrentTrack(true)
        } catch {
            //Handle Error
            print("error")
        }

        return true
    }

    func applicationWillResignActive(application: UIApplication) {
        // Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
        // Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game.
    }

    func applicationDidEnterBackground(application: UIApplication) {
        // Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
        // If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
    }

    func applicationWillEnterForeground(application: UIApplication) {
        // Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background.
    }

    func applicationDidBecomeActive(application: UIApplication) {
        // Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
    }

    func applicationWillTerminate(application: UIApplication) {
        // Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:.
        // Saves changes in the application's managed object context before the application terminates.
        self.saveContext()
    }
}

If I comment out the line try AudioManager.defaultAudioManager().startAudioController() in AppDelegate.swift, everything apparently works fine. Otherwise, when I try to compile, Xcode prints this very helpful error:

0  swift                    0x000000010411d4eb llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 43
1  swift                    0x000000010411c7d6 llvm::sys::RunSignalHandlers() + 70
2  swift                    0x000000010411db4f SignalHandler(int) + 287
3  libsystem_platform.dylib 0x00007fff9a64352a _sigtramp + 26
4  libsystem_platform.dylib 0x0000000000000001 _sigtramp + 1704708849
5  swift                    0x0000000101fba410 (anonymous namespace)::IRGenSILFunction::visitFullApplySite(swift::FullApplySite) + 2736
6  swift                    0x0000000101fa7c4b swift::irgen::IRGenModule::emitSILFunction(swift::SILFunction*) + 9787
7  swift                    0x0000000101f02fd8 swift::irgen::IRGenModuleDispatcher::emitGlobalTopLevel() + 600
8  swift                    0x0000000101f8ea5e performIRGeneration(swift::IRGenOptions&, swift::ModuleDecl*, swift::SILModule*, llvm::StringRef, llvm::LLVMContext&, swift::SourceFile*, unsigned int) + 1278
9  swift                    0x0000000101f8ef06 swift::performIRGeneration(swift::IRGenOptions&, swift::SourceFile&, swift::SILModule*, llvm::StringRef, llvm::LLVMContext&, unsigned int) + 70
10 swift                    0x0000000101e72a1c performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&) + 15004
11 swift                    0x0000000101e6e41d frontend_main(llvm::ArrayRef<char const*>, char const*, void*) + 2781
12 swift                    0x0000000101e69e3c main + 1932
13 libdyld.dylib            0x00007fff996215ad start + 1
14 libdyld.dylib            0x00000000000000bc start + 1721625360
Stack dump:

Command failed due to signal: Segmentation fault: 11

I think I'm missing something I don't understand in the Objective-C part; I'm not very familiar with this language, but I can't figure out what it is. Any ideas?


1 Answer


As it turns out, if I remove the parameter of the startAudioController function, the segfault goes away. I don't know why, but it can serve as a temporary workaround.
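For reference, here is a minimal sketch of what that workaround might look like (my own illustration, not necessarily the exact change made): the NSError out-parameter is dropped so Swift no longer imports the method as throwing, and NULL is passed to the output's start: method instead of forwarding an error from the caller.

In AudioManager.h:

- (BOOL)startAudioController;

In AudioManager.m:

- (BOOL)startAudioController {
    // Same body as before, just without the NSError ** parameter to forward.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
    // Pass NULL for the error out-parameter; the BOOL return value still reports success or failure.
    return [self.output start:NULL];
}

On the Swift side the call then becomes a plain Bool-returning call, AudioManager.defaultAudioManager().startAudioController(), with no try/do-catch needed.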

Answered 2016-04-19T08:43:20.570