
I'm building an app that has to track the amplitude of the user's microphone input. AudioKit has a bunch of handy objects for exactly this: AKAmplitudeTracker and so on. But I haven't found any workable information on how it's supposed to be done — how to start AudioKit, begin tracking, etc.

Right now all the AudioKit initialization code lives in the viewDidLoad method of my recorder module's root VC. This isn't right, because random errors occur and I can't track down the problem. The code below shows how I'm using AudioKit at the moment.

var silence: AKBooster!
var tracker: AKAmplitudeTracker!
var mic: AKMicrophone!

...

override func viewDidLoad() {
    super.viewDidLoad()

    switch AVAudioSession.sharedInstance().recordPermission() {

    case AVAudioSessionRecordPermission.granted:
        self.mic = AKMicrophone()
        self.tracker = AKAmplitudeTracker(self.mic)
        AKSettings.audioInputEnabled = true
        AudioKit.output = self.tracker
        AudioKit.start()
        self.mic.start()
        self.tracker.start()

    case AVAudioSessionRecordPermission.undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    case AVAudioSessionRecordPermission.denied:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    default:
        print("")
    }

    ...

}

Please help me figure out how to manage AudioKit properly.


2 Answers


Alexey,

My suggestion for managing the AudioKit lifecycle is to put it in a singleton class. That's how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. That way it isn't tied to a particular ViewController's viewDidLoad, and it can be accessed from multiple ViewControllers or from the AppDelegate that manages the app's state. It also ensures you'll only ever create one instance of it.

Here's an example that initializes AudioKit in a class named Conductor (it could just as well be called AudioManager, etc.):

import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {

        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true

        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine
        // This is in its own method so that the audio engine will start and stop via the AppDelegate's current state.
        startAudioEngine()

    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
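As the comment on startAudioEngine() hints, you can drive those start/stop methods from the AppDelegate's lifecycle callbacks. A minimal sketch of that hookup might look like the following (method names are the standard UIApplicationDelegate ones; whether you actually stop the engine on resign-active depends on whether your app needs background audio):

```swift
import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    // Touch the singleton early so the audio graph is built once at launch.
    let conductor = Conductor.sharedInstance

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Resume the engine when the app returns to the foreground.
        conductor.startAudioEngine()
    }

    func applicationWillResignActive(_ application: UIApplication) {
        // Release the audio hardware while the app is inactive.
        conductor.stopAudioEngine()
    }
}
```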

Here's how to access the amplitude tracking data that lives in the Conductor singleton class from a ViewController:

import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }

    }
}
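One caveat with the snippet above: the scheduled Timer is never stored or invalidated, so it keeps firing for the lifetime of the run loop, and [unowned self] can crash if the view controller is ever deallocated while the timer is still live. A safer sketch (assuming the same Conductor class) holds a reference and invalidates it when the view goes away:

```swift
import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance
    private var amplitudeTimer: Timer?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Poll the tracker while the view is on screen.
        amplitudeTimer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            print(self.conductor.tracker.amplitude)
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Stop polling so the timer doesn't retain the closure forever.
        amplitudeTimer?.invalidate()
        amplitudeTimer = nil
    }
}
```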

You can download the GitHub repository for this here:

https://github.com/markjeschke/AudioKit-Amplitude-Tracker

I hope this helps.

Take care,
Mark

Answered 2017-10-04T21:13:49.107

From what I can see, it looks like it should work; something else may be going on elsewhere in your code. I made a stripped-down demo to test the basics, and it works. I just added a timer to poll the amplitude.

import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { (timer) in
            print(self.tracker.amplitude)
        }
    }
}
Answered 2017-05-30T16:05:13.617