Hey. I want to implement a real-time audio application with the new AVAudioEngine in Swift. Does anyone have experience with the new framework yet? How do real-time applications work with it?
My first idea was to store the (processed) input data into an AVAudioPCMBuffer object and then have an AVAudioPlayerNode play it back, as you can see in my demo class:
import AVFoundation

class AudioIO {
    var audioEngine: AVAudioEngine
    var audioInputNode: AVAudioInputNode
    var audioPlayerNode: AVAudioPlayerNode
    var audioMixerNode: AVAudioMixerNode
    var audioBuffer: AVAudioPCMBuffer

    init() {
        audioEngine = AVAudioEngine()
        audioPlayerNode = AVAudioPlayerNode()
        audioMixerNode = audioEngine.mainMixerNode

        // One small buffer that the player node loops over forever.
        let frameLength = AVAudioFrameCount(256)
        audioBuffer = AVAudioPCMBuffer(pcmFormat: audioPlayerNode.outputFormat(forBus: 0),
                                       frameCapacity: frameLength)!
        audioBuffer.frameLength = frameLength

        audioInputNode = audioEngine.inputNode
        audioInputNode.installTap(onBus: 0,
                                  bufferSize: frameLength,
                                  format: audioInputNode.outputFormat(forBus: 0)) { buffer, time in
            // Copy the first input channel into the looping playback buffer.
            guard let input = buffer.floatChannelData?[0],
                  let output = self.audioBuffer.floatChannelData?[0] else { return }
            let channelStride = Int(self.audioMixerNode.outputFormat(forBus: 0).channelCount)
            for i in stride(from: 0, to: Int(self.audioBuffer.frameLength), by: channelStride) {
                // doing my real-time stuff
                output[i] = input[i]
            }
        }

        // Set up and start the audio engine.
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioMixerNode,
                            format: audioPlayerNode.outputFormat(forBus: 0))
        try? audioEngine.start()

        // Schedule the buffer to loop, then start playback.
        audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .loops, completionHandler: nil)
        audioPlayerNode.play()
    }
}
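For context, on an iOS target the input node only delivers audio once the shared AVAudioSession is configured for recording, so here is a minimal sketch of how I activate the session before creating the class above (the .playAndRecord category and the ~5 ms preferred buffer duration are just my assumptions, and the code assumes microphone permission has already been granted):

import AVFoundation

// Hypothetical session setup, iOS only.
let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
try? session.setPreferredIOBufferDuration(0.005) // request ~5 ms I/O buffers
try? session.setActive(true)

let audioIO = AudioIO() // engine starts and the loop buffer begins playing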
But this is far from real-time and not very efficient. Any ideas or experiences? And it doesn't matter whether you prefer Objective-C or Swift; I'm grateful for all notes, remarks, comments, solutions, etc.