This question has also been posted on The Amazing Audio Engine forum.

Hi everyone, I'm new to The Amazing Audio Engine and to iOS development, and I've been trying to figure out how to get the BPM of a track.

So far I have found two forum posts about offline rendering:

  1. http://forum.theamazingaudioengine.com/discussion/comment/1743/#Comment_1743
  2. http://forum.theamazingaudioengine.com/discussion/comment/649#Comment_649

As far as I can tell, the AEAudioControllerRenderMainOutput function is only implemented correctly in the fork.

I am trying to do offline rendering so I can process the track and then use the algorithm described here (in JavaScript) and implemented here.
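
To give an idea of what I'm after: as far as I understand it, that algorithm low-passes the signal and then counts peaks per unit of time. Below is only my rough sketch of that counting step in plain Swift (on an array of already-filtered samples, not the article's actual code), just to show what I eventually want to feed the rendered audio into:

// Very rough sketch of the peak-counting idea (not the article's code):
// walk over the (already low-pass filtered) samples, count rising edges
// above a threshold, and turn that count into beats per minute.
func estimateBPM(samples: [Float], sampleRate: Float, threshold: Float) -> Float {
    var peaks = 0
    var wasAbove = false
    for sample in samples {
        let isAbove = sample > threshold
        if isAbove && !wasAbove {
            peaks += 1               // rising edge = one candidate beat
        }
        wasAbove = isAbove
    }
    let durationInMinutes = Float(samples.count) / sampleRate / 60.0
    return durationInMinutes > 0 ? Float(peaks) / durationInMinutes : 0
}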

So far I am loading that fork, and I am using Swift (I'm currently part of the Make School Summer Academy, which teaches Swift).


This code works for me when playing a track (no offline rendering!):

// Create the controller with the non-interleaved float stereo description
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())

// Load the track and create a file-player channel for it
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)

let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in

    let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    // Advance the buffer sizeof(float) * 512
    let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512

    println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")


}

audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)

audioController.start(nil)
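
In the receiver above I am just offsetting mData by 512 samples to get at the "right" channel, but from what I've read about the non-interleaved description, each channel should have its own AudioBuffer in the AudioBufferList. So inside the block I would have expected something more like the lines below (I'm not sure this is right, and I'm not sure UnsafeMutableAudioBufferListPointer is the proper way to reach the second buffer from Swift; this is part of what I'm asking about further down):

// My guess at the non-interleaved layout: one AudioBuffer per channel,
// each mData pointing at `frames` Float samples for that channel.
let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
let leftSamples = UnsafeMutablePointer<Float>(buffers[0].mData)
let rightSamples = UnsafeMutablePointer<Float>(buffers[1].mData)
println("left[0]: \(leftSamples[0]) right[0]: \(rightSamples[0])")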

Trying offline rendering

This is the code I am trying to run while using that fork:

audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())

let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)

audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()

var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)

var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]

t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()

println("renderDuration \(renderDuration)")

var outIsOpen = Boolean()

AUGraphClose(audioController.audioGraph)

AUGraphIsOpen(audioController.audioGraph, &outIsOpen)

println("AUGraphIsOpen: \(outIsOpen)")

for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
    AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer);
    t.mSampleTime += Float64(bufferLength)

    println(t.mSampleTime)
    let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
    let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}



AEFreeAudioBufferList(buffer)


AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()
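
For context, once the render loop actually works, this is roughly what I want each pass to do: pull the samples out of mData and collect them so the whole track can be analysed afterwards. This is only a sketch and assumes I switch back to the float description (songSamples is just an illustrative name here; how the data ends up in FrequencyAnalyzer is still open):

// Rough sketch (assumes the float, non-interleaved description, so mData holds Float32 samples)
var songSamples = [Float]()
songSamples.reserveCapacity(Int(lengthInFrames))

// ...this part would go inside the render loop, right after AEAudioControllerRenderMainOutput:
let renderedLeft = UnsafeMutablePointer<Float>(buffer[0].mBuffers.mData)
for frame in 0..<Int(bufferLength) {
    songSamples.append(renderedLeft[frame])   // collect the left channel for later analysis
}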

Offline rendering is not working for me at the moment. The second example does not work; it gives me a lot of mixed errors that I don't understand.

A very common one is inside the channelAudioProducer function, on this line:

// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit, arg->ioActionFlags, &arg->originalTimeStamp, 0, *frames, audio);

It gives me EXC_BAD_ACCESS (code=EXC_I386_GPFLT). Among the other errors, this one is very common.

Sorry, I'm a total noob in this field, but there are some things I don't quite understand. Should I use nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription? And how does mData work here?

I would love to get some help with this because I'm a bit lost right now. When you answer, please explain it as fully as possible; I'm new to this stuff.

Note: feel free to post code in Objective-C if you don't know Swift.
