I'm seeing different buffer-scheduling behavior between iOS 9/10 and iOS 11 when an audio route change occurs (for example, when you plug in headphones) while buffers are scheduled on an AVAudioPlayerNode. Has anyone experienced anything similar, and how did you resolve it? Note that I reported this issue on Apple's AVFoundation support forum about two weeks ago and have received exactly zero replies.
The code that exhibits the problem is shown below. First, a brief overview: the code is a simple loop that repeatedly schedules buffers to play at some time in the future. The process starts with a call to the runSequence method, which schedules an audio buffer to play at a future time and sets the completion callback to the nested closure audioCompleteHandler. The completion callback calls runSequence again, which schedules another buffer and keeps the process going indefinitely; in this scheme there is always a buffer scheduled except while the completion handler is executing. The trace calls scattered throughout are internal methods that only print when debugging, so they can be ignored.
In the audio route change notification handler (handleAudioRouteChange), when a new device becomes available (case .newDeviceAvailable), the code restarts the engine and the player, reactivates the audio session, and calls runSequence to get the loop going again.
All of this works fine on iOS 9.3.5 (iPhone 5C) and iOS 10.3.3 (iPhone 6), but fails on iOS 11.1.1 (iPad Air). The nature of the failure is that the AVAudioPlayerNode doesn't play the audio but instead calls the completion handler immediately, which causes a runaway condition. If I remove the line that restarts the loop (as marked in the code), it works fine on iOS 11.1.1 but then fails on iOS 9.3.5 and iOS 10.3.3. That failure is different: the audio simply stops, and in the debugger I can see that the loop is no longer looping.
So one possible explanation is that under iOS 9.x and 10.x, buffers scheduled for the future are unscheduled when an audio route change occurs, whereas under iOS 11.x they remain scheduled.
This leads to two questions: 1. Has anyone seen similar behavior, and what is the workaround? 2. Can anyone point me to documentation that describes the exact state of the engine, player, and session when an audio route change (or an audio interruption) occurs?
private func runSequence() {

    // For test only
    var timeBaseInfo = mach_timebase_info_data_t()
    mach_timebase_info(&timeBaseInfo)
    // End for test only

    let audioCompleteHandler = { [unowned self] in
        DispatchQueue.main.async {
            trace(level: .skim, items: "Player: \(self.player1.isPlaying), Engine: \(self.engine.isRunning)")
            self.player1.stop()
            switch self.runStatus {
            case .Run:
                self.runSequence()
            case .Restart:
                self.runStatus = .Run
                self.tickSeq.resetSequence()
                //self.updateRenderHostTime()
                self.runSequence()
            case .Halt:
                self.stopEngine()
                self.player1.stop()
                self.activateAudioSession(activate: false)
            }
        }
    }

    // Schedule buffer...
    if self.engine.isRunning {
        if let thisElem: (buffer: AVAudioPCMBuffer, duration: Int) = tickSeq.next() {
            self.player1.scheduleBuffer(thisElem.buffer, at: nil, options: [], completionHandler: audioCompleteHandler)
            self.player1.prepare(withFrameCount: thisElem.buffer.frameLength)
            self.player1.play(at: AVAudioTime(hostTime: self.startHostTime))
            self.startHostTime += AVAudioTime.hostTime(forSeconds: TimeInterval(Double(60.0 / Double(self.model.bpm.value)) * Double(thisElem.duration)))
            trace(level: .skim, items:
                "Samples: \(thisElem.buffer.frameLength)",
                "Time: \(mach_absolute_time() * (UInt64(timeBaseInfo.numer) / UInt64(timeBaseInfo.denom))) ",
                "Sample Time: \(player1.lastRenderTime!.hostTime)",
                "Play At: \(self.startHostTime) ",
                "Player: \(self.player1.isPlaying)",
                "Engine: \(self.engine.isRunning)")
        }
    }
}
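For reference, the startHostTime advance on each iteration reduces to a simple beats-to-seconds conversion: each buffer is scheduled to play (60 / bpm) * duration seconds after the previous one. A minimal sketch of just that arithmetic (beatInterval is a hypothetical helper name, not part of the code above):

```swift
import Foundation

// Hypothetical helper mirroring the startHostTime advance in runSequence():
// each buffer plays (60 / bpm) * durationInBeats seconds after the previous one.
func beatInterval(bpm: Double, durationInBeats: Int) -> TimeInterval {
    return (60.0 / bpm) * Double(durationInBeats)
}

print(beatInterval(bpm: 120, durationInBeats: 1)) // 0.5
```

In the real code this interval is converted to host-time ticks via AVAudioTime.hostTime(forSeconds:) before being added to startHostTime.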
@objc func handleAudioRouteChange(_ notification: Notification) {
    trace(level: .skim, items: "Route change: Player: \(self.player1.isPlaying) Engine: \(self.engine.isRunning)")
    guard let userInfo = notification.userInfo,
        let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
        let reason = AVAudioSessionRouteChangeReason(rawValue: reasonValue) else { return }
    trace(level: .skim, items: audioSession.currentRoute, audioSession.mode)
    trace(level: .none, items: "Reason Value: \(String(describing: userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt)); Reason: \(String(describing: AVAudioSessionRouteChangeReason(rawValue: reasonValue)))")

    switch reason {
    case .newDeviceAvailable:
        trace(level: .skim, items: "In handleAudioRouteChange.newDeviceAvailable")
        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            startEngine()
            player1.play()
            activateAudioSession(activate: true)
            //updateRenderHostTime()
            runSequence() // <<--- Problem: works for iOS 9/10; fails on iOS 11. Remove it and iOS 9/10 fail, iOS 11 works.
        }
    case .oldDeviceUnavailable:
        trace(level: .skim, items: "In handleAudioRouteChange.oldDeviceUnavailable")
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                player1.stop()
                stopEngine()
                tickSeq.resetSequence()
                DispatchQueue.main.async {
                    if let pp = self.playPause as UIButton? { pp.isSelected = false }
                }
            }
        }
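If the hypothesis above is right (pending buffers are unscheduled on a route change under iOS 9/10 but stay scheduled under iOS 11), one stopgap I'm considering, pending an authoritative answer, is to gate the runSequence() call on the OS major version. A minimal sketch, where shouldRestartLoop is a hypothetical helper and the cutoff at major version 11 is purely an assumption from the tests described above:

```swift
import Foundation

// Hypothetical: restart the scheduling loop only on OS versions where a
// route change appears to unschedule pending buffers (observed: iOS 9/10).
// The cutoff at major version 11 is an assumption based on the tests above.
func shouldRestartLoop(osMajorVersion: Int = ProcessInfo.processInfo.operatingSystemVersion.majorVersion) -> Bool {
    return osMajorVersion < 11
}

// In handleAudioRouteChange, case .newDeviceAvailable, this would become:
// if shouldRestartLoop() { runSequence() }
```

An #available(iOS 11.0, *) check would express the same branch more idiomatically in app code; the ProcessInfo form is shown only so the logic is self-contained and testable. Either way this is a guess at the version boundary, which is why I'd much rather find documentation of the intended behavior.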