
I'm trying to play music from a byte array arriving over the network, in pcmInt16 data format.

// formats
let format1 = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatFloat32, sampleRate: 48000, channels: 1, interleaved: false)!
let format2 = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatInt16, sampleRate: 48000, channels: 1, interleaved: false)!

// byte array buffer
var byteArray: [Int16]! // one packet size is 512

...
// 1. create / attach / connect engine
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: format1)
engine.prepare()
try! engine.start()

// 2. fill byteArray with music stream // int16 48kHz 32bit
...

// 3.
var len = 512
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format2, frameCapacity: AVAudioFrameCount(len))!

// HERE
// How to set the first 512 data from byteArray ?
playerNode.scheduleBuffer(pcmBuffer, completionHandler: nil)

How do I copy the first 512 samples of byteArray into the buffer? I tried something like this, but it doesn't work: memcpy(pcmBuffer.audioBufferList.pointee.mBuffers.mData, byteArray[0..<512], len * 2)


2 Answers


AVAudioMixerNode can handle sampleRate conversion, but for a wholesale format change like Int16 to Float you are better off converting the samples yourself. For performance, I recommend vDSP from the Accelerate framework.

import Cocoa
import AVFoundation
import Accelerate
import PlaygroundSupport


let bufferSize = 512
let bufferByteSize = MemoryLayout<Float>.size * bufferSize

var pcmInt16Data: [Int16] = []
var pcmFloatData = [Float](repeating: 0.0, count: bufferSize) // allocate once and reuse


// one buffer of noise as an example
for _ in 0..<bufferSize {
    let value = Int16.random(in: Int16.min...Int16.max)
    pcmInt16Data.append(value)
}


let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000.0, channels: 1)!

let mixer = engine.mainMixerNode

engine.attach(player)
engine.connect(player, to: mixer, format: audioFormat)

engine.prepare()

do {
    try engine.start()
} catch {
    print("Error info: \(error)")
}

player.play()

if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(bufferSize)) {
    
    let monoChannel = buffer.floatChannelData![0]
    
    // Int16 ranges from -32768 to 32767 -- we want to convert and scale these to Float values between -1.0 and 1.0
    var scale = Float(Int16.max) + 1.0
    vDSP_vflt16(pcmInt16Data, 1, &pcmFloatData, 1, vDSP_Length(bufferSize)) // Int16 to Float
    vDSP_vsdiv(pcmFloatData, 1, &scale, &pcmFloatData, 1, vDSP_Length(bufferSize)) // divide by scale
    
    memcpy(monoChannel, pcmFloatData, bufferByteSize)
    buffer.frameLength = UInt32(bufferSize)
    player.scheduleBuffer(buffer, completionHandler: nil) // load more buffers in the completionHandler
    
}


PlaygroundPage.current.needsIndefiniteExecution = true
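If Accelerate isn't available, or just to make the scaling explicit, the same Int16-to-Float conversion can be written in plain Swift — a sketch that is slower than vDSP but numerically equivalent:

```swift
// Map Int16 samples to Float in [-1.0, 1.0) by dividing by 32768,
// mirroring the vDSP_vflt16 + vDSP_vsdiv pair above.
func int16ToFloat(_ samples: [Int16]) -> [Float] {
    let scale = Float(Int16.max) + 1.0 // 32768
    return samples.map { Float($0) / scale }
}

let floats = int16ToFloat([Int16.min, 0, Int16.max])
// Int16.min -> -1.0, 0 -> 0.0, Int16.max -> just under 1.0
```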

If you want to play an AVAudioFile, use the AVAudioPlayerNode.scheduleFile() or scheduleSegment methods rather than trying to read Int16 data out of the WAV/AIFF yourself. Pay attention to the AVAudioFile.processingFormat property and use it as the format for the connection from the player to the mixer.

import Cocoa
import PlaygroundSupport
import AVFoundation


let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
    
let playEntireFile = true

func playLocalFile() {

    // file needs to be in ~/Documents/Shared Playground Data
    let localURL = playgroundSharedDataDirectory.appendingPathComponent("MyAwesomeMixtape6.aiff")
    guard let audioFile = try? AVAudioFile(forReading: localURL) else { return }
    let audioFormat = audioFile.processingFormat

    let mixer = engine.mainMixerNode

    engine.attach(player)
    engine.connect(player, to: mixer, format: audioFormat)

    engine.prepare()

    do {
        try engine.start()
    } catch {
        print("Error info: \(error)")
    }

    player.play()
    
    if playEntireFile {
        
        player.scheduleFile(audioFile, at: nil, completionHandler: nil)
            
    } else { // play segment
        
        let startTimeSeconds = 5.0
        let durationSeconds = 2.0
        
        let sampleRate = audioFormat.sampleRate
        let startFramePosition = startTimeSeconds * sampleRate
        let durationFrameCount = durationSeconds * sampleRate
        
        player.scheduleSegment(audioFile, startingFrame: AVAudioFramePosition(startFramePosition), frameCount: AVAudioFrameCount(durationFrameCount), at: nil, completionHandler: nil)
        
    }
    
}

playLocalFile()


PlaygroundPage.current.needsIndefiniteExecution = true
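The seconds-to-frames arithmetic used for scheduleSegment is plain multiplication; since AVAudioFramePosition is a typealias for Int64 and AVAudioFrameCount for UInt32, it can be sanity-checked without AVFoundation:

```swift
// Frame math for scheduleSegment: frames = seconds * sampleRate.
let sampleRate = 48_000.0
let startFrame = Int64(5.0 * sampleRate)   // AVAudioFramePosition
let frameCount = UInt32(2.0 * sampleRate)  // AVAudioFrameCount
```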

For remote files, try AVPlayer.

import Cocoa
import AVFoundation
import PlaygroundSupport


var player: AVPlayer?

func playRemoteFile() {

    guard let remoteURL = URL(string: "https://ondemand.npr.org/anon.npr-mp3/npr/me/2020/03/20200312_me_singapore_wins_praise_for_its_covid-19_strategy_the_us_does_not.mp3"
        ) else { return }
    
    player = AVPlayer(url: remoteURL)

    player?.play()

}

playRemoteFile()

PlaygroundPage.current.needsIndefiniteExecution = true
answered 2020-03-12T03:27:31.030

First of all, you should avoid implicitly unwrapped optionals wherever possible.

var byteArray: [Int16] = [] // one packet size is 512

As far as I can tell from the code you've shown, there is no need to make byteArray optional.


As for "how do I set the first 512 samples of byteArray?"

Your code works with just a slight modification:

pcmBuffer.frameLength = AVAudioFrameCount(len)
memcpy(pcmBuffer.audioBufferList.pointee.mBuffers.mData, byteArray, len * 2)

Or you can use int16ChannelData:

if let channelData = pcmBuffer.int16ChannelData {
    memcpy(channelData[0], byteArray, len * MemoryLayout<Int16>.stride)
    pcmBuffer.frameLength = AVAudioFrameCount(len)
} else {
    print("bad format")
}

You probably also want to load the parts of byteArray after the first packet, but that is another question.
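One way to walk through the rest of the stream — a sketch with a hypothetical helper, not part of the answer's code — is to slice the array into packet-sized chunks; each slice can then be copied into its own AVAudioPCMBuffer (via memcpy as above) and scheduled in turn:

```swift
// Hypothetical helper: split the sample stream into packet-sized slices.
func packets(of size: Int, from samples: [Int16]) -> [ArraySlice<Int16>] {
    stride(from: 0, to: samples.count, by: size).map { start in
        samples[start..<min(start + size, samples.count)]
    }
}

let chunks = packets(of: 512, from: [Int16](repeating: 0, count: 1200))
// -> 3 slices: 512, 512, and 176 samples
```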

answered 2020-03-11T19:29:27.910