
I'm writing an iPhone application that, following Apple's recommendation, uses an I/O audio unit to record and play back audio simultaneously.

I want to apply some audio effects (reverb, etc.) to the recorded audio before playing it back. For those effects to work properly, I need the samples to be floating-point numbers rather than integers. It seems this should be possible by creating an AudioStreamBasicDescription with kAudioFormatFlagIsFloat set on mFormatFlags. This is what my code looks like:

AudioStreamBasicDescription streamDescription;

streamDescription.mSampleRate = 44100.0;
streamDescription.mFormatID = kAudioFormatLinearPCM;
streamDescription.mFormatFlags = kAudioFormatFlagIsFloat;
streamDescription.mBitsPerChannel = 32;
streamDescription.mBytesPerFrame = 4;
streamDescription.mBytesPerPacket = 4;
streamDescription.mChannelsPerFrame = 1;
streamDescription.mFramesPerPacket = 1;
streamDescription.mReserved = 0;

OSStatus status;

status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &streamDescription, sizeof(streamDescription));
if (status != noErr)
  fprintf(stderr, "AudioUnitSetProperty (kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input) returned status %ld\n", (long)status);

status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &streamDescription, sizeof(streamDescription));
if (status != noErr)
  fprintf(stderr, "AudioUnitSetProperty (kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output) returned status %ld\n", (long)status);

However, when I run this (on an iPhone 3GS running iPhoneOS 3.1.3), I get:

AudioUnitSetProperty (kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input) returned error -10868
AudioUnitSetProperty (kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output) returned error -10868

(-10868 is the value of kAudioUnitErr_FormatNotSupported.)

I haven't found anything of value in Apple's documentation, other than the recommendation to stick with 16-bit little-endian integers. However, the aurioTouch sample project contains at least some support code related to kAudioFormatFlagIsFloat.

So, is my stream description incorrect, or is kAudioFormatFlagIsFloat simply not supported on iPhoneOS?


5 Answers


As far as I know, it isn't supported. You can, however, convert to floats fairly easily using an AudioConverter. I do this conversion (in both directions) in real time in order to use the Accelerate framework with iOS audio. (Note: this code was copied and pasted from more modular code, so there may be some minor typos.)

First, you need an AudioStreamBasicDescription for your input. Say:

AudioStreamBasicDescription aBasicDescription = {0};
aBasicDescription.mSampleRate       = self.samplerate;
aBasicDescription.mFormatID         = kAudioFormatLinearPCM;
aBasicDescription.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
aBasicDescription.mFramesPerPacket  = 1;
aBasicDescription.mChannelsPerFrame = 1;
aBasicDescription.mBitsPerChannel   = 8 * sizeof(SInt16);
aBasicDescription.mBytesPerPacket   = sizeof(SInt16) * aBasicDescription.mFramesPerPacket;
aBasicDescription.mBytesPerFrame    = sizeof(SInt16) * aBasicDescription.mChannelsPerFrame;

Then, build a corresponding AudioStreamBasicDescription for float:

AudioStreamBasicDescription floatDesc = {0};
floatDesc.mFormatID         = kAudioFormatLinearPCM;
floatDesc.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
floatDesc.mBitsPerChannel   = 8 * sizeof(float);
floatDesc.mFramesPerPacket  = 1;
floatDesc.mChannelsPerFrame = 1;
floatDesc.mBytesPerPacket   = sizeof(float) * floatDesc.mFramesPerPacket;
floatDesc.mBytesPerFrame    = sizeof(float) * floatDesc.mChannelsPerFrame;
floatDesc.mSampleRate       = [controller samplerate];

Make some buffers:

UInt32 intSize = inNumberFrames * sizeof(SInt16);
UInt32 floatSize = inNumberFrames * sizeof(float);
float *dataBuffer = (float *)calloc(inNumberFrames, sizeof(float));

Then convert. (ioData is the AudioBufferList containing the int audio.)

AudioConverterRef converter;
OSStatus err = noErr;
err = AudioConverterNew(&aBasicDescription, &floatDesc, &converter);
//check for error here in "real" code
err = AudioConverterConvertBuffer(converter, intSize, ioData->mBuffers[0].mData, &floatSize, dataBuffer);
//check for error here in "real" code
//do stuff to dataBuffer, which now contains floats
//convert the floats back by running the conversion the other way
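The scaling AudioConverter performs for this SInt16/float round trip can be sketched in plain C. This is only a sketch, not the actual converter: the 1/32768 mapping and the clamp are assumptions matching the usual signed-16-bit convention.

```c
#include <stddef.h>
#include <stdint.h>

/* Convert 16-bit signed samples to floats in [-1.0, 1.0). */
static void int16_to_float(const int16_t *in, float *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = (float)in[i] / 32768.0f;
}

/* Convert back, clamping to the representable SInt16 range. */
static void float_to_int16(const float *in, int16_t *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        float s = in[i] * 32768.0f;
        if (s > 32767.0f)  s = 32767.0f;
        if (s < -32768.0f) s = -32768.0f;
        out[i] = (int16_t)s;
    }
}
```

Doing it by hand like this (or with vDSP_vflt16/vDSP_vfix16 from Accelerate) avoids the AudioConverter round trip entirely when all you need is the type change.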
Answered 2011-12-01T03:58:15.743

I'm doing something unrelated to AudioUnits, but I do use AudioStreamBasicDescription on iOS. I was able to use float samples by specifying:

dstFormat.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsNonInterleaved | kAudioFormatFlagsNativeEndian | kLinearPCMFormatFlagIsPacked;

The book "Learning Core Audio: A Hands-on Guide to Audio Programming for Mac and iOS" was very helpful for this.

Answered 2013-11-21T02:38:48.650

It is supported.

The problem is that you must also set kAudioFormatFlagIsNonInterleaved in mFormatFlags. If you don't do that when setting kAudioFormatFlagIsFloat, you'll get the format error.

So, you want to do something like this when preparing your AudioStreamBasicDescription:

streamDescription.mFormatFlags = kAudioFormatFlagIsFloat | 
                                 kAudioFormatFlagIsNonInterleaved;

As for why iOS requires this, I'm not sure; I just stumbled upon it through trial and error.
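One consequence of kAudioFormatFlagIsNonInterleaved worth noting: the per-frame byte counts in the ASBD then describe a single channel's buffer, so mBytesPerFrame becomes sizeof(float) regardless of channel count. A minimal sketch of that field arithmetic, using a local stand-in struct so it compiles without the CoreAudio headers (the real type lives in CoreAudioTypes.h):

```c
#include <stdint.h>

/* Local stand-in for AudioStreamBasicDescription, just to show the math. */
typedef struct {
    double   mSampleRate;
    uint32_t mChannelsPerFrame;
    uint32_t mBitsPerChannel;
    uint32_t mBytesPerFrame;
    uint32_t mBytesPerPacket;
    uint32_t mFramesPerPacket;
} StreamDesc;

/* Fill byte counts for NON-interleaved linear PCM float: each buffer in the
   AudioBufferList carries one channel, so the per-frame sizes ignore the
   channel count. */
static void fill_noninterleaved_float(StreamDesc *d, double rate, uint32_t channels) {
    d->mSampleRate       = rate;
    d->mChannelsPerFrame = channels;
    d->mBitsPerChannel   = 8 * sizeof(float);
    d->mFramesPerPacket  = 1;
    d->mBytesPerFrame    = sizeof(float);  /* not channels * sizeof(float) */
    d->mBytesPerPacket   = d->mBytesPerFrame * d->mFramesPerPacket;
}
```

Getting mBytesPerFrame wrong for the non-interleaved case is another common way to earn -10868.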

Answered 2012-11-21T03:47:23.650

From the Core Audio documentation:

kAudioFormatFlagIsFloat
  Set for floating point, clear for integer.
  Available in iPhone OS 2.0 and later.
  Declared in CoreAudioTypes.h.

I don't know enough about your streams to comment on their [in]correctness.

Answered 2010-06-20T20:04:40.770

You can get interleaved floats from RemoteIO with the following ASBD setup:

// STEREO_CHANNEL = 2, defaultSampleRate = 44100
AudioStreamBasicDescription const audioDescription = {
                    .mSampleRate        = defaultSampleRate,
                    .mFormatID          = kAudioFormatLinearPCM,
                    .mFormatFlags       = kAudioFormatFlagIsFloat,
                    .mBytesPerPacket    = STEREO_CHANNEL * sizeof(float),
                    .mFramesPerPacket   = 1,
                    .mBytesPerFrame     = STEREO_CHANNEL * sizeof(float),
                    .mChannelsPerFrame  = STEREO_CHANNEL,
                    .mBitsPerChannel    = 8 * sizeof(float),
                    .mReserved          = 0
                };

This works for me.
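With an interleaved stereo float format like this, each frame stores both channels back to back (L0 R0 L1 R1 ...), so frame i, channel c lives at index i * STEREO_CHANNEL + c. A small sketch of that indexing (plain C, the names are illustrative):

```c
#include <stddef.h>

#define STEREO_CHANNEL 2

/* Read one sample out of an interleaved stereo float buffer:
   samples are laid out L0 R0 L1 R1 ... */
static float sample_at(const float *interleaved, size_t frame, size_t channel) {
    return interleaved[frame * STEREO_CHANNEL + channel];
}
```

This is the layout implied by mBytesPerFrame = STEREO_CHANNEL * sizeof(float) above; with the non-interleaved flag set instead, each channel would arrive in its own buffer of the AudioBufferList.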

Answered 2021-07-17T08:06:15.617