
I'm trying to write an app that does digital signal processing, and I want to keep it as lightweight as possible. One thing that has confused me for a while is what the default stream formats of various devices might be, so that I can avoid unwanted conversions happening before I receive data from a buffer. I came across the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios which put me on what I believe is the right track.

I've extended the code from that link to create and activate an AVAudioSession before reading the ASBD (AudioStreamBasicDescription) contents; the session can then be used to request various "preferred" settings to see what effect they have. I've also combined Apple's code for listing the ASBD values with the code from the link above.
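As a sketch of what I mean by requesting "preferred" settings (the sample rate and buffer duration below are just illustrative values, not anything the defaults require), something like this can be run before reading the ASBD. setPreferredSampleRate: and setPreferredIOBufferDuration: only express a request; the values actually granted can be read back from the session afterwards:

NSError *error = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];

[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];

// Request (rather than demand) a hardware sample rate and I/O buffer duration
[audioSession setPreferredSampleRate:44100.0 error:&error];
[audioSession setPreferredIOBufferDuration:0.005 error:&error];

[audioSession setActive:YES error:&error];

// See what the hardware actually granted
NSLog(@"Granted sample rate:        %f", audioSession.sampleRate);
NSLog(@"Granted IO buffer duration: %f", audioSession.IOBufferDuration);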

The code below was dropped into the ViewController.m file generated by the Single View Application template. Note that you need to add AudioToolbox.framework and CoreAudio.framework to the project's Linked Frameworks and Libraries.

#import "ViewController.h"
@import AVFoundation;
@import AudioUnit;

@interface ViewController ()

@end

@implementation ViewController

- (void) printASBD:(AudioStreamBasicDescription) asbd {
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig (asbd.mFormatID);
    bcopy (&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog (@"  Sample Rate:         %10.0f",  asbd.mSampleRate);
    NSLog (@"  Format ID:           %10s",    formatIDString);
    NSLog (@"  Format Flags:        %10X",    (unsigned int)asbd.mFormatFlags);
    NSLog (@"  Bytes per Packet:    %10d",    (unsigned int)asbd.mBytesPerPacket);
    NSLog (@"  Frames per Packet:   %10d",    (unsigned int)asbd.mFramesPerPacket);
    NSLog (@"  Bytes per Frame:     %10d",    (unsigned int)asbd.mBytesPerFrame);
    NSLog (@"  Channels per Frame:  %10d",    (unsigned int)asbd.mChannelsPerFrame);
    NSLog (@"  Bits per Channel:    %10d",    (unsigned int)asbd.mBitsPerChannel);
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    // Get a reference to the AudioSession and activate it
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];


    // Then get RemoteIO AudioUnit and use it to get the content of the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;

    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format (AudioUnitGetProperty expects a UInt32 size, not size_t)
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription);
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);

    [self printASBD:asbd];
}

@end
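Since the data I ultimately care about comes from the microphone, I assume the same query can also be pointed at the input element (bus 1) of the RemoteIO unit to see the format the app would receive from it. The snippet below is just a sketch that would sit at the end of viewDidLoad above and reuses remoteIOUnit; I haven't verified it across a range of devices:

// Format on the output scope of the input element (bus 1),
// i.e. what the RemoteIO unit would hand the app from the microphone
AudioStreamBasicDescription inputAsbd = {0};
UInt32 inputAsbdSize = sizeof(inputAsbd);
AudioUnitGetProperty(remoteIOUnit,
                     kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output,
                     1,
                     (void *)&inputAsbd,
                     &inputAsbdSize);
[self printASBD:inputAsbd];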

I'd be curious to see the results people get from other actual hardware. Note that the code was built and deployed to iOS 7.1.


1 Answer


The format flags are:

kAudioFormatFlagIsFloat                  = (1 << 0),    // 0x1
kAudioFormatFlagIsBigEndian              = (1 << 1),    // 0x2
kAudioFormatFlagIsSignedInteger          = (1 << 2),    // 0x4
kAudioFormatFlagIsPacked                 = (1 << 3),    // 0x8
kAudioFormatFlagIsAlignedHigh            = (1 << 4),    // 0x10
kAudioFormatFlagIsNonInterleaved         = (1 << 5),    // 0x20
kAudioFormatFlagIsNonMixable             = (1 << 6),    // 0x40
kAudioFormatFlagsAreAllClear             = (1U << 31),   // 0x80000000

The results I get on an iPad 4 are as follows:

Sample Rate:                  0
Format ID:                 lpcm
Format Flags:                29
Bytes per Packet:             4
Frames per Packet:            1
Bytes per Frame:              4
Channels per Frame:           2
Bits per Channel:            32

I guess lpcm (linear pulse-code modulation) is no surprise. Format flags = 0x29 decodes to kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved, which together with 32 bits per channel (and 4 bytes per frame across 2 channels, consistent with the non-interleaved flag) points to packed, non-interleaved 32-bit floating-point samples rather than the 8.24 fixed-point format.
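For what it's worth, a small helper along these lines makes the flag decoding explicit. This is my own sketch rather than part of the code above, and FormatFlagsDescription is just a hypothetical name:

// Turn an mFormatFlags value into a readable list of flag names
static NSString *FormatFlagsDescription(UInt32 flags) {
    NSMutableArray *names = [NSMutableArray array];
    if (flags & kAudioFormatFlagIsFloat)          [names addObject:@"IsFloat"];
    if (flags & kAudioFormatFlagIsBigEndian)      [names addObject:@"IsBigEndian"];
    if (flags & kAudioFormatFlagIsSignedInteger)  [names addObject:@"IsSignedInteger"];
    if (flags & kAudioFormatFlagIsPacked)         [names addObject:@"IsPacked"];
    if (flags & kAudioFormatFlagIsAlignedHigh)    [names addObject:@"IsAlignedHigh"];
    if (flags & kAudioFormatFlagIsNonInterleaved) [names addObject:@"IsNonInterleaved"];
    if (flags & kAudioFormatFlagIsNonMixable)     [names addObject:@"IsNonMixable"];
    return [names componentsJoinedByString:@" | "];
}

// FormatFlagsDescription(0x29) returns @"IsFloat | IsPacked | IsNonInterleaved"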
