I'm trying to write an app that does digital signal processing, and I want to keep it as lightweight as possible. One thing that confused me for a while is what the default stream formats on various devices might be, so that I can avoid unwanted format conversions before receiving data from the buffer. I came across the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios which put me onto what I believe is the right path.
I've extended the code from that link to create and activate an AVAudioSession before getting the contents of the ASBD (AudioStreamBasicDescription); the AudioSession can then be used to request various "preferred" settings to see what effect they have. I've also combined Apple's code for listing the ASBD values with the code from the link above.
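As a sketch of what I mean by requesting "preferred" settings (the specific values here are just illustrative examples, not recommendations), something like this can be done before activating the session, and the granted values read back afterwards:

```objectivec
// Hypothetical example values; ask the session for a preferred format,
// then read back what the hardware actually granted after activation.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setPreferredSampleRate:48000.0 error:&error];
[session setPreferredIOBufferDuration:0.005 error:&error];
[session setActive:YES error:&error];

// The granted values may differ from the preferred ones,
// depending on the hardware.
NSLog(@"Actual sample rate: %f", session.sampleRate);
NSLog(@"Actual IO buffer duration: %f", session.IOBufferDuration);
```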
The code below goes into the ViewController.m file generated by the Single View Application template. Note that you need to add AudioToolbox.framework and CoreAudio.framework to the project's Linked Frameworks and Libraries.
#import "ViewController.h"
@import AVFoundation;
@import AudioUnit;

@interface ViewController ()
@end

@implementation ViewController

- (void)printASBD:(AudioStreamBasicDescription)asbd
{
    // Convert the four-character format ID to a printable C string
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig(asbd.mFormatID);
    bcopy(&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog(@"  Sample Rate:         %10.0f", asbd.mSampleRate);
    NSLog(@"  Format ID:           %10s",   formatIDString);
    NSLog(@"  Format Flags:        %10X",   (unsigned int)asbd.mFormatFlags);
    NSLog(@"  Bytes per Packet:    %10d",   (unsigned int)asbd.mBytesPerPacket);
    NSLog(@"  Frames per Packet:   %10d",   (unsigned int)asbd.mFramesPerPacket);
    NSLog(@"  Bytes per Frame:     %10d",   (unsigned int)asbd.mBytesPerFrame);
    NSLog(@"  Channels per Frame:  %10d",   (unsigned int)asbd.mChannelsPerFrame);
    NSLog(@"  Bits per Channel:    %10d",   (unsigned int)asbd.mBitsPerChannel);
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Get a reference to the shared AudioSession and activate it
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];

    // Then get the RemoteIO AudioUnit and use it to read the contents of
    // the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;
    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Find and instantiate the component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format. Note the size argument must be a UInt32, not a
    // size_t, to match AudioUnitGetProperty's signature (they differ on 64-bit).
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription);
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);
    [self printASBD:asbd];
}

@end
I'd be interested to know what results people get from other actual hardware. Note that the code was built and deployed to iOS 7.1.