My application has a recording callback that is invoked by the RemoteIO AudioUnit framework. The AudioUnit calls the callback on a thread other than the main thread, so that thread has no autorelease pool. In this callback, I do the following:

- Allocate a buffer for the recorded samples.
- Call `AudioUnitRender` to fill this buffer.
- Open a file for logging, using `freopen`.
- Call an audio processing method.
- Depending on the audio, update the UI on the main thread using `performSelectorOnMainThread` (sent to the view controller).
Additionally, I have a function `testFilter()` that performs some benchmarking. I call it in the view controller's `viewDidLoad`, before initializing the audio session, and therefore before the audio callback is invoked for the first time. In this function I allocate a buffer (using `malloc`/`free`) and call the same audio processing method mentioned above.
Now, the problem is (on the device, not on the simulator):
- If I comment out the call to `testFilter()`, I don't get any memory-leak-related messages.
- If I do call `testFilter()`, I start getting a bunch of messages from the audio callback (the first 5 messages are logs from my `testFilter()` method):
2011-01-20 23:05:10.358 TimeKeeper[389:307] initializing buffer...
2011-01-20 23:05:10.693 TimeKeeper[389:307] done...
2011-01-20 23:05:10.696 TimeKeeper[389:307] processing buffer...
2011-01-20 23:05:15.772 TimeKeeper[389:307] done...
2011-01-20 23:05:15.775 TimeKeeper[389:307] elapsed time 5.073843
2011-01-20 23:05:16.319 TimeKeeper[389:660f] *** __NSAutoreleaseNoPool(): Object 0x137330 of class __NSCFData autoreleased with no pool in place - just leaking
2011-01-20 23:05:16.327 TimeKeeper[389:660f] *** __NSAutoreleaseNoPool(): Object 0x1373a0 of class __NSCFData autoreleased with no pool in place - just leaking
And so on. The callback runs on a different thread, as can be seen from the thread identifiers in the log.
Why do these warnings appear only when I call a function that runs and completes before the audio session is even initialized? How can I detect the leak?
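(I am aware that one way to silence the warnings would be to wrap the callback body in its own autorelease pool — a sketch, assuming the callback keeps its current signature, not what my code currently does:

```objc
OSStatus recordingCallback(void                       *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp       *inTimeStamp,
                           UInt32                      inBusNumber,
                           UInt32                      inNumberFrames,
                           AudioBufferList            *ioData) {
    // The AudioUnit thread has no pool of its own, so any object that gets
    // autoreleased inside the callback (e.g. the temporary buffer behind
    // -cStringUsingEncoding:) would otherwise trigger __NSAutoreleaseNoPool().
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // ... existing callback body ...

    [pool drain];
    return noErr;
}
```

But I would still like to understand why the warnings appear only after `testFilter()` has run.)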
Addendum
The relevant methods:
void testFilter() {
#if TARGET_IPHONE_SIMULATOR == 0
    freopen([@"/tmp/console.log" cStringUsingEncoding:NSASCIIStringEncoding], "a", stderr);
#endif
    int bufSize = 2048;
    int numsec = 100;

    OnsetDetector *onsetDetector = [[OnsetDetector alloc] init];
    AudioSampleDataType *buffer = (AudioSampleDataType *)malloc(44100 * numsec * sizeof(AudioSampleDataType)); // numsec seconds of audio @44100

    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = buffer;
    bufferList.mBuffers[0].mDataByteSize = sizeof(AudioSampleDataType) * bufSize;
    bufferList.mBuffers[0].mNumberChannels = 1;

    //--- init buffer
    NSLog(@"\n\n---***---***---");
    NSLog(@"initializing buffer...");
    for (int i = 0; i < 44100 * numsec; ++i) {
        *(buffer + i) = (AudioSampleDataType)rand();
    }
    NSLog(@"done...");

    NSLog(@"processing buffer...");
    CFAbsoluteTime t0 = CFAbsoluteTimeGetCurrent();
    for (int i = 0; (i + 1) * bufSize < 44100 * numsec; ++i) {
        bufferList.mBuffers[0].mData = buffer + i * bufSize;
        [onsetDetector process:&bufferList];
    }
    CFAbsoluteTime t1 = CFAbsoluteTimeGetCurrent();
    NSLog(@"done...");
    NSLog(@"elapsed time %1.6f", (double)(t1 - t0));

    free(buffer);
    [onsetDetector release];
}
and:
OSStatus recordingCallback(void                       *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp       *inTimeStamp,
                           UInt32                      inBusNumber,
                           UInt32                      inNumberFrames,
                           AudioBufferList            *ioData) {
    AudioBufferList bufferList;
    // redundant
    SInt16 *buffer = (SInt16 *)malloc(sizeof(AudioSampleDataType) * inNumberFrames);
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = buffer;
    bufferList.mBuffers[0].mDataByteSize = sizeof(AudioSampleDataType) * inNumberFrames;
    bufferList.mBuffers[0].mNumberChannels = 1;
    ioData = &bufferList;

    // Obtain recorded samples
    OSStatus status;
    MainViewController *mainViewController = (MainViewController *)inRefCon;
    AudioManager *audioManager = [mainViewController audioManager];
    SampleManager *sampleManager = [audioManager sampleManager];

    status = AudioUnitRender([audioManager audioUnit],
                             ioActionFlags,
                             inTimeStamp,
                             inBusNumber, // 1
                             inNumberFrames,
                             ioData);

#if TARGET_IPHONE_SIMULATOR == 0
    freopen([@"/tmp/console.log" cStringUsingEncoding:NSASCIIStringEncoding], "a", stdout);
#endif

    // send to onset detector
    SInt32 onset = [[audioManager onsetDetector] process:ioData];
    if (onset > 0) {
        NSLog(@"onset - %ld\n", sampleManager.recCnt + onset);
        //--- updating the UI - must be done on main thread
        [mainViewController performSelectorOnMainThread:@selector(tapButtonPressed) withObject:nil waitUntilDone:NO];
        [mainViewController performSelectorOnMainThread:@selector(onsetLedOn) withObject:nil waitUntilDone:NO];
    }
    sampleManager.recCnt += inNumberFrames;

    free(buffer);
    return noErr;
}