I have a strange problem playing a sound file (a WAV file) with AVAudioPlayer on the iPhone while the app is in the background. I am using the following code:

        AVAudioPlayer* audioplayer;
        NSError* error = nil;

        audioplayer = [[AVAudioPlayer alloc] initWithData:soundfile error:&error];
        if (!audioplayer) {
            NSLog(@"an error occurred while initializing the audio player...");
            NSLog(@"%@", [error localizedDescription]);
        }
        audioplayer.currentTime = 0;
        if (![audioplayer prepareToPlay])
            NSLog(@"could not preparetoPlay");

        audioplayer.volume = 1.0;

        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
        [[AVAudioSession sharedInstance] setActive: YES error: &error];

        if (![audioplayer play])
            NSLog(@"could not play sound");

        audioplayer.delegate = [myApp sharedInstance];

This works fine while the app is in the foreground. However, when the app is moved to the background, [audioplayer prepareToPlay] returns NO.

This happens both WITH and WITHOUT "App plays audio" added to the "Required background modes". Is there a way to get a more precise error report out of [audioplayer prepareToPlay]? Or do you have any hints about what I am doing wrong or have forgotten?


3 Answers


You need to initialize your audio session before preparing the AVAudioPlayer instance. Ideally, move the audio session calls into your application delegate's didFinishLaunchingWithOptions method, so the session category is already set by the time any player is created.

Answered on 2013-04-29T06:17:37.257

I am not entirely sure this can be achieved with AVFoundation alone; you may need to use the AudioUnit framework and create a stream. Sending the contents of the .WAV file to the audio buffer should be relatively simple.

This is how I did it in Piti Piti Pa. The other benefit is that you get better control over the audio's latency, which helps when synchronizing audio with video animations (most noticeable over Bluetooth).

Here is the code I use to initialize the audio unit:

+(BOOL)_createAudioUnitInstance
{
    // Describe audio component
    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Get audio units
    OSStatus status = AudioComponentInstanceNew(inputComponent, &_audioUnit);
    [self _logStatus:status step:@"instantiate"];
    return (status == noErr );
}

+(BOOL)_setupAudioUnitOutput
{
    UInt32 flag = 1;
    OSStatus status = AudioUnitSetProperty(_audioUnit,
                              kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Output,
                              _outputAudioBus,
                              &flag,
                              sizeof(flag));
    [self _logStatus:status step:@"set output bus"];
    return (status == noErr );
}

+(BOOL)_setupAudioUnitFormat
{
    AudioStreamBasicDescription audioFormat = {0};
    audioFormat.mSampleRate         = 44100.00;
    audioFormat.mFormatID           = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags        = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket    = 1;
    audioFormat.mChannelsPerFrame   = 2;
    audioFormat.mBitsPerChannel     = 16;
    audioFormat.mBytesPerPacket     = 4;
    audioFormat.mBytesPerFrame      = 4;

    OSStatus status = AudioUnitSetProperty(_audioUnit,
                                       kAudioUnitProperty_StreamFormat,
                                       kAudioUnitScope_Input,
                                       _outputAudioBus,
                                       &audioFormat,
                                       sizeof(audioFormat));
    [self _logStatus:status step:@"set audio format"];
    return (status == noErr );
}


+(BOOL)_setupAudioUnitRenderCallback
{
    AURenderCallbackStruct audioCallback;
    audioCallback.inputProc = playbackCallback;
    audioCallback.inputProcRefCon = (__bridge void *)(self);
    OSStatus status = AudioUnitSetProperty(_audioUnit,
                                       kAudioUnitProperty_SetRenderCallback,
                                       kAudioUnitScope_Global,
                                       _outputAudioBus,
                                       &audioCallback,
                                       sizeof(audioCallback));
    [self _logStatus:status step:@"set render callback"];
    return (status == noErr);
}


+(BOOL)_initializeAudioUnit
{
    OSStatus status = AudioUnitInitialize(_audioUnit);
    [self _logStatus:status step:@"initialize"];
    return (status == noErr);
}

+(void)start
{
    [self clearFeeds];
    [self _startAudioUnit];
}

+(void)stop
{
    [self _stopAudioUnit];
}

+(BOOL)_startAudioUnit
{
    OSStatus status = AudioOutputUnitStart(_audioUnit);
    [self _logStatus:status step:@"start"];
    return (status == noErr);
}

+(BOOL)_stopAudioUnit
{
    OSStatus status = AudioOutputUnitStop(_audioUnit);
    [self _logStatus:status step:@"stop"];
    return (status == noErr);
}

+(void)_logStatus:(OSStatus)status step:(NSString *)step
{
    if( status != noErr )
    {
        NSLog(@"AudioUnit failed to %@, error: %d", step, (int)status);
    }
}

#pragma mark - Mixer

static OSStatus playbackCallback(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData) {

    @autoreleasepool {
        AudioBuffer *audioBuffer = ioData->mBuffers;

        _lastPushedFrame = _nextFrame;
        [SIOAudioMixer _generateAudioFrames:inNumberFrames into:audioBuffer->mData];
    }
    return noErr;
}

Now you just need to extract the contents of your .wav files (easier if you export them in RAW format) and send them to the buffer through the callback.

I hope this helps!

Answered on 2015-05-21T03:37:32.023

In the AppDelegate, set the AVAudioSession category like this: (Swift 2)

    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: AVAudioSessionCategoryOptions.MixWithOthers)
    }catch{
        self.fireAnAlert("Set Category Failed", theMessage: "Failed to set AVAudioSession Category")
    }

Setting the option to "Mix With Others" is the important part!

Then, wherever you are going to play sound, make sure you call beginReceivingRemoteControlEvents and then set the AVAudioSession active, like this:

    do{

        UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
        try AVAudioSession.sharedInstance().setActive(true)

    }catch{

        let e = error as NSError

        self.appDelegate?.fireAnAlert("Error", theMessage: "\(e.localizedDescription)")
    }
Answered on 2015-08-21T17:26:55.817