
I've been trying to get music playing in my SpriteKit game, using an AVAudioPlayerNode to play AVAudioPCMBuffers. Every time I export my OS X project, it crashes and gives me an error about audio playback. After banging my head against the wall for the past 24 hours, I decided to rewatch WWDC Session 501 (see 54:17). My solution to this problem is the approach the presenter uses: split the buffer's frames into smaller pieces so the audio file is read in chunks.

NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;

AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
}

My current problem is that the player only plays the last chunk that was read into the buffer. The music I'm playing is only 2 minutes long, which is apparently too long to read straight into a single buffer. Does readIntoBuffer: overwrite the buffer each time it's called inside the loop? I'm very new to this stuff... how can I get the whole file to play?
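
From the session, the idea seems to be that each read gets its own AVAudioPCMBuffer and each buffer is scheduled on the node, roughly like this sketch (player is assumed to be an AVAudioPlayerNode that is already attached to a running AVAudioEngine and connected to its mixer):

AVAudioPlayerNode *player = ...; //assumed: already attached to an engine and connected to its mixer
while (audioFile.framePosition < fileLength) {
    AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
    [player scheduleBuffer: readBuffer completionHandler: nil]; //each chunk keeps its own buffer and is queued in order
}
[player play];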

If I can't get this to work, what would be a good way to play music (2 different files) across multiple SKScenes?


1 Answer


This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.

Overview
My strategy was the following:

  1. Store the audio file names for the game in a plist (a sketch of a possible AudioInfo.plist layout follows this list)
  2. Read from that plist and create two dictionaries (one for music and one for short sound effects)
  3. The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
  4. The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode and the sample rate of the original audio file

    • The sample rate is necessary for figuring out the time at which each buffer should be played (you'll see the calculations done in code)
  5. Create an AVAudioEngine, get the main mixer from the engine, attach all AVAudioPlayerNodes to the engine and connect them to the mixer (as per usual)

  6. Play sound effects or music using their various methods
    • Sound effect playback is straightforward: call -(void) playSfxFile:(NSString*)file; and it plays the sound
    • For music, I couldn't find a good solution without involving the scene that wants to play it. The scene calls -(void) playMusicFile:(NSString*)file; and the buffers are scheduled to play in the order they were created. I couldn't find a good way to make the music repeat once it finished from within my AudioEngine class, so instead the scene checks in its update: method whether the music for a particular file is still playing and, if not, plays it again (not a very slick solution, but it works; see the scene sketch after the AudioEngine.m listing)
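
For reference, AudioInfo.plist could look something like the sketch below. The top-level keys music and sfx are the ones the loading code reads; the entries are file names without extensions (music entries are loaded as .aif, sound effects as .mp3). The two music names appear elsewhere in the code; the sfx names here are just placeholders.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>music</key>
    <array>
        <string>menuscenemusic</string>
        <string>levelscenemusic</string>
    </array>
    <key>sfx</key>
    <array>
        <string>buttonclick</string>
        <string>jump</string>
    </array>
</dict>
</plist>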

AudioEngine.h

#import <Foundation/Foundation.h>

@interface AudioEngine : NSObject

+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;

@end

AudioEngine.m

#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)

@interface AudioEngine()

@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;

@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;

@property NSString *audioInfoPList;

@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;

@end

@implementation AudioEngine

int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";

static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";

+(instancetype) sharedData {
    static AudioEngine *sharedInstance = nil;

    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[self alloc] init];
        [sharedInstance startEngine];
    });

    return sharedInstance;
}

-(instancetype) init {
    if (self = [super init]) {
        _engine = [[AVAudioEngine alloc] init];
        _mixer = [_engine mainMixerNode];

        _audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist

        [self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
        [self initMusic];
        [self initSfx];
    }
    return self;
}

//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
    _musicDict = [NSMutableDictionary dictionary];

    _audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *musicFileName in audioInfoData[@"music"]) {
        [self loadMusicIntoBuffer:musicFileName];
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
        [_engine connect:player to:_mixer format:buffer.format];
        [_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
    }
}

//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
    //NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
    float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
    [_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];

    NSMutableArray *buffers = [NSMutableArray array];
    NSMutableArray *framePositions = [NSMutableArray array];

    const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //frames per buffer; can be made bigger or smaller (512 * 1024L would be half the size)
    while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
        [framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
        AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
        if (![audioFile readIntoBuffer:readBuffer error:&error]) {
            NSLog(@"failed to read audio file: %@", error);
            return;
        }
        if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
            break;
        }
        [buffers addObject:readBuffer];
    }

    [_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
    [_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}

-(void) initSfx {
    _sfxDict = [NSMutableDictionary dictionary];

    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        [self loadSoundIntoBuffer:sfxFileName];
        AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
        [_engine connect:player to:_mixer format:buffer.format];
        [_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
    }
}

//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"mp3"]; //use a proper file URL, as in loadMusicIntoBuffer:
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
    if (![audioFile readIntoBuffer:readBuffer error:&error]) {
        NSLog(@"failed to read sfx file: %@", error);
    }

    [_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}

-(void)startEngine {
    NSError *error = nil;
    if (![_engine startAndReturnError:&error]) {
        NSLog(@"error starting AVAudioEngine: %@", error);
    }
}

-(void) playSfxFile:(NSString*)file {
    AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
    AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
    [player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
    [player setVolume:_sfxVolumePercent];
    [player play];
}

-(void) playMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying] == NO) {
        NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];

        double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
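        //each buffer i was read starting at frame framePositions[i] of the original file;
        //scheduling it with timeWithSampleTime:atRate: makes the player start that buffer at the
        //matching sample offset after play is called, so the chunks line up back to back
        //(e.g. at 44.1 kHz, a frame position of 1048576 starts ~23.8 seconds in)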


        for (int i = 0; i < [buffers count]; i++) {
            long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
            AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];

            AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
            [player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
                if (i == [buffers count] - 1) {
                    [player stop];
                }
            }];
        }
        [player setVolume:_musicVolumePercent];
        [player play];
    }
}

-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
    if ([file isEqualToString:@"menuscenemusic"]) {
        AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
    else {
        AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
}

//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying]) {
        _timerCount = FADE_ITERATIONS;
        _fadeVolume = _musicVolumePercent;
        [self fadeOutMusicForPlayer:player]; //fade out the music
    }
}

//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}

//helper method for stopMusicFile:
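//called every 0.1 s; with FADE_ITERATIONS = 10 the volume steps down to zero over roughly one second,
//after which the player is stopped and its volume is restored for the next play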
-(void) handleTimer:(NSTimer*)timer {
    AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
    if (_timerCount > 0) {
        _timerCount--;
        _fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
        [player setVolume:_fadeVolume];
    }
    else {
        [player stop];
        [player setVolume:_musicVolumePercent];
        [timer invalidate];
    }
}

-(void) pauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    if ([player isPlaying]) {
        [player pause];
    }
}

-(void) unpauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    [player play];
}

//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
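    //strip everything but the digits from the stored setting string and convert it to a 0-1 fraction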
    NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
    _musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
                             [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                            componentsJoinedByString:@""] floatValue] / 100;
    NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
    _sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
                           [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                          componentsJoinedByString:@""] floatValue] / 100;

    //immediately sets music to new volume
    for (NSString *file in [_musicDict allKeys]) {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        [player setVolume:_musicVolumePercent];
    }
}

-(bool) isPlayingMusic:(NSString *)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    return [player isPlaying];
}

@end
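
Finally, a rough sketch of how a scene drives this class, as described in the overview (the MenuScene class and the @"buttonclick" file name are placeholders; @"menuscenemusic" is one of the music files referenced above; mouseDown: is the OS X input handler):

#import <SpriteKit/SpriteKit.h>
#import "AudioEngine.h"

//placeholder scene used only to illustrate the calls
@interface MenuScene : SKScene
@end

@implementation MenuScene

-(void) didMoveToView:(SKView *)view {
    //start this scene's music as soon as the scene is presented
    [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
}

-(void) mouseDown:(NSEvent *)theEvent {
    //short sound effects can be fired from any event handler
    [[AudioEngine sharedData] playSfxFile:@"buttonclick"];
}

-(void) update:(NSTimeInterval)currentTime {
    //the repeat check described above: if the music has finished, start it again
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}

@end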