
I want to loop a local audio file in my Apple Watch app. Currently I'm using an AVAudioPlayerNode and AVAudioEngine, which works well, but I can't figure out how to loop the sound.

I noticed that I could use AVAudioPlayer, which has the convenient `numberOfLoops` property, but for some reason AVAudioPlayer doesn't work on the watch, and I don't know why.

Here's my current code for playing a sound:

// Create the engine and player node, and attach the node to the engine.
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];

AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];

if (!_audioEngine.isRunning) {
    NSError *error;
    if (![_audioEngine startAndReturnError:&error]) {
        NSLog(@"Engine start error: %@", error);
    }
}

NSError *error;
NSBundle* appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:@"FILE_NAME" ofType:@"mp3"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];

[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];

Here's the code I tried to use for AVAudioPlayer, but it doesn't work:

NSError *audioError;
AVAudioPlayer* player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"FILE_NAME" ofType:@"mp3"]] error:&audioError];
player.numberOfLoops = -1; // numberOfLoops is an NSInteger; -1 means loop indefinitely (MAXFLOAT is wrong here)
player.delegate = self;
[player play];

I'm using WatchKit 5.0(+).


2 Answers


You can loop your AVAudioFile by scheduling it recursively:

// Hold the block weakly inside itself to avoid a retain cycle:
// the completion handler re-schedules the file each time playback of it ends.
__block __weak void (^weakScheduleFile)(void);
void (^scheduleFile)(void);

weakScheduleFile = scheduleFile = ^{
    [self->_audioPlayer scheduleFile:asset atTime:nil completionHandler:weakScheduleFile];
};

scheduleFile();

I'm not sure whether this will be a seamless loop. If it isn't, you can try keeping two files scheduled at all times:

scheduleFile();
scheduleFile();
answered 2018-11-19T22:25:16.723

If your audio file fits in memory, you can schedule an AVAudioBuffer for playback with the AVAudioPlayerNodeBufferLoops option (NB: only tested on the simulator!):

// Read the whole file into a PCM buffer, convert it to the player's
// output format, then schedule it with the Loops option.
AVAudioFormat *outputFormat = [_audioPlayer outputFormatForBus:0];

__block AVAudioPCMBuffer *srcBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:asset.processingFormat frameCapacity:(AVAudioFrameCount)asset.length];

if (![asset readIntoBuffer:srcBuffer error:&error]) {
    NSLog(@"Read error: %@", error);
    abort();
}

AVAudioPCMBuffer *dstBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)asset.length];

AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:srcBuffer.format toFormat:dstBuffer.format];
AVAudioConverterOutputStatus status = [converter convertToBuffer:dstBuffer error:&error withInputFromBlock:^AVAudioBuffer * _Nullable(AVAudioPacketCount inNumberOfPackets, AVAudioConverterInputStatus * _Nonnull outStatus) {
    if (srcBuffer) {
        // Hand the source buffer over exactly once, then signal end of stream.
        AVAudioBuffer *result = srcBuffer;
        srcBuffer = nil;
        *outStatus = AVAudioConverterInputStatus_HaveData;
        return result;
    } else {
        *outStatus = AVAudioConverterInputStatus_EndOfStream;
        return nil;
    }
}];

assert(status != AVAudioConverterOutputStatus_Error);

[_audioPlayer scheduleBuffer:dstBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
[_audioPlayer play];
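
To end the loop later, stopping the player node discards anything still scheduled. A minimal sketch, assuming the same `_audioPlayer` and `_audioEngine` instance variables as in the question:

    // Stop playback; any scheduled (looping) buffers are dropped.
    [_audioPlayer stop];

    // Optionally pause the engine to save power on the watch.
    [_audioEngine pause];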
answered 2018-11-19T13:48:12.863