
I'm developing an iOS audio app in Xcode, and I'm trying to take 2 audio files I recorded, play them at the same time, and export them as a single audio file. All I've managed so far is to merge the 2 audio files into one, but the 2 recordings play one after the other instead of in sync. Does anyone know how I can fix this? Thanks.


3 Answers


Try Apple's MixerHost sample app.

Answered 2012-06-24T14:18:27.057

You should take a look at this AAC conversion write-up (http://atastypixel.com/blog/easy-aac-compressed-audio-conversion-on-ios/). It's super useful.

Another thing you might want to consider: combining two audio signals is as simple as adding their samples together. So what you can do is:

Open both recordings and get an array of audio samples for each one.

Create a for() loop that adds each pair of samples and writes the result into an output array:

for (int i = 0; i < numberOfSamples; i++) {
    // Sum the corresponding samples from both tracks into the output buffer.
    exportBuffer[i] = firstTrack[i] + secondTrack[i];
}

Then write the exportBuffer to an m4a file.
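One way to do that writing step is with Extended Audio File Services, which converts linear PCM to AAC as it writes. Below is a minimal sketch, assuming mono 16-bit PCM at 44.1 kHz and skipping error handling; the function name and parameters are illustrative, not part of the original answer:

#import <AudioToolbox/AudioToolbox.h>

// Hypothetical helper: writes a mono 16-bit PCM buffer out as an AAC .m4a file.
// Adjust sample rate / channel count to match your recordings.
static void WritePCMToM4A(const int16_t *samples, UInt32 frameCount, NSURL *outputURL)
{
    // Destination (encoded) format: AAC; the codec fills in the remaining fields.
    AudioStreamBasicDescription dstFormat = {0};
    dstFormat.mSampleRate       = 44100.0;
    dstFormat.mFormatID         = kAudioFormatMPEG4AAC;
    dstFormat.mChannelsPerFrame = 1;

    ExtAudioFileRef file = NULL;
    ExtAudioFileCreateWithURL((__bridge CFURLRef)outputURL, kAudioFileM4AType,
                              &dstFormat, NULL, kAudioFileFlags_EraseFile, &file);

    // Client (source) format: the 16-bit integer PCM we have in memory.
    AudioStreamBasicDescription srcFormat = {0};
    srcFormat.mSampleRate       = 44100.0;
    srcFormat.mFormatID         = kAudioFormatLinearPCM;
    srcFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    srcFormat.mBitsPerChannel   = 16;
    srcFormat.mChannelsPerFrame = 1;
    srcFormat.mFramesPerPacket  = 1;
    srcFormat.mBytesPerFrame    = 2;
    srcFormat.mBytesPerPacket   = 2;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(srcFormat), &srcFormat);

    // Hand the whole PCM buffer to the file; it is encoded to AAC on write.
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = frameCount * sizeof(int16_t);
    bufferList.mBuffers[0].mData           = (void *)samples;

    ExtAudioFileWrite(file, frameCount, &bufferList);
    ExtAudioFileDispose(file);
}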

The summing loop above only works as-is when both files are exactly the same length, so adjust it to your needs. You'll need a condition for when you've reached the end of one of the arrays; in that case, just add 0.
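Here's a minimal sketch of the loop extended along those lines, assuming both buffers hold signed 16-bit PCM samples (the names firstTrack/secondTrack/firstLength/secondLength/exportBuffer are placeholders):

#include <stdint.h>
#include <stddef.h>

// Hypothetical helper: mixes two possibly different-length 16-bit PCM buffers.
// exportBuffer must hold max(firstLength, secondLength) samples.
void MixTracks(const int16_t *firstTrack, size_t firstLength,
               const int16_t *secondTrack, size_t secondLength,
               int16_t *exportBuffer)
{
    size_t total = (firstLength > secondLength) ? firstLength : secondLength;
    for (size_t i = 0; i < total; i++) {
        // A track that has already ended contributes silence (0).
        int32_t a = (i < firstLength)  ? firstTrack[i]  : 0;
        int32_t b = (i < secondLength) ? secondTrack[i] : 0;
        int32_t sum = a + b;
        // Clamp to the 16-bit range to avoid wrap-around distortion.
        if (sum > INT16_MAX) sum = INT16_MAX;
        if (sum < INT16_MIN) sum = INT16_MIN;
        exportBuffer[i] = (int16_t)sum;
    }
}

Clamping keeps loud passages from wrapping around when the two samples sum past full scale; scaling each input by 0.5 before adding is another common way to leave headroom.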

Answered 2012-06-25T06:21:09.987

/* Use this method if you have already saved the recorded audio files */

-(void)mixAudio {
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    // First audio track: insert the whole first recording starting at time zero.
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack setPreferredVolume:0.8];
    NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"RecordAudio1" ofType:@"wav"];
    NSURL *url = [NSURL fileURLWithPath:soundOne];
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *clipAudioTrack = [tracks objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];

    // Second audio track: also inserted at kCMTimeZero, so both recordings play simultaneously.
    AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack1 setPreferredVolume:0.8];
    NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"RecordAudio2" ofType:@"wav"];
    NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
    AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
    NSArray *tracks1 = [avAsset1 tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *clipAudioTrack1 = [tracks1 objectAtIndex:0];
    [compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime:kCMTimeZero error:nil];

    AVAssetExportSession *exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (nil == exportSession) return;

    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
    //NSLog(@"Output file path - %@", soundOneNew);

    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A;             // output file type

    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control
            // for example, an interruption like a phone call coming in
            // make sure to handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
}
Answered 2014-05-02T14:20:55.717