
I'm developing a collaborative audio recording platform for musicians (something like a cloud DAW married with GitHub). In a nutshell, a session (song) is made up of a series of audio tracks, encoded in AAC and played through HTML5 <audio> elements. Each track is connected to the Web Audio API through a MediaElementAudioSourceNode and routed through a series of nodes (gain and pan, at the moment) to the destination. So far so good.
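The per-track routing described above looks roughly like this (a sketch; the element id is illustrative, and StereoPannerNode is the modern way to pan, though an older PannerNode would also work):

```javascript
// Route an existing <audio> element through gain and pan to the output.
// Assumes an <audio id="track1"> element exists in the page.
const audioCtx = new AudioContext();
const el = document.getElementById('track1');

const source = audioCtx.createMediaElementSource(el); // MediaElementAudioSourceNode
const gain = audioCtx.createGain();                   // per-track volume
const pan = audioCtx.createStereoPanner();            // per-track pan (-1 .. 1)

source.connect(gain);
gain.connect(pan);
pan.connect(audioCtx.destination);

gain.gain.value = 0.8;   // example settings
pan.pan.value = -0.25;
```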

I am able to play them in sync, pause, stop, and seek with no problems at all, and have successfully implemented the usual mute and solo functionality of a typical DAW, as well as waveform visualization and navigation. This is the playback part.
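The mute/solo and synchronized-seek logic can be sketched as small helpers over a hypothetical `tracks` array of `{ element, gainNode }` pairs (names are illustrative, not from the original code):

```javascript
// tracks: [{ element: HTMLAudioElement, gainNode: GainNode }, ...]

// Seek every track to the same position.
function seekAll(tracks, timeSeconds) {
  tracks.forEach(t => { t.element.currentTime = timeSeconds; });
}

// Mute a track by zeroing its gain instead of pausing the element,
// so all elements keep advancing in lockstep.
function setMute(track, muted) {
  track.gainNode.gain.value = muted ? 0 : 1;
}

// Solo: mute everything except the chosen track.
function setSolo(tracks, soloTrack) {
  tracks.forEach(t => setMute(t, t !== soloTrack));
}
```

Muting via gain rather than pausing keeps the elements' clocks aligned, which is why mute/solo does not disturb sync.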

As for the recording part, I connect the output of getUserMedia() to a MediaStreamAudioSourceNode, which is then routed to a ScriptProcessorNode that writes the recorded buffer to an array via a web worker. When recording ends, the recorded buffer is written into a PCM WAV file and uploaded to the server, but at the same time it is hooked up to an <audio> element for immediate playback (otherwise I would have to wait for the WAV file to finish uploading before it was available).
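The capture side of that pipeline looks roughly like this (a sketch; buffer size and channel count are assumptions, and ScriptProcessorNode has since been deprecated in favour of AudioWorklet, but it is what the setup above describes):

```javascript
// Capture microphone input and accumulate raw PCM samples for a WAV file.
const audioCtx = new AudioContext();
const chunks = []; // Float32Array chunks, later concatenated into a WAV file

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  const micSource = audioCtx.createMediaStreamSource(stream);
  const processor = audioCtx.createScriptProcessor(4096, 1, 1);

  processor.onaudioprocess = e => {
    // Copy the samples: the underlying buffer is reused between callbacks.
    chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
  };

  micSource.connect(processor);
  processor.connect(audioCtx.destination); // keeps the node running in some browsers
});
```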

Here is the problem: I can play the recorded track in perfect sync with the others if I play them from the beginning, but I can't seek properly. If I change the currentTime property of the newly recorded track, playback becomes messy and terribly out of sync. To be clear, this happens only when the "local" track is added; the other tracks behave just fine when I change their position.

Does anyone have any idea of what may be causing this? Is there any other useful information I can provide?

Thank you in advance.


1 Answer


Fundamentally, there is no guarantee that media elements will stay correctly synchronized. If you really want the audio in sync, you have to load the audio files into AudioBuffers and play them with BufferSourceNodes.

You will find that in some relatively simple cases you can get them to sync, but it won't necessarily work across devices and operating systems, and as soon as you start trying to seek, as you discovered, it falls apart. The way <audio> bundles downloading, decoding, and playback into a single step does not lend itself to synchronization.
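The buffer-based approach the answer recommends looks roughly like this (a sketch using the modern promise-based forms of fetch and decodeAudioData, which postdate this 2013 answer; function names are illustrative). Note that AudioBufferSourceNodes are one-shot: "seeking" means stopping the old sources and starting fresh ones with an offset passed to start():

```javascript
const audioCtx = new AudioContext();

// Download and decode one track into an AudioBuffer.
async function loadTrack(url) {
  const resp = await fetch(url);
  const data = await resp.arrayBuffer();
  return audioCtx.decodeAudioData(data);
}

// Start every buffer at the same AudioContext time, from a given offset.
// To seek, call stop() on the returned sources and call playAll again.
function playAll(buffers, offsetSeconds) {
  const startAt = audioCtx.currentTime + 0.1; // small scheduling margin
  return buffers.map(buffer => {
    const src = audioCtx.createBufferSource();
    src.buffer = buffer;
    src.connect(audioCtx.destination);
    src.start(startAt, offsetSeconds);
    return src;
  });
}
```

Because every source is scheduled against the same AudioContext clock, all tracks start sample-accurately together regardless of the offset.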

Answered 2013-08-10T17:04:55.527