
I'm using WaveSurfer.js in my project, where users can edit audio. For this I use the Regions plugin.

When the user clicks the Done button, I want to export the result as an audio file (mp3/wav).

To get the audio peaks of the region the user selected, I do this:

var json = wavesurfer.backend.getPeaks(960,  wavesurfer.regions.list["wavesurfer_j99v7ophop8"].start, wavesurfer.regions.list["wavesurfer_j99v7ophop8"].end)

This works, but I want to export the selection as an audio file rather than JSON.

Thanks in advance.


1 Answer


See this answer; you can create a WAV audio file from a buffer.

So the code and approach could look like this:

Method:

// Convert an audio-buffer segment to a WAV file and return it as an Object URL.
// The returned Object URL can be set directly as the source of an Audio element.
// (C) Ken Fyrstenberg / MIT license
function bufferToWave(abuffer, offset, len) {

  var numOfChan = abuffer.numberOfChannels,
      length = len * numOfChan * 2 + 44,
      buffer = new ArrayBuffer(length),
      view = new DataView(buffer),
      channels = [], i, sample,
      pos = 0;

  // write WAVE header
  setUint32(0x46464952);                         // "RIFF"
  setUint32(length - 8);                         // file length - 8
  setUint32(0x45564157);                         // "WAVE"

  setUint32(0x20746d66);                         // "fmt " chunk
  setUint32(16);                                 // length = 16
  setUint16(1);                                  // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(abuffer.sampleRate);
  setUint32(abuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2);                      // block-align
  setUint16(16);                                 // 16-bit (hardcoded in this demo)

  setUint32(0x61746164);                         // "data" - chunk
  setUint32(length - pos - 4);                   // chunk length

  // write interleaved data
  for(i = 0; i < abuffer.numberOfChannels; i++)
    channels.push(abuffer.getChannelData(i));

  while(pos < length) {
    for(i = 0; i < numOfChan; i++) {             // interleave channels
      sample = Math.max(-1, Math.min(1, channels[i][offset])); // clamp
      sample = (0.5 + sample < 0 ? sample * 32768 : sample * 32767)|0; // scale to 16-bit signed int
      view.setInt16(pos, sample, true);          // update data chunk
      pos += 2;
    }
    offset++                                     // next source sample
  }

  // create Blob
  return (URL || webkitURL).createObjectURL(new Blob([buffer], {type: "audio/wav"}));

  function setUint16(data) {
    view.setUint16(pos, data, true);
    pos += 2;
  }

  function setUint32(data) {
    view.setUint32(pos, data, true);
    pos += 4;
  }
}
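Note that the function returns an Object URL rather than the Blob itself, so it can be assigned straight to an audio element's src or handed to a temporary download link. A minimal sketch of the download step (the helper name and file name are just examples, not part of WaveSurfer):

// Offer a generated WAV Object URL as a file download via a temporary link
function downloadWave(objectUrl, fileName) {
  var a = document.createElement("a");
  a.href = objectUrl;
  a.download = fileName || "export.wav";
  document.body.appendChild(a);
  a.click();
  document.body.removeChild(a);
  // optionally call URL.revokeObjectURL(objectUrl) later to free the memory
}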

Usage:

// bufferToWave returns an Object URL for the WAV data; here the whole buffer is exported
let wavUrl = bufferToWave(app.engine.wavesurfer.backend.buffer, 0, app.engine.wavesurfer.backend.buffer.length);
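
To export only the selected region instead of the whole track, convert the region's start/end times (in seconds) to sample offsets first. A rough sketch, assuming wavesurfer.backend.buffer holds the decoded AudioBuffer and reusing the region id from the question (downloadWave is the hypothetical helper sketched above):

// Export just the selected region as a WAV file
var buffer = wavesurfer.backend.buffer;
var region = wavesurfer.regions.list["wavesurfer_j99v7ophop8"];
var startSample = Math.floor(region.start * buffer.sampleRate);
var endSample   = Math.floor(region.end * buffer.sampleRate);
var regionUrl   = bufferToWave(buffer, startSample, endSample - startSample);
downloadWave(regionUrl, "region.wav");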