
Note:

If you end up here, you probably want to check out shaka-player and the accompanying shaka-streamer. Use it. Don't implement this yourself unless you really have to.

I have been trying for quite some time now to be able to play audio tracks on Chrome, Firefox, Safari, etc., but I keep hitting a brick wall. My current problem is that I am unable to seek within a fragmented MP4 (or MP3).

At the moment, I am converting audio files such as MP3 to fragmented MP4 (fMP4) and sending them chunk-wise to the client. I define a CHUNK_DURATION_SEC (chunk duration in seconds) and compute a chunk size like this:

chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

With this, I partition the audio file and can fetch each chunk by jumping exactly chunkSize bytes into the file:

-----------------------------------------
| chunk 1 | chunk 2 |   ...   | chunk n |
-----------------------------------------
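
As a quick sanity check, with made-up numbers (say a 180-second track of 7,372,800 bytes), the math works out like this:

// Hypothetical numbers, just to illustrate the partitioning:
const duration = 180;          // track duration in seconds
const fileSize = 7372800;      // track size in bytes

const chunksTotal = Math.ceil(duration / CHUNK_DURATION_SEC); // ceil(180 / 20) = 9
const chunkSize = Math.ceil(fileSize / chunksTotal);          // ceil(7372800 / 9) = 819200

// Chunk i then covers the byte range [i * chunkSize, (i + 1) * chunkSize - 1],
// which is exactly what the Range header below requests.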

How the audio files are converted to fMP4

ffmpeg -i input.mp3 -acodec aac -b:a 256k -f mp4 \
       -movflags faststart+frag_every_frame+empty_moov+default_base_moof \
        output.mp4

This seems to work with Chrome and Firefox (so far).
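
Support can also be probed up front with MediaSource.isTypeSupported() before creating a SourceBuffer. A quick check (the two MIME types are the ones used in this post, including the audio/mpeg variant from the MP3 attempt further down):

const mimeTypes = [
  'audio/mp4; codecs="mp4a.40.2"', // the fMP4 type used below
  'audio/mpeg',                    // the MP3 type tried further down
];
for (const type of mimeTypes) {
  console.log(`${type}: ${MediaSource.isTypeSupported(type)}`);
}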

How chunks are appended

After following this example, and realizing that it simply does not work as explained there, I threw it away and started from scratch. Unfortunately without success. It is still not working.

The following code is supposed to play a track from beginning to end. However, I also need to be able to seek. So far, that does not work at all: seeking simply stops the audio after the seeking event has been triggered.

The code

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { BehaviorSubject } from 'rxjs';
import { NGXLogger } from 'ngx-logger';

import { environment } from '../../environments/environment';
// App-specific imports; the paths here are placeholders.
import { Track } from '../models/track';
import { AuthenticationService } from './authentication.service';

/* Desired chunk duration in seconds. */
const CHUNK_DURATION_SEC = 20;

const AUDIO_EVENTS = [
  'ended',
  'error',
  'play',
  'playing',
  'seeking',
  'seeked',
  'pause',
  'timeupdate',
  'canplay',
  'loadedmetadata',
  'loadstart',
  'updateend', // NOTE: 'updateend' fires on SourceBuffer, not on the <audio> element
];


class ChunksLoader {

  /** The total number of chunks for the track. */
  public readonly chunksTotal: number;

  /** The length of one chunk in bytes */
  public readonly chunkSize: number;

  /** Keeps track of requested chunks. */
  private readonly requested: boolean[];

  /** URL of endpoint for fetching audio chunks. */
  private readonly url: string;

  constructor(
    private track: Track,
    private sourceBuffer: SourceBuffer,
    private logger: NGXLogger,
  ) {

    this.chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
    this.chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

    this.requested = [];
    for (let i = 0; i < this.chunksTotal; i++) {
      this.requested[i] = false;
    }

    this.url = `${environment.apiBaseUrl}/api/tracks/${this.track.id}/play`;
  }

  /**
   * Fetch the first chunk.
   */
  public begin() {
    this.maybeFetchChunk(0);
  }

  /**
   * Handler for the "timeupdate" event. Checks if the next chunk should be fetched.
   *
   * @param currentTime
   *  The current time of the track which is currently played.
   */
  public handleOnTimeUpdate(currentTime: number) {

    const nextChunkIndex = Math.floor(currentTime / CHUNK_DURATION_SEC) + 1;
    const hasAllChunks = this.requested.every(val => !!val);

    // NOTE: endOfStream() is only logged here; it is never actually called
    // (ChunksLoader has no reference to the MediaSource).
    if (nextChunkIndex === (this.chunksTotal - 1) && hasAllChunks) {
      this.logger.debug('Last chunk. Calling mediaSource.endOfStream();');
      return;
    }

    if (this.requested[nextChunkIndex] === true) {
      return;
    }

    // Wait until playback is at least 25% into the current chunk before
    // prefetching the next one.
    if (currentTime < CHUNK_DURATION_SEC * (nextChunkIndex - 1 + 0.25)) {
      return;
    }

    this.maybeFetchChunk(nextChunkIndex);
  }

  /**
   * Fetches the chunk if it hasn't been requested yet. After the request finished, the returned
   * chunk gets appended to the SourceBuffer-instance.
   *
   * @param chunkIndex
   *  The chunk to fetch.
   */
  private maybeFetchChunk(chunkIndex: number) {

    const start = chunkIndex * this.chunkSize;
    const end = start + this.chunkSize - 1;

    if (this.requested[chunkIndex] === true) {
      return;
    }

    this.requested[chunkIndex] = true;

    if ((end - start) === 0) {
      this.logger.warn('Nothing to fetch.');
      return;
    }

    const totalKb = ((end - start) / 1000).toFixed(2);
    this.logger.debug(`Starting to fetch bytes ${start} to ${end} (total ${totalKb} kB). Chunk ${chunkIndex + 1} of ${this.chunksTotal}`);

    const xhr = new XMLHttpRequest();
    xhr.open('get', this.url);
    xhr.setRequestHeader('Authorization', `Bearer ${AuthenticationService.getJwtToken()}`);
    xhr.setRequestHeader('Range', 'bytes=' + start + '-' + end);
    xhr.responseType = 'arraybuffer';
    xhr.onload = () => {
      this.logger.debug(`Range ${start} to ${end} fetched`);
      this.logger.debug(`Requested size:        ${end - start + 1}`);
      this.logger.debug(`Fetched size:          ${xhr.response.byteLength}`);
      this.logger.debug('Appending chunk to SourceBuffer.');
      this.sourceBuffer.appendBuffer(xhr.response);
    };
    xhr.send();
  };

}

export enum StreamStatus {
  NOT_INITIALIZED,
  INITIALIZING,
  PLAYING,
  SEEKING,
  PAUSED,
  STOPPED,
  ERROR
}

export class PlayerState {
  status: StreamStatus = StreamStatus.NOT_INITIALIZED;
}


/**
 *
 */
@Injectable({
  providedIn: 'root'
})
export class MediaSourcePlayerService {

  public track: Track;

  private mediaSource: MediaSource;

  private sourceBuffer: SourceBuffer;

  private audioObj: HTMLAudioElement;

  private chunksLoader: ChunksLoader;

  private state: PlayerState = new PlayerState();

  private state$ = new BehaviorSubject<PlayerState>(this.state);

  public stateChange = this.state$.asObservable();

  private currentTime$ = new BehaviorSubject<number>(null);

  public currentTimeChange = this.currentTime$.asObservable();

  constructor(
    private httpClient: HttpClient,
    private logger: NGXLogger
  ) {
  }

  get canPlay() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PAUSED;
  }

  get canPause() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PLAYING || status == StreamStatus.SEEKING;
  }

  public playTrack(track: Track) {
    this.logger.debug('playTrack');
    this.track = track;
    this.startPlayingFrom(0);
  }

  public play() {
    this.logger.debug('play()');
    this.audioObj.play().then();
  }

  public pause() {
    this.logger.debug('pause()');
    this.audioObj.pause();
  }

  public stop() {
    this.logger.debug('stop()');
    this.audioObj.pause();
  }

  public seek(seconds: number) {
    this.logger.debug('seek()');
    this.audioObj.currentTime = seconds;
  }

  private startPlayingFrom(seconds: number) {
    this.logger.info(`Start playing from ${seconds.toFixed(2)} seconds`);
    this.mediaSource = new MediaSource();
    this.mediaSource.addEventListener('sourceopen', this.onSourceOpen);

    this.audioObj = document.createElement('audio');
    this.addEvents(this.audioObj, AUDIO_EVENTS, this.handleEvent);
    this.audioObj.src = URL.createObjectURL(this.mediaSource);

    this.audioObj.play().then();
  }

  private onSourceOpen = () => {

    this.logger.debug('onSourceOpen');

    this.mediaSource.removeEventListener('sourceopen', this.onSourceOpen);
    this.mediaSource.duration = this.track.duration;

    this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
    // this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mpeg');

    this.chunksLoader = new ChunksLoader(
      this.track,
      this.sourceBuffer,
      this.logger
    );

    this.chunksLoader.begin();
  };

  private handleEvent = (e) => {

    const currentTime = this.audioObj.currentTime.toFixed(2);
    const totalDuration = this.track.duration.toFixed(2);
    this.logger.warn(`MediaSource event: ${e.type} (${currentTime} of ${totalDuration} sec)`);

    this.currentTime$.next(this.audioObj.currentTime);

    const currentStatus = this.state$.getValue();

    switch (e.type) {
      case 'playing':
        currentStatus.status = StreamStatus.PLAYING;
        this.state$.next(currentStatus);
        break;
      case 'pause':
        currentStatus.status = StreamStatus.PAUSED;
        this.state$.next(currentStatus);
        break;
      case 'timeupdate':
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
      case 'seeking':
        currentStatus.status = StreamStatus.SEEKING;
        this.state$.next(currentStatus);
        if (this.mediaSource.readyState === 'open') {
          this.sourceBuffer.abort();
        }
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
    }
  };

  private addEvents(obj, events, handler) {
    events.forEach(event => obj.addEventListener(event, handler));
  }

}
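
For context, this is roughly how the service is driven (a sketch; the component and the Track instance are app-specific):

import { Component } from '@angular/core';
// Placeholder paths:
import { MediaSourcePlayerService } from './media-source-player.service';
import { Track } from '../models/track';

@Component({ selector: 'app-player-demo', template: '' })
export class PlayerDemoComponent {

  constructor(private player: MediaSourcePlayerService) {}

  playAndSeek(track: Track) {
    this.player.playTrack(track);
    this.player.stateChange.subscribe(state => console.log('state:', state.status));
    // Seeking is exactly what breaks for me:
    this.player.seek(45);
  }
}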

Running it gives me the following output:

[Screenshot: Chrome console log of the audio events and chunk fetches]

Apologies for the screenshot, but it is not possible to just copy the output in Chrome without all the stack traces.

I also tried following this example and calling sourceBuffer.abort(), but that did not work. It looks more like a hack that used to work years ago but is still referenced in the documentation (see "Examples" -> "You can see something similar in action in Nick Desaulniers' bufferWhenNeeded demo .. "):

case 'seeking':
  currentStatus.status = StreamStatus.SEEKING;
  this.state$.next(currentStatus);        
  if (this.mediaSource.readyState === 'open') {
    this.sourceBuffer.abort();
  } 
  break;
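
For completeness, this is what I would expect a working seek handler to look like, sketched against the classes above. It assumes time maps linearly onto bytes, which is exactly the assumption my byte-based chunking makes (and which fMP4 may well not satisfy):

// Sketch: decide what to do on 'seeking', given access to the pieces above.
function handleSeeking(
  targetTime: number,
  mediaSource: MediaSource,
  sourceBuffer: SourceBuffer,
  chunksLoader: ChunksLoader,
) {
  // If the seek target is already buffered, the browser can resume on its own.
  const buffered = sourceBuffer.buffered;
  for (let i = 0; i < buffered.length; i++) {
    if (targetTime >= buffered.start(i) && targetTime <= buffered.end(i)) {
      return;
    }
  }
  // Not buffered: abort a possibly pending append before requesting new data.
  if (mediaSource.readyState === 'open') {
    sourceBuffer.abort();
  }
  // Fetch the chunk that should contain the target time.
  const chunkIndex = Math.floor(targetTime / CHUNK_DURATION_SEC);
  chunksLoader.maybeFetchChunk(chunkIndex); // would need to be made public
}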

Trying it out with MP3

I tested the code above under Chrome by converting the tracks to MP3:

ffmpeg -i input.mp3 -acodec libmp3lame -b:a 256k -f mp3 output.mp3

and creating a SourceBuffer using audio/mpeg as the type:

this.mediaSource.addSourceBuffer('audio/mpeg')

I ran into the same problem when seeking.

The problem even without seeking

The code above has another problem:

After two minutes of playing, the audio playback starts to stutter and stops prematurely. So the audio plays up to a certain point and then stops without any obvious reason.

For whatever reason there is another canplay and another playing event. A few seconds later, the audio simply stops.

[Screenshot: Chrome console output showing the extra canplay and playing events before playback stops]
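
To narrow this down, here is a debugging sketch (not part of the code above) that logs the buffered ranges on every timeupdate; if the appended chunks do not form one contiguous range, a gap would explain playback stopping for no apparent reason:

// Debugging aid: print currentTime together with all buffered ranges.
function logBuffered(audio: HTMLAudioElement, sourceBuffer: SourceBuffer) {
  const ranges: string[] = [];
  for (let i = 0; i < sourceBuffer.buffered.length; i++) {
    ranges.push(
      `[${sourceBuffer.buffered.start(i).toFixed(2)} - ${sourceBuffer.buffered.end(i).toFixed(2)}]`
    );
  }
  console.log(`currentTime=${audio.currentTime.toFixed(2)}s buffered=${ranges.join(' ')}`);
}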
