
When you set up the MediaItem in audio_service, you don't know the duration of the song yet, because just_audio hasn't had a chance to tell you it at that point.

The FAQ says to update the MediaItem like so:

modifiedMediaItem = mediaItem.copyWith(duration: duration);
AudioServiceBackground.setMediaItem(modifiedMediaItem);

But it isn't clear to me how or where to do that. The example app in the GitHub repo sidesteps the problem by providing precomputed durations. (GitHub issue)

How and where do I transfer the duration from just_audio over to audio_service so that it can update its listeners?

I found something that works, so I've added an answer below.


1 Answer


After you've set up your just_audio AudioPlayer, you can listen for changes on its duration stream and make the update there:

_player.durationStream.listen((duration) {
  final songIndex = _player.playbackEvent.currentIndex;
  print('current index: $songIndex, duration: $duration');
  final modifiedMediaItem = mediaItem.copyWith(duration: duration);
  _queue[songIndex] = modifiedMediaItem;
  AudioServiceBackground.setMediaItem(_queue[songIndex]);
  AudioServiceBackground.setQueue(_queue);
});

Notes

  • This goes in your audio_service BackgroundAudioTask class.
  • When I tried using _player.currentIndex directly, I got strange behavior (the first two songs both had index 0 before the index started incrementing) (GitHub issue). That's why I'm using the playback event to get the current index here.
  • For my example, I'm using a List<MediaItem> as the queue. I don't actually need setQueue in that last line, since the UI isn't listening for changes to the queue, but I figure it's good to do anyway.

Fuller code example

Here is my full background_audio_service.dart for reference. It's an adaptation of the example in the documentation:

import 'dart:async';

import 'package:audio_service/audio_service.dart';
import 'package:audio_session/audio_session.dart';
import 'package:just_audio/just_audio.dart';

void audioPlayerTaskEntrypoint() async {
  AudioServiceBackground.run(() => AudioPlayerTask());
}

class AudioPlayerTask extends BackgroundAudioTask {
  AudioPlayer _player = new AudioPlayer();
  AudioProcessingState _skipState;
  StreamSubscription<PlaybackEvent> _eventSubscription;

  List<MediaItem> _queue = [];
  List<MediaItem> get queue => _queue;

  int get index => _player.playbackEvent.currentIndex;
  MediaItem get mediaItem => index == null ? null : queue[index];

  @override
  Future<void> onStart(Map<String, dynamic> params) async {
    _loadMediaItemsIntoQueue(params);
    await _setAudioSession();
    _propagateEventsFromAudioPlayerToAudioServiceClients();
    _performSpecialProcessingForStateTransitions();
    _loadQueue();
  }

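  /// The UI passes the queue in through params['data'] as a list of
  /// MediaItem JSON maps, which are decoded below with MediaItem.fromJson.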
  void _loadMediaItemsIntoQueue(Map<String, dynamic> params) {
    _queue.clear();
    final List mediaItems = params['data'];
    for (var item in mediaItems) {
      final mediaItem = MediaItem.fromJson(item);
      _queue.add(mediaItem);
    }
  }

  Future<void> _setAudioSession() async {
    final session = await AudioSession.instance;
    await session.configure(AudioSessionConfiguration.music());
  }

  void _propagateEventsFromAudioPlayerToAudioServiceClients() {
    _eventSubscription = _player.playbackEventStream.listen((event) {
      _broadcastState();
    });
  }

  void _performSpecialProcessingForStateTransitions() {
    _player.processingStateStream.listen((state) {
      switch (state) {
        case ProcessingState.completed:
          onStop();
          break;
        case ProcessingState.ready:
          _skipState = null;
          break;
        default:
          break;
      }
    });
  }

  Future<void> _loadQueue() async {
    AudioServiceBackground.setQueue(queue);
    try {
      await _player.load(ConcatenatingAudioSource(
        children:
            queue.map((item) => AudioSource.uri(Uri.parse(item.id))).toList(),
      ));
      _player.durationStream.listen((duration) {
        _updateQueueWithCurrentDuration(duration);
      });
      onPlay();
    } catch (e) {
      print('Error: $e');
      onStop();
    }
  }

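  /// Copies the duration reported by just_audio into the current MediaItem
  /// and rebroadcasts it so clients (e.g. a seek bar) get the update.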
  void _updateQueueWithCurrentDuration(Duration duration) {
    final songIndex = _player.playbackEvent.currentIndex;
    print('current index: $songIndex, duration: $duration');
    final modifiedMediaItem = mediaItem.copyWith(duration: duration);
    _queue[songIndex] = modifiedMediaItem;
    AudioServiceBackground.setMediaItem(_queue[songIndex]);
    AudioServiceBackground.setQueue(_queue);
  }

  @override
  Future<void> onSkipToQueueItem(String mediaId) async {
    final newIndex = queue.indexWhere((item) => item.id == mediaId);
    if (newIndex == -1) return;
    _skipState = newIndex > index
        ? AudioProcessingState.skippingToNext
        : AudioProcessingState.skippingToPrevious;
    _player.seek(Duration.zero, index: newIndex);
  }

  @override
  Future<void> onPlay() => _player.play();

  @override
  Future<void> onPause() => _player.pause();

  @override
  Future<void> onSeekTo(Duration position) => _player.seek(position);

  @override
  Future<void> onFastForward() => _seekRelative(fastForwardInterval);

  @override
  Future<void> onRewind() => _seekRelative(-rewindInterval);

  @override
  Future<void> onStop() async {
    await _player.dispose();
    _eventSubscription.cancel();
    await _broadcastState();
    await super.onStop();
  }

  /// Jumps away from the current position by [offset].
  Future<void> _seekRelative(Duration offset) async {
    var newPosition = _player.position + offset;
    if (newPosition < Duration.zero) newPosition = Duration.zero;
    if (newPosition > mediaItem.duration) newPosition = mediaItem.duration;
    await _player.seek(newPosition);
  }

  /// Broadcasts the current state to all clients.
  Future<void> _broadcastState() async {
    await AudioServiceBackground.setState(
      controls: [
        MediaControl.skipToPrevious,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.skipToNext,
      ],
      androidCompactActions: [0, 1, 2],
      processingState: _getProcessingState(),
      playing: _player.playing,
      position: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
    );
  }

  /// Maps just_audio's processing state into audio_service's playing
  /// state. If we are in the middle of a skip, we use [_skipState] instead.
  AudioProcessingState _getProcessingState() {
    if (_skipState != null) return _skipState;
    switch (_player.processingState) {
      case ProcessingState.none:
        return AudioProcessingState.stopped;
      case ProcessingState.loading:
        return AudioProcessingState.connecting;
      case ProcessingState.buffering:
        return AudioProcessingState.buffering;
      case ProcessingState.ready:
        return AudioProcessingState.ready;
      case ProcessingState.completed:
        return AudioProcessingState.completed;
      default:
        throw Exception("Invalid state: ${_player.processingState}");
    }
  }
}
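
For reference, here is a minimal sketch of how the UI side might start this background task and pass the queue in through params['data'], matching the pre-1.0 AudioService.start API used above. The URLs and metadata are placeholders, and it assumes the MediaItem.toJson()/fromJson() pair available in that version of audio_service:

// Illustrative only, e.g. inside an async onPressed handler in the UI layer.
// onStart() above rebuilds each map with MediaItem.fromJson, so the UI
// serializes its MediaItems with toJson() before starting the task.
final mediaItems = [
  MediaItem(
    id: 'https://example.com/song1.mp3', // the id doubles as the audio URL
    album: 'Some album',
    title: 'Song 1',
  ),
  MediaItem(
    id: 'https://example.com/song2.mp3',
    album: 'Some album',
    title: 'Song 2',
  ),
];

await AudioService.start(
  backgroundTaskEntrypoint: audioPlayerTaskEntrypoint,
  params: {'data': mediaItems.map((item) => item.toJson()).toList()},
);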

Then in my state management class, I get a stream from AudioService like this:

Stream<AudioPlayerState> get mediaStateStream =>
    Rx.combineLatest2<Duration, MediaItem, AudioPlayerState>(
        AudioService.positionStream,
        AudioService.currentMediaItemStream,
        (position, mediaItem) => AudioPlayerState(position, mediaItem.duration));
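
Note that Rx.combineLatest2 comes from the rxdart package, so this assumes rxdart is in your pubspec.yaml and the state management file has imports along these lines:

import 'package:audio_service/audio_service.dart';
import 'package:rxdart/rxdart.dart';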

where AudioPlayerState is:

class AudioPlayerState {
  const AudioPlayerState(this.currentTime, this.totalTime);
  final Duration currentTime;
  final Duration totalTime;

  const AudioPlayerState.initial() : this(Duration.zero, Duration.zero);
}

I use a StreamBuilder in my Flutter UI to listen to mediaStateStream and update my audio player seek bar widget.
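
A minimal sketch of that widget, assuming the mediaStateStream getter above is in scope (everything else here is illustrative, not my exact code):

// Seek bar sketch: rebuilds on every position/media-item update and seeks
// through AudioService when the user drags the slider.
StreamBuilder<AudioPlayerState>(
  stream: mediaStateStream,
  initialData: AudioPlayerState.initial(),
  builder: (context, snapshot) {
    final state = snapshot.data;
    // The duration may still be unknown right after a song starts loading.
    final total = state.totalTime?.inMilliseconds ?? 0;
    return Slider(
      min: 0.0,
      max: total.toDouble(),
      value: state.currentTime.inMilliseconds.clamp(0, total).toDouble(),
      onChanged: (value) =>
          AudioService.seekTo(Duration(milliseconds: value.round())),
    );
  },
)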

Answered on 2020-12-07T10:56:04.753