I am trying to build a screen-sharing app on Android using WebRTC. I can share the screen with MediaProjection + WebRTC, but I cannot share system audio. MediaProjection added support for capturing system audio from API 29 (Android 10) via AudioPlaybackCaptureConfiguration. However, when I assign the audio source obtained from my AudioRecord to the peer connection's audio track, the app crashes.
// Obtain a MediaProjection from the result of the screen-capture permission intent.
MediaProjectionManager mediaProjectionManager =
        (MediaProjectionManager) mContext.getApplicationContext()
                .getSystemService(Context.MEDIA_PROJECTION_SERVICE);
MediaProjection sMediaProjection =
        mediaProjectionManager.getMediaProjection(MPResultCode, MPData);

// Capture audio that other apps play with USAGE_MEDIA (API 29+).
AudioPlaybackCaptureConfiguration config =
        new AudioPlaybackCaptureConfiguration.Builder(sMediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .build();

AudioFormat audioFormat = new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(8000)
        .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
        .build();

AudioRecord audioRecord = new AudioRecord.Builder()
        .setAudioFormat(audioFormat)
        .setBufferSizeInBytes(BUFFER_SIZE_IN_BYTES)
        .setAudioPlaybackCaptureConfig(config)
        .build();

// This is the line where the app crashes.
AudioSource audioSource = new AudioSource(audioRecord.getAudioSource());
AudioTrack localAudioTrack = factory.createAudioTrack("AudioTrack", audioSource);
localAudioTrack.setEnabled(true);
mLocalMediaStream.addTrack(localAudioTrack);
Streaming microphone audio works fine if I configure the audio source as follows:
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
How can I configure the WebRTC audio track using an AudioRecord object?
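For context, here is what I suspect and the direction I have been looking at (only a sketch, not something I have gotten to work). As far as I can tell, the constructor of org.webrtc.AudioSource takes a native pointer (a long), so passing audioRecord.getAudioSource(), which is just a MediaRecorder.AudioSource int constant, hands the native layer an invalid handle, which would explain the crash. The stock Android WebRTC library creates its own AudioRecord internally inside org.webrtc.audio.JavaAudioDeviceModule, so the usual way to influence capture seems to be configuring that module when building the PeerConnectionFactory, roughly like below. The builder methods shown do exist in the org.webrtc SDK, but I do not see one that accepts an AudioPlaybackCaptureConfiguration, which is exactly my problem:

// Sketch: let WebRTC own the capture path instead of creating a separate AudioRecord.
// Classes: org.webrtc.PeerConnectionFactory, org.webrtc.audio.JavaAudioDeviceModule,
// android.media.MediaRecorder.
PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(mContext)
                .createInitializationOptions());

// JavaAudioDeviceModule builds the AudioRecord internally; setAudioSource()
// only accepts a MediaRecorder.AudioSource constant, so there is apparently
// no place to attach an AudioPlaybackCaptureConfiguration here.
AudioDeviceModule adm = JavaAudioDeviceModule.builder(mContext)
        .setSampleRate(8000)
        .setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION)
        .createAudioDeviceModule();

PeerConnectionFactory factory = PeerConnectionFactory.builder()
        .setAudioDeviceModule(adm)
        .createPeerConnectionFactory();

// With this factory, creating the source the normal way works,
// but it captures the microphone rather than system audio:
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
AudioTrack localAudioTrack = factory.createAudioTrack("AudioTrack", audioSource);

Is there a supported way to point WebRTC's audio capture at an AudioRecord configured with AudioPlaybackCaptureConfiguration, or does this require modifying the WebRTC audio device module?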