
I would like to create a simple video streaming app using HTML5 features on the client and Node.js on the server (actually, I am not sure this is even possible...). I have some important questions:

  • Is it possible to record the local stream created by the navigator.getUserMedia() API? I have read lots of articles, but everywhere it is only used to set the source of an HTML5 video element.
  • Is it possible to send this stream through websockets? (socket.io, binaryjs,... ?). Otherwise I can only imagine sending frames to a canvas element, and I am not sure that is a good solution.
  • Is there any Node.js module that supports load balancing? It would be interesting to work with different node servers.

Thank you all in advance.


2 Answers


MediaStreamRecorder is a WebRTC API for recording getUserMedia() streams. It allows web applications to create a file from a live audio/video session.

 <video autoplay></video>

    <script type="text/javascript">
    function onVideoFail(e) {
        console.log('webcam fail!', e);
    }

    function hasGetUserMedia() {
      // Note: Opera is unprefixed.
      return !!(navigator.getUserMedia || navigator.webkitGetUserMedia ||
                navigator.mozGetUserMedia || navigator.msGetUserMedia);
    }

    if (hasGetUserMedia()) {
      // Good to go!
    } else {
      alert('getUserMedia() is not supported in your browser');
    }

    window.URL = window.URL || window.webkitURL;
    navigator.getUserMedia = navigator.getUserMedia ||
                             navigator.webkitGetUserMedia ||
                             navigator.mozGetUserMedia ||
                             navigator.msGetUserMedia;

    var video = document.querySelector('video');
    var streamRecorder;
    var webcamstream;

    if (navigator.getUserMedia) {
      navigator.getUserMedia({audio: true, video: true}, function(stream) {
        video.src = window.URL.createObjectURL(stream);
        webcamstream = stream;
    //  streamRecorder = webcamstream.record();
      }, onVideoFail);
    } else {
        alert ('failed');
    }

    function startRecording() {
        streamRecorder = webcamstream.record();
        setTimeout(stopRecording, 10000);
    }
    function stopRecording() {
        streamRecorder.getRecordedData(postVideoToServer);
    }
    function postVideoToServer(videoblob) {

        var data = {};
        data.video = videoblob;
        data.metadata = 'test metadata';
        data.action = "upload_video";
        jQuery.post("http://www.foundthru.co.uk/uploadvideo.php", data, onUploadSuccess);
    }
    function onUploadSuccess() {
        alert ('video uploaded');
    }

    </script>

    <div id="webcamcontrols">
        <button class="recordbutton" onclick="startRecording();">RECORD</button>
    </div>

http://www.w3.org/TR/mediastream-recording/

Update:

Media elements now have a new method for capturing a stream; take a look at the article below:

https://developers.google.com/web/updates/2016/10/capture-stream
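The non-standard `record()` shim used above has since been superseded by the standard MediaRecorder API. Below is a minimal, hedged sketch of the same record-for-N-seconds flow with MediaRecorder; it is browser-only, and the candidate mime types and the one-second timeslice are assumptions:

```javascript
// Pick the first mime type the browser claims to support.
// isSupported is injected so the helper stays testable outside a browser.
function pickSupportedMimeType(candidates, isSupported) {
  return candidates.find(isSupported) || '';
}

// Record the webcam for `ms` milliseconds and resolve with a Blob.
async function recordFor(ms) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  const mimeType = pickSupportedMimeType(
    ['video/webm;codecs=vp9', 'video/webm'],          // assumed candidates
    t => MediaRecorder.isTypeSupported(t)
  );
  const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : undefined);

  const chunks = [];
  recorder.ondataavailable = e => { if (e.data.size > 0) chunks.push(e.data); };

  const done = new Promise(resolve => {
    recorder.onstop = () => resolve(new Blob(chunks, { type: recorder.mimeType }));
  });
  recorder.start(1000);               // emit a chunk roughly every second
  setTimeout(() => recorder.stop(), ms);
  return done;                        // resolves with the recorded Blob
}
```

The resulting Blob can then be POSTed to the server exactly as `postVideoToServer()` does above.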

Answered 2013-05-28T05:36:28.517

You can use RecordRTC to record the video. Here is a demo.
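A hedged sketch of what RecordRTC usage looks like, assuming the library has been loaded on the page (e.g. via a script tag) and the stream comes from getUserMedia(); the duration handling here is an illustration, not RecordRTC's own API:

```javascript
// Record `stream` for durationMs milliseconds, then hand the Blob to onBlob.
// RecordRTC hides the recording details behind a simple start/stop API.
function recordWithRecordRTC(stream, durationMs, onBlob) {
  var recorder = RecordRTC(stream, { type: 'video' });
  recorder.startRecording();
  setTimeout(function () {
    recorder.stopRecording(function () {
      onBlob(recorder.getBlob());     // the recorded video as a Blob
    });
  }, durationMs);
  return recorder;
}
```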

You can use the "MediaSource" API to capture pre-recorded media from a video element: get an ArrayBuffer / Blob / Uint8Array and upload that array in chunks using XMLHttpRequest or other methods.
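On the playback side, received chunks are fed into a video element through a SourceBuffer. Since appendBuffer() is asynchronous, chunks that arrive while the buffer is busy must be queued. A minimal sketch, where the mime string is an assumption:

```javascript
// Wrap a SourceBuffer in a push function that queues chunks while
// the buffer is updating and drains the queue on 'updateend'.
function makeAppendQueue(sourceBuffer) {
  const queue = [];
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0) sourceBuffer.appendBuffer(queue.shift());
  });
  return chunk => {
    if (sourceBuffer.updating || queue.length > 0) queue.push(chunk);
    else sourceBuffer.appendBuffer(chunk);
  };
}

// Attach a MediaSource to a <video> element and resolve with the push function.
function attachMediaSource(videoElement, mime) {
  const mediaSource = new MediaSource();
  videoElement.src = URL.createObjectURL(mediaSource);
  return new Promise(resolve => {
    mediaSource.addEventListener('sourceopen', () => {
      const sb = mediaSource.addSourceBuffer(mime); // e.g. 'video/webm; codecs="vp8,vorbis"'
      resolve(makeAppendQueue(sb));
    });
  });
}
```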

You can send those chunks over WebSocket / Socket.io / Firebase / etc. Live streaming without using the RTCWeb APIs! However, there are many pitfalls: a huge typed array and a lot of data to transfer. Also, these APIs work only with the HTML5 video element, and browser support is limited (Chrome Canary and Firefox).
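Sending one huge typed array in a single frame is exactly the pitfall mentioned above, so a common approach is to slice each recorded Blob into bounded frames before pushing them through the socket. A hedged sketch with a plain WebSocket (the frame size is an assumption; socket.io would work the same way):

```javascript
// Split an ArrayBuffer into frames of at most frameSize bytes.
function sliceIntoFrames(buffer, frameSize) {
  const frames = [];
  for (let off = 0; off < buffer.byteLength; off += frameSize) {
    frames.push(buffer.slice(off, off + frameSize));
  }
  return frames;
}

// Send a recorded Blob (e.g. a MediaRecorder chunk) over an open WebSocket.
function sendBlobOverSocket(socket, blob, frameSize = 64 * 1024) {
  return blob.arrayBuffer().then(buf => {
    for (const frame of sliceIntoFrames(buf, frameSize)) {
      socket.send(frame);
    }
  });
}
```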

The good news is that the MediaSource API starts playing the video as soon as it gets the first chunk; it does not wait for the entire video/data to be downloaded before playback.

Answered 2013-03-03T13:15:00.517