
MSE player

This page explains how to use our open-source MSE Player in your applications to provide low-latency video playback in the browser. The player has long been available as part of our embed.html mechanism.

Why MSE Player?

  1. Uses HTML5 and doesn't require Flash, so it is supported by any client device (browser, smartphone).
  2. Has a number of advantages over WebRTC. WebRTC requires UDP, while MSE works over HTTP, which makes it easier for MSE to pass through corporate firewalls and proxies. In addition, WebRTC and MSE support different codecs: for example, MSE supports the AAC audio codec, while WebRTC does not, so playing back a TV channel is easier with the MSE Player.

You can see the MSE Player in some parts of Flussonic and Watcher UI and can also access it from the browser via the following URL:

http://flussonic-ip/STREAMNAME/embed.html?realtime=true

The mechanism used by Flussonic is described in HTML5 (MSE-LD) Low Latency Playback.

You can use its JavaScript module in your frontend projects. The sources are published on GitHub.


Installation in your app

Step 1.
Run the following command:

npm install --save @flussonic/flussonic-mse-player

Step 2.
Import it into JS:

import FlussonicMsePlayer from '@flussonic/flussonic-mse-player'
...
const player = new FlussonicMsePlayer(element, url, opts)
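
For example, a minimal setup might look like this (the mse_ld WebSocket URL format is described in the sections below; flussonic-ip and stream_name are placeholders):

const element = document.getElementById('player');
const url = 'ws://flussonic-ip/stream_name/mse_ld'; // MSE-LD endpoint of your stream
const player = new FlussonicMsePlayer(element, url, {});
player.play(); // start playback (see Methods below)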

Sample app with webpack and our MSE player

You can find the source code of the MSE Player on GitHub.

The FlussonicMsePlayer class

var player = new FlussonicMsePlayer(element, streamUrl, opts)

Parameters:

  • element — a <video> DOM element
  • streamUrl — the URL of a stream
  • opts — player options.

You can monitor the MSE Player with Sentry by setting the sentryConfig (string) parameter (see the table below).
Player options (opts) include the following settings:

Parameter | Type | Description
progressUpdateTime | integer (seconds) | time period after which the player provides information about playback progress
errorsBeforeStop | integer | number of playback errors the player will process before it stops completely
connectionRetries | integer | number of retries to establish a connection before the player stops
preferHQ | boolean | if set to true, the player automatically selects the highest available quality of the stream
retryMuted | boolean | if set to true, the player tries to restart playback with the sound initially muted
maxBufferDelay | integer | maximum buffer delay. If live playback lags behind real time by more than the specified value, the excess is discarded
sentryConfig | string | DSN from Sentry
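
For illustration, an opts object with these settings might look like the following; the values are examples rather than defaults, and the unit of maxBufferDelay is assumed here to be seconds:

var opts = {
  progressUpdateTime: 1,   // report playback progress every second
  errorsBeforeStop: 5,     // stop after five playback errors
  connectionRetries: 3,    // stop after three failed connection attempts
  preferHQ: true,          // automatically pick the highest available quality
  retryMuted: true,        // retry playback with the sound initially muted
  maxBufferDelay: 2,       // drop buffered data once live lag exceeds this value (assumed seconds)
  sentryConfig: 'https://<key>@sentry.example.com/1'  // your Sentry DSN
};
var player = new FlussonicMsePlayer(element, streamUrl, opts);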

Methods

Method | Description
play() | start playing
stop() | stop playing
setTracks([videoTrackId, audioTrackId]) | set the video and audio tracks for playback
getVideoTracks() | return available video tracks (should be used in the onMediaInfo callback)
getAudioTracks() | return available audio tracks (should be used in the onMediaInfo callback)

Event callbacks

Event | Description
onProgress(currentTime) | triggered every 100 ms while a stream is playing; provides the current playback time
onMediaInfo(metadata) | triggered when the stream metadata is available. The metadata includes general information about the stream such as width, height, information about MBR streams, and so on. After this callback has been triggered, you can use the getVideoTracks()/getAudioTracks() methods
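
The callbacks are assigned as properties on the player instance, as in the complete example below:

player.onProgress = function(currentTime) {
  console.log('playback position:', currentTime);
};
player.onMediaInfo = function(metadata) {
  // Track lists are only reliable once the stream metadata has arrived.
  console.log(player.getVideoTracks(), player.getAudioTracks());
};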

Using multi-bitrate tracks

Let's consider a video stream that has three video tracks: v1(800k), v2(400k), v3(200k) and two audio tracks: a1(32k), a2(16k).

To set default tracks to v2 and a1, add the tracks URL parameter with track numbers:

'ws://flussonic-ip/stream_name/mse_ld?tracks=v2a1'

And then pass this URL to the player constructor.
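
For example (flussonic-ip and stream_name are placeholders):

var url = 'ws://flussonic-ip/stream_name/mse_ld?tracks=v2a1'; // play v2 + a1 by default
var player = new FlussonicMsePlayer(element, url);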

You can get all available video/audio tracks:

  • inside onMediaInfo(metadata) callback, by parsing metadata:
{
  width: ...,
  height: ...,
  streams: [
    {
      track_id: "v1", bitrate: ..., codec: ..., content: "video", fps: ..., ...
    },
    ...
    {
      track_id: "a1", bitrate: ..., codec: ..., content: "audio", ...
    }
  ]
}
  • inside onMediaInfo(metadata) by calling getVideoTracks()/getAudioTracks() methods.

To set the tracks for playback, use the setTracks([videoTrackId, audioTrackId]) method.
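
For the example stream above, switching playback to the 400k video track and the 32k audio track looks like this:

player.setTracks(['v2', 'a1']);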

Complete example

<html>
  <head>
    <style>
      .player-container {
        border: 1px solid black;
      }

      #player {
        position: relative;
        width: 100%;
      }

      .mbr-controls {
        display: none;
      }
    </style>
  </head>
<body>
  <div class="player-container">
    <video id="player"></video>
  </div>
  <div class="mbr-controls">
    <div>
      <label for="videoTracks">video tracks</label>
      <select name="videoTracks" id="videoTracks"></select>
    </div>
    <div>
      <label for="audioTracks">audio tracks</label>
      <select name="audioTracks" id="audioTracks"></select>
    </div>
    <button onclick="window.setTracks()">set tracks</button>
  </div>

  <button onclick="window.player.play()">Play</button>
  <button onclick="window.player.stop()">Stop</button>
  <script type="text/javascript" src="/flu/assets/FlussonicMsePlayer.js"></script>
  <script>
    window.onload = onLoad;

      function onLoad() {

        var element = document.getElementById('player');
        var videoTracksSelect = document.getElementById('videoTracks');
        var audioTracksSelect = document.getElementById('audioTracks');
        var mbrControls = document.querySelector('.mbr-controls');

        var url = (window.location.protocol == "https:" ? "wss:" : "ws:")+ '//'+window.location.host+'/clock/mse_ld';

        window.player = new FlussonicMsePlayer(element, url);

        window.player.onProgress = function(currentTime) {
          console.log(currentTime);
        };

        window.player.onMediaInfo = (rawMetaData) => {
          var videoTracks = window.player.getVideoTracks()
          var audioTracks = window.player.getAudioTracks()
          var videoOptions = videoTracks.map((v) => (
            `<option value="${v['track_id']}">${v['bitrate']} ${v['codec']} ${v['fps']} ${v['width']}x${v['height']}</option>`
          ))

          var audioOptions = audioTracks.map(v => (
            `<option value="${v['track_id']}">${v['bitrate']} ${v['codec']} ${v['lang']}</option>`
          ))

          videoTracksSelect.innerHTML = videoOptions.join('')
          audioTracksSelect.innerHTML = audioOptions.join('')

          mbrControls.style.display = 'block'
        }

        window.setTracks = () => {
          var videoTrackId = videoTracksSelect.options[videoTracksSelect.selectedIndex].value
          var audioTrackId = audioTracksSelect.options[audioTracksSelect.selectedIndex].value

          window.player.setTracks([videoTrackId, audioTrackId])
        }
      }
  </script>
</body>
</html>

Statistics on the MSE Player

When you initialize the player, pass a config object as the third argument:

this.player = new FlussonicMsePlayer(this._videoElement, url, config);

The MSE Player has the onStats option that should be passed in the config parameter. It provides an object containing statistics on the player's buffers and the timestamp of when the statistics were obtained.
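
A minimal sketch of wiring onStats; the exact shape of the statistics object is not specified here, so the handler simply logs it:

var config = {
  onStats: function(stats) {
    // stats describes the player's buffers and the timestamp when they were sampled
    console.log(stats);
  }
};
this.player = new FlussonicMsePlayer(this._videoElement, url, config);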

Adding controls like in a desktop player (Flussonic 20.10)

The MSE Player now supports controls like those found in typical desktop players, such as pause, resume, or unmute. The controls are part of the MediaElement, which can be attached to the player separately after initialization.

Using the attachMedia(element) method, you can attach a <video /> element to the player after it has been initialized. You can also pass the MediaElement through the player parameters, in which case it is attached automatically.
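
A sketch of the deferred-attach flow; constructing the player without a media element first is an assumption made for illustration, while attachMedia(element) itself is the documented method:

// Assumption: the player can be created before the <video> element is known.
var player = new FlussonicMsePlayer(null, url, opts);
// ...later, once the element is available:
player.attachMedia(document.getElementById('player'));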

To control the player via MediaElement, you will need the following events: onMediaAttached, onPause, onResume.

Using the onMediaAttached event, you know exactly when the player is attached to the <video /> element and is ready to start playing.

Here is a usage example:

onMediaAttached: () => {
  element.play()
},

The player listens to the native events of the <video /> element that you pass to the player, such as play/pause/unmute, and can respond to them. Using onPause and onResume, you can respond to pause and resume playback events, for example, to draw interface elements (like a large pause icon).
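
A sketch of reacting to these events; it assumes onPause and onResume are passed with the player options like the other callbacks, and showPauseIcon/hidePauseIcon are hypothetical UI helpers:

var player = new FlussonicMsePlayer(element, url, {
  onPause: function() { showPauseIcon(); },   // hypothetical helper: draw a large pause icon
  onResume: function() { hidePauseIcon(); }   // hypothetical helper: remove it
});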