MSE Player
This page explains how to use our open-source MSE player in your applications to provide low-latency video playback in the browser. The player has long been available through our embed.html mechanism.
Why MSE Player?
- Uses HTML5 and doesn't require Flash, so it works on any client device (browser, smartphone).
- Has several advantages over WebRTC. For example, WebRTC requires UDP, while MSE works over HTTP, which makes it easier for MSE to pass through corporate firewalls and proxies. The two technologies also support different codecs: MSE supports the AAC audio codec, while WebRTC does not, so playing back a TV channel is easier with the MSE Player.
You can see the MSE Player in some parts of Flussonic and Watcher UI and can also access it from the browser via the following URL:
http://flussonic-ip/STREAMNAME/embed.html?realtime=true
The mechanism that Flussonic uses is described in HTML5 (MSE-LD) Low Latency Playback.
You can use its JavaScript module in your frontend projects. The sources are published on GitHub.
On this page:
- Installation in your app
- Using multi-bitrate tracks
- Complete example
- Viewing multiple DVR archives in sync
- Statistics on the MSE Player
- Adding controls as in a desktop player
Installation in your app
Quick start without NPM
Follow these simple steps:
Step 1.
Download the module for the support of the MSE Player at:
http://flussonic-ip:80/flu/assets/FlussonicMsePlayer.js
Step 2.
Add the script to your HTML file:
<script type="text/javascript" src="/flu/assets/FlussonicMsePlayer.js"></script>
Step 3.
Initialize the player and attach it to a <video/> element.
Step 4.
Start playing:
...
<body>
  <video id="player"></video>
  ...
  <script type="text/javascript">
    window.onload = function() {
      var element = document.getElementById('player');
      // WebSocket URL of the stream (mse_ld is the low-latency endpoint)
      var streamUrl = 'ws://flussonic-ip/STREAMNAME/mse_ld';
      window.player = new FlussonicMsePlayer(element, streamUrl);
      window.player.play();
    }
  </script>
</body>
...
Installing with NPM and webpack
Step 1.
Run the following command:
npm install --save @flussonic/flussonic-mse-player
Step 2.
Import it into JS:
import FlussonicMsePlayer from '@flussonic/flussonic-mse-player'
...
const player = new FlussonicMsePlayer(element, url, opts)
Sample app with webpack and our MSE player
You can find the source code of MSE Player on Github.
The FlussonicMsePlayer class
var player = new FlussonicMsePlayer(element, streamUrl, opts)
Parameters:
- element — a <video> DOM element
- streamUrl — the URL of a stream
- opts — player options
You can monitor the MSE Player with Sentry by setting the sentryConfig (string) parameter (see the table below).
Player options (opts) include the following settings:
| Parameter | Type | Description |
|---|---|---|
| progressUpdateTime | integer (seconds) | time period after which the player reports playback progress |
| errorsBeforeStop | integer | number of playback errors the player processes before a complete stop |
| connectionRetries | integer | number of retries to establish a connection before the player stops |
| preferHQ | boolean | if set to true, the player automatically selects the highest available quality of the stream |
| retryMuted | boolean | if set to true, the player tries to restart playback with initially muted sound |
| maxBufferDelay | integer | maximum buffer delay. If live playback lags behind real time by more than the specified value, the excess is discarded |
| sentryConfig | string | DSN from Sentry |
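Taken together, a typical opts object might look like the sketch below. Every value is illustrative, not a default, and the DSN is a placeholder:

```javascript
// Illustrative player options; all values here are examples, not defaults.
var opts = {
  progressUpdateTime: 1,    // report playback progress every 1 second
  errorsBeforeStop: 5,      // stop completely after 5 playback errors
  connectionRetries: 3,     // retry the connection up to 3 times
  preferHQ: true,           // pick the highest available stream quality
  retryMuted: true,         // retry playback with initially muted sound
  maxBufferDelay: 2,        // discard live buffer lagging more than 2 s behind real time
  sentryConfig: 'https://examplekey@sentry.example.com/1' // Sentry DSN (placeholder)
};
// Pass it as the third constructor argument:
// var player = new FlussonicMsePlayer(element, streamUrl, opts);
```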
Methods:

| Method | Description |
|---|---|
| play() | start playing |
| stop() | stop playing |
| setTracks([videoTrackId, audioTrackId]) | set video and audio tracks for playback |
| getVideoTracks() | return available video tracks (use in the onMediaInfo callback) |
| getAudioTracks() | return available audio tracks (use in the onMediaInfo callback) |
Event callbacks

| Event | Description |
|---|---|
| onProgress(currentTime) | triggered every 100 ms while a stream is playing; gives the current playback time |
| onMediaInfo(metadata) | triggered when stream metadata is available. The metadata includes general information about the stream, such as width, height, and information about MBR streams. After this callback is triggered, you can use the getVideoTracks()/getAudioTracks() methods |
Using multi-bitrate tracks
Let's consider a video stream that has three video tracks: v1(800k), v2(400k), v3(200k) and two audio tracks: a1(32k), a2(16k).
To set default tracks to v2 and a1, add the tracks URL parameter with track numbers:
'ws://flussonic-ip/stream_name/mse_ld?tracks=v2a1'
And then pass this URL to the player constructor.
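For instance, the URL with default tracks can be assembled like this; 'flussonic-ip' and 'stream_name' are placeholders for your server and stream:

```javascript
// Build a stream URL selecting video track v2 and audio track a1 by default.
// 'flussonic-ip' and 'stream_name' are placeholders, not real values.
var host = 'flussonic-ip';
var streamName = 'stream_name';
var tracks = 'v2a1'; // second video track + first audio track
var streamUrl = 'ws://' + host + '/' + streamName + '/mse_ld?tracks=' + tracks;
// var player = new FlussonicMsePlayer(element, streamUrl);
```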
You can get all available video/audio tracks:
- inside the onMediaInfo(metadata) callback, by parsing metadata:
{
  width: ...,
  height: ...,
  streams: [
    { track_id: "v1", bitrate: ..., codec: ..., content: "video", fps: ..., ... },
    ...
    { track_id: "a1", bitrate: ..., codec: ..., content: "audio", ... }
  ]
}
- inside onMediaInfo(metadata), by calling the getVideoTracks()/getAudioTracks() methods.
To set tracks for playback, use the setTracks([videoTrackId, audioTrackId]) method.
Complete example
<html>
<head>
<style>
  .player-container {
    border: 1px solid black;
  }
  #player {
    position: relative;
    width: 100%;
  }
  .mbr-controls {
    display: none;
  }
</style>
</head>
<body>
<div class="player-container">
<video id="player"></video>
</div>
<div class="mbr-controls">
<div>
<label for="videoTracks">video tracks</label>
<select name="videoTracks" id="videoTracks"></select>
</div>
<div>
<label for="audioTracks">audio tracks</label>
<select name="audioTracks" id="audioTracks"></select>
</div>
<button onclick="window.setTracks()">set tracks</button>
</div>
<button onclick="window.player.play()">Play</button>
<button onclick="window.player.stop()">Stop</button>
<script type="text/javascript" src="/flu/assets/FlussonicMsePlayer.js"></script>
<script>
window.onload = onLoad;
function onLoad() {
var element = document.getElementById('player');
var videoTracksSelect = document.getElementById('videoTracks');
var audioTracksSelect = document.getElementById('audioTracks');
var mbrControls = document.querySelector('.mbr-controls');
var url = (window.location.protocol == "https:" ? "wss:" : "ws:")+ '//'+window.location.host+'/clock/mse_ld';
window.player = new FlussonicMsePlayer(element, url);
window.player.onProgress = function(currentTime) {
console.log(currentTime);
};
window.player.onMediaInfo = (rawMetaData) => {
var videoTracks = window.player.getVideoTracks()
var audioTracks = window.player.getAudioTracks()
var videoOptions = videoTracks.map((v, i) => (
`<option value="${v['track_id']}">${v['bitrate']} ${v['codec']} ${v['fps']} ${v['width']}x${v['height']}</option>`
))
var audioOptions = audioTracks.map(v => (
`<option value="${v['track_id']}">${v['bitrate']} ${v['codec']} ${v['lang']}</option>`
))
videoTracksSelect.innerHTML = videoOptions.join('')
audioTracksSelect.innerHTML = audioOptions.join('')
mbrControls.style.display = 'block'
}
window.setTracks = () => {
var videoTrackId = videoTracksSelect.options[videoTracksSelect.selectedIndex].value
var audioTrackId = audioTracksSelect.options[audioTracksSelect.selectedIndex].value
window.player.setTracks([videoTrackId, audioTrackId])
}
}
</script>
</body>
</html>
Viewing multiple DVR archives in sync
Flussonic allows viewing several DVR archives at once and navigating all of them in sync.
In order to play multiple DVRs in the Flussonic MSE Player, you need to create a stream that contains several other streams that have DVR. Users will be able to view these streams in mosaic mode and seek in sync by using a single timeline.
When you initialize the player, add the config variable:
this.player = new FlussonicMsePlayer(this._videoElement, url, config);
The config variable is an object that contains the player's configuration. Add the settings of a DVR mosaic to the config object under the streamingChannels key.
In the following example, we create a DVR mosaic but omit stream names, so no names are displayed.
Example of a 3×2 mosaic of DVR archives (without stream names)
streamingChannels: {
// cols: 3, // Number of columns in the mosaic (optional)
// rows: 2, // Number of rows in the mosaic (optional)
streams: [
{
subName: 'camera01', // Stream name, it must match the name in the UI
main: true, // The stream will be selected as the default (optional)
auth_token: 'example', // Authorization token
address: 'example', // Path to another server with Flussonic (optional)
order: 1 // Order of streams from left to right (optional)
},
{
subName: 'camera02',
order: 2
},
{
subName: 'camera03',
order: 3
},
{
subName: 'camera04',
order: 4
},
{
subName: 'camera05',
order: 5
},
]
}
Stream names in a DVR mosaic
Stream names are not displayed in the DVR player (embed.html?dvr=true) by default. However, they are supported in the multi-DVR view, where they make it easier to distinguish one stream from another (for example, when viewing streams from many cameras).
To display the name of each stream in the DVR player in mosaic mode, pass streams in the streamingChannels key and add the optional renderTitles and title settings.
Configuring DVR mosaic with stream names:
streamingChannels: {
renderTitles: true, // Show stream names (optional)
cols: 3, // Number of columns in the mosaic (optional)
rows: 2, // Number of rows in the mosaic (optional)
streams: [
{
subName: 'camera01', // Stream name, it must match the name in the UI
title: 'Door', // The title to be displayed in the player (optional)
auth_token: 'example', // Authorization token
address: 'example', // Path to another server with Flussonic (optional)
main: true, // The stream will be selected as the default (optional)
order: 1 // Order of streams from left to right (optional)
},
{
subName: 'camera02',
title: 'Garage',
order: 2
},
// and so on
]
}
Statistics on the MSE Player
When you initialize the player, add the config variable:
this.player = new FlussonicMsePlayer(this._videoElement, url, config);
The MSE Player has the onStats option that should be passed in the config parameter. It returns an object containing statistics on the player's buffers and the timestamp of when the statistics were collected.
Adding controls like in a desktop player (Flussonic 20.10)
The MSE Player now supports controls like those found in typical desktop players, such as pause, resume, or unmute. The controls are part of the MediaElement, which can be attached to the player separately after initialization.
Using the attachMedia(element) method, you can attach a <video /> element to the player separately, after initializing the player. You can also pass the MediaElement through the player parameters, in which case it is attached automatically.
To control the player via the MediaElement, you will need the following events: onMediaAttached, onPause, onResume.
Using the onMediaAttached event, you know exactly when the player is attached to the <video /> element and is ready to start playing.
Here is a usage example:
onMediaAttached: () => {
element.play()
},
The player listens to the native HTML media events of the <video /> element (through which you pass the player to the web page), such as play/pause/unmute, and can respond to them. Using onPause and onResume, you can respond to pause and resume events. For example, you can draw interface elements (like a large pause icon).