Flussonic is now also an all-in-one headend: it receives streams from various sources (satellite, terrestrial and others), transcodes and records them, and sends MPTS to ATSC-C cable networks.
The “transponder” functionality (a universal transport stream multiplexer) has been available in Flussonic for some time. Two things set it apart from other multiplexers:
- It supports different types of sources, not just MPEG-TS.
- It offers extensive source failover: if the main source becomes unavailable, Flussonic automatically and seamlessly switches to an alternative one. A file can also serve as a redundant source, for example to show entertainment content while the main feed with useful content is unavailable.
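The failover logic above can be sketched as a priority list that falls through to a local file when no live source is reachable. This is a minimal illustration of the decision, not Flussonic's actual implementation; the source URLs are assumptions for the example.

```python
# Sketch of the failover decision: try sources in priority order and
# fall back to a local file when no live source is reachable.
# All URLs below are illustrative.
def pick_source(available: set) -> str:
    priority = [
        "udp://239.0.0.1:1234",        # main satellite/terrestrial feed
        "udp://239.0.0.2:1234",        # alternative live source
        "file://vod/placeholder.mp4",  # entertainment filler file
    ]
    for src in priority:
        if src in available:
            return src
    raise RuntimeError("no source available")

# With only the filler file reachable, the stream keeps playing from it:
print(pick_source({"file://vod/placeholder.mp4"}))
```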
So the transponder assembled an MPTS stream from different types of sources and sent it to an IP (UDP) modulator for subsequent transmission over satellite, cable or terrestrial networks.
In this new release, the transponder functionality is greatly expanded: Flussonic can now replace the modulator when delivering signals to ATSC-C cable networks. With a TBS card installed, Flussonic itself sends the MPTS stream into the ATSC-C cable network. This simplifies setting up linear television in installations with existing cable infrastructure, such as hotels, hospitals, residential complexes and shopping centers. There is no longer any need to install and maintain IRD receivers and modulation devices to receive IP, terrestrial or other channels and distribute them by cable to every building: Flussonic alone is enough.
We continue to improve WebRTC. This release adds support for the WebRTC-HTTP ingest protocol (WHIP) and the WebRTC-HTTP access protocol (WHAP). Unlike the current WebRTC over WebSocket, WHIP allows publishing not only from the browser but also, via plugins, from programs such as OBS (Open Broadcaster Software). WHIP support will also make it possible, in future releases, to add a load balancer for publishing servers to Flussonic, so that streams published via WebRTC are distributed according to the availability and load of each server.
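One reason WHIP is easier to integrate than WebRTC over WebSocket is that the handshake is a plain HTTP POST: the client sends an SDP offer as `application/sdp` and the server answers with an SDP answer and a session URL. A minimal sketch of building such a request (the endpoint path and the truncated SDP are assumptions for illustration, not Flussonic's documented URL scheme):

```python
# Sketch of a WHIP publish request: an SDP offer POSTed as
# application/sdp. The endpoint URL is hypothetical.
from urllib.request import Request

sdp_offer = "v=0\r\no=- 0 0 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\n"  # truncated offer

req = Request(
    url="https://flussonic.example.com/streamname/whip",  # hypothetical endpoint
    data=sdp_offer.encode(),
    headers={"Content-Type": "application/sdp"},
    method="POST",
)

print(req.get_method())                 # POST
print(req.get_header("Content-type"))   # application/sdp
```

A real client would then read the SDP answer from the response body and use the `Location` header to manage the session.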
A detailed article about how and why to use transcoder redundancy has been added to the documentation. (Using the ‘cluster_ingest’ option for transcoding ensures that if one transcoder fails, its streams will be recaptured by other transcoders).
The size of the Flussonic web interface has been reduced from about 11 MB to 3.5 MB. Now it loads faster.
A few months ago, we published an API specification for Flussonic Media Server that you can use to integrate with it: create and stop streams, configure the transcoder, DVR, and so on. That was the admin part of the API, covering how to manage the media server itself.
In this release, we have published a specification for developing a media player that interacts with Flussonic (or integrating an existing one): the Streaming API scheme for playing and publishing video. The Streaming API is aimed at those who develop web interfaces and mobile applications. Such services can now integrate players with the media server more deeply and much faster, thanks to automatic link generation and the ability to use code generation.
The specification describes the required fields in Flussonic's responses to the player. For example, if a player wants to request information about an archive, the specification shows which endpoint to call and which fields to expect, with an example provided. All available fields and features of Flussonic are thus immediately visible to developers, with no need to search the documentation manually and check whether the parameters are still current.
OpenAPI also lets you develop your application without a real Flussonic instance, for example by generating a mock server or client stubs from the specification.
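A minimal sketch of that workflow: load the OpenAPI document and serve the documented example responses as a stand-in for the real server. The path and example payload below are assumptions for illustration, not the actual contents of the Flussonic spec.

```python
# Tiny stub of "develop without a real server": answer each call with
# the example response documented in the OpenAPI spec.
# The spec fragment below is hypothetical.
spec = {
    "paths": {
        "/streams": {
            "get": {
                "responses": {
                    "200": {"example": {"streams": [{"name": "demo"}]}}
                }
            }
        }
    }
}

def mock_call(path: str, method: str = "get") -> dict:
    """Return the documented example for a path, as a mock server would."""
    operation = spec["paths"][path][method]
    return operation["responses"]["200"]["example"]

print(mock_call("/streams"))  # {'streams': [{'name': 'demo'}]}
```

In practice the same idea is provided off the shelf by OpenAPI mock-server tooling; the point is that the application code can be written and tested against the spec alone.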
The WebRTC player can now capture the screen and play screencasts.
For deinterlacing, Flussonic Coder can now use the CUDA-based yadif method. That is, in addition to Jetson's transcoding capacity, we now also use CUDA cores to run the yadif filter. As a result, video quality after deinterlacing has improved, especially for dynamic scenes.
A client mosaic has been added to the DVR player. In multi-window mode you can watch both live video and the archive from multiple cameras at once, without opening and rewinding each camera individually. A single timeline rewinds several video streams simultaneously, which is convenient for reviewing incidents at a single facility (a yard, office, factory, etc.).
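The single-timeline idea can be sketched as one shared UTC cursor translated into a per-camera position inside each camera's own archive. Camera names and start times below are illustrative, not part of Flussonic's API.

```python
# Sketch of a shared mosaic timeline: one UTC cursor is mapped to a
# per-stream offset inside each camera's archive. Values are illustrative.
archives = {
    "yard_cam":   1650000000,  # UTC second at which each camera's DVR begins
    "office_cam": 1650000300,
}

def seek_positions(utc_cursor: int) -> dict:
    """Translate the shared timeline cursor into per-camera archive offsets."""
    return {name: utc_cursor - start for name, start in archives.items()}

# Dragging the single timeline to one UTC moment seeks every window at once:
print(seek_positions(1650003600))  # {'yard_cam': 3600, 'office_cam': 3300}
```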