How to pass teletext from SDI to HLS/DASH¶
Teletext¶
Teletext is additional data (usually text, but it may also include small images) transmitted within a TV signal, for example from a satellite. The satellite video reaches Flussonic Media Server either via SDI capture cards or directly in MPEG-TS format. When capturing from an SDI card, Flussonic detects teletext in the SDI input and automatically packs it into the MPEG-TS stream without changes, so further teletext processing (conversion to WebVTT or TTML) is carried out in the same way for both sources.
You may also find the article about packing teletext for transmission in MPTS/SPTS useful.
Reading teletext from SDI¶
Flussonic reads various formats of teletext from streams captured by an SDI board. Supported teletext formats for SDI inputs:
| Card | VBI (SD SDI) | OP-47 (HD SDI) |
|---|---|---|
| DekTec | ✅ | ✅ |
| Decklink | ✅ | ✅ |
| Stream Labs | ✅ | ❌ |
| Magewell | ❌ | ❌ |
| AJA | ❌ | ❌ |
VBI (vertical blanking interval) data is located in the invisible area of the frame and is transmitted during the vertical retrace of the beam (i.e., the time it takes the electron beam to return to the top of the screen in CRT TVs). With the advent of HD SDI, VBI has been superseded by VANC, which allows more data to be transmitted in a single line.
OP-47 is a teletext specification for HD SDI that allows more stable and uninterrupted transmission of teletext with high-definition streams. It allows even more data to be transmitted in one line than VANC.
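To illustrate how SDI capture fits into the configuration, here is a minimal sketch. It assumes a Decklink card on port 0 and the decklink:// input scheme from Flussonic's SDI capture documentation; consult that documentation for the exact options of your board:
stream sdi_example {
  # Hypothetical capture from the first Decklink port.
  # Teletext detected in the SDI signal is packed into the MPEG-TS stream automatically.
  input decklink://0;
}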
Passing teletext to HLS and DASH¶
Flussonic can pass DVB teletext from MPEG-TS sources to HLS and DASH.
No additional options are needed to configure the conversion: if there is teletext in the input stream, Flussonic automatically converts it to WebVTT and TTML.
Configuration example (no additional options):
stream example_stream1 {
input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts;
}
To check for dvb_teletext in the input stream, you can run the following command:
ffprobe http://EXAMPLE-IP/STREAM_NAME/mpegts
Stream #0:0[0x447]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(tv, bt470bg, top first), 704x576 [SAR 16:11 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0xc12](eng): Audio: mp2 ([4][0][0][0] / 0x0004), 48000 Hz, stereo, fltp, 192 kb/s
Stream #0:2[0x17e2](swe,nor,dan,fin): Subtitle: dvb_teletext ([6][0][0][0] / 0x0006)
After the conversion, the output stream contains the teletext converted to WebVTT and TTML subtitles.
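To double-check the result, you can request the HLS playlist of the output stream and look for an #EXT-X-MEDIA:TYPE=SUBTITLES entry referencing the WebVTT track. The command below reuses example_stream1 from the configuration above; FLUSSONIC-IP is a placeholder for your server address:
curl http://FLUSSONIC-IP/example_stream1/index.m3u8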
Passing closed captions in HLS and DASH¶
Signaling CEA-608/708 caption service metadata¶
For HLS and DASH players to display closed captions, their presence must be explicitly signaled in the so-called manifest files, or simply manifests. These files contain the URLs and names of the streams as well as adaptive bitrate information, closed caption metadata, etc.
Flussonic can signal the presence of closed captions in the stream.
To enable this feature, add the following parameter to the configuration file (/etc/flussonic/flussonic.conf) next to the stream's URL:
cc.{608|708}.{INSTREAM-ID}.{lang|name}=VALUE
Parameters:
- 608/708 - the closed captions standard. The value is either 608 or 708, for CEA-608 and CEA-708 respectively.
- INSTREAM-ID - the number of the channel that carries closed captions. It is an integer between 1 and 4 for CEA-608 and between 1 and 63 for CEA-708.
- lang - the language of the closed captions.
- name - the name under which the closed captions track is displayed in the player.
- VALUE - the value itself. Its format depends on whether lang or name is used.
For example, let's specify English as the language of our closed captions. Then the configuration will look as follows:
- with the lang option: cc.708.1.lang=eng (an ISO 639-2/B language code)
- with the name option: cc.708.1.name=English (the track name that will be displayed in the player's closed captions menu)
Usage example:
stream example_stream2 {
input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.708.12.lang=fr cc.608.1.lang=eng;
}
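If you also want to control the label shown in the player, the name variant of the same parameter can be used. A hypothetical configuration combining both forms (based on the parameter syntax above; the stream name example_stream2a and the values are placeholders) could look like this:
stream example_stream2a {
  # CEA-708 service 1 labelled "English" in the player menu,
  # CEA-608 channel 1 declared as English by its ISO 639-2/B code.
  input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.708.1.name=English cc.608.1.lang=eng;
}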
In the example above there are two closed caption tracks:
- channel 12 carries CEA-708 closed captions in French
- channel 1 carries CEA-608 closed captions in English
You can verify the presence of closed captions in the DASH and HLS manifests.
- For DASH, closed captions are signaled with the Accessibility tag in the manifest file. Run the curl command to download it first:
curl http://FLUSSONIC-IP/example_stream2/index.mpd
Based on the previous example of the stream example_stream2, you can check that closed captions are present:
<Accessibility schemeIdUri="urn:scte:dash:cc:cea-608:2015" value="CC1=eng;CC1=eng"/>
<Accessibility schemeIdUri="urn:scte:dash:cc:cea-708:2015" value="12=lang:fr;12=lang:fr"/>
- The same steps apply for HLS, except that the playlist format is slightly different:
curl http://FLUSSONIC-IP/example_stream2/index.m3u8
#EXT-X-MEDIA:TYPE=CLOSED-CAPTIONS,GROUP-ID="v1cc",LANGUAGE="fr",NAME="fr12",INSTREAM-ID="SERVICE12"
#EXT-X-MEDIA:TYPE=CLOSED-CAPTIONS,GROUP-ID="v1cc",LANGUAGE="eng",NAME="eng1",INSTREAM-ID="CC1",AUTOSELECT=YES,DEFAULT=YES
Extracting and converting closed captions¶
Flussonic can extract CEA-608 closed captions from the input stream and convert them:
- to WebVTT for HLS
- to WebVTT and TTML for DASH
After transcoding a stream that has embedded closed captions, Flussonic keeps the closed captions in the output stream.
For Flussonic to perform the extraction and conversion, add the cc.extract option to the stream's URL.
For MPEG-TS streams:
stream example_stream3 {
input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.extract;
}
The cc.extract option is available for MPEG-TS sources.
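If the stream is also transcoded, the embedded closed captions are preserved in the transcoder output. The sketch below assumes Flussonic's transcoder directive with its vb/ab bitrate options; the stream name example_stream4 and the bitrates are placeholders:
stream example_stream4 {
  # Extract CEA-608 captions from the MPEG-TS source and transcode the video/audio;
  # the captions are kept in the transcoded output.
  input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.extract;
  transcoder vb=2048k ab=128k;
}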
Subtitles positioning¶
To position subtitles on the video, set the parameter substyle valign=top|middle|bottom align=left|center|right, for example:
stream example_stream5 {
input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.extract;
substyle valign=top align=left;
}
Subtitle settings can also be configured on the Output tab of the stream settings in the Flussonic UI.
After the conversion, the output stream contains the closed captions as subtitles.
Choosing subtitles format for DASH playback¶
Since both subtitle formats are included in the DASH manifest, you can choose one of them when playing the output stream:
https://FLUSSONIC-IP/STREAM_NAME/index.mpd?text=wvtt
or
https://FLUSSONIC-IP/STREAM_NAME/index.mpd?text=ttml
(TTML is the default format)
Passing subtitles to MSS¶
Flussonic passes subtitles of any type (subtitles, closed captions, or teletext) to output MSS streams in the TTML format. No special configuration is required; the only requirement is that the incoming stream must have TTML subtitles.
You can configure the position of subtitles by adding substyle valign=top|middle|bottom align=left|center|right to the incoming stream settings:
stream example_stream6 {
input tshttp://EXAMPLE-IP/STREAM_NAME/mpegts cc.extract;
substyle valign=top align=left;
}
About TTML subtitles¶
TTML (Timed Text Markup Language) is a standard for closed captioning and subtitling that is widely supported by media players, streaming platforms and other software, and also used in the television industry. The TTML standard offers rich features for positioning, alignment, styling, multiple languages, and so on. TTML subtitles are passed as an XML-based text file with the .ttml or .xml file extension.
Flussonic passes TTML subtitles to MSS and DASH streams.
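For reference, a minimal TTML document (an illustrative sketch of the format, not actual Flussonic output) looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<!-- One English subtitle cue displayed from second 1 to second 3. -->
<tt xmlns="http://www.w3.org/ns/ttml" xml:lang="en">
  <body>
    <div>
      <p begin="00:00:01.000" end="00:00:03.000">Hello, world!</p>
    </div>
  </body>
</tt>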