William edited this page Dec 5, 2018 · 26 revisions

Introduction

This application is intended to serve as a reliable and scalable OTT streaming repackager to deliver content as part of an overall media streaming platform. There are two key variations of OTT streaming technologies that this software accommodates:

  1. HLS (HTTP Live Streaming) - most notably developed by Apple and very widely supported
  2. DASH (Dynamic Adaptive Streaming over HTTP) - developed more traditionally by a consortium

HLS is probably the most widely used variation and is supported by an extremely large ecosystem of devices. It works in browsers, set-top boxes, phones/tablets, etc. DASH is also used but has a much more limited deployment footprint. It is also much more feature rich, which makes comprehensive player interoperability difficult to achieve. DASH is still not at the same level of maturity as HLS, but it is slowly gaining ground.
HLS started out as a transport-stream-based format that leveraged the traditional MPEG broadcast standards. This made a lot of sense when Apple first deployed the model, since a lot of content was already packaged in transport streams, analysis tools were readily available, and the format is very well understood by a large audience. It has been in use for a very long time! DASH has taken a slightly different approach by adopting the fragmented MP4 file format, which I believe was originally developed by Apple (DASH does specify that you can use transport streams, but I don't think anyone actually does?). Having two different standards can really complicate deployment models, especially when some devices support one type and other devices support the other. It can get even more complicated when DRM and encryption are involved. Over the course of the last year or so, Apple has also announced support for the fragmented MP4 file format as part of its HLS specification. This should hopefully simplify things.

The most widely supported combination of protocols and codecs is transport-stream-based HLS with the H.264 video codec and the AAC audio codec. There are some devices supporting the newer HEVC video codec (mostly targeting 4K video!) and some devices that support the AC3 audio codec as well. VP9 and AV1 are also excellent alternative video codecs, but support for them across end-user devices is hit or miss. HLS with H.264 video and AAC audio is the best approach to reach the largest audience and should be your primary focus when deploying a service. Additional modes using DASH with other codecs can be included as supplemental media streams to target different devices and/or operational models, but should only be included if needed.
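To make the baseline concrete, here is a sketch of what an HLS master playlist advertising H.264/AAC variant streams could look like. The bandwidths, resolutions, and playlist file names are hypothetical; the CODECS strings correspond to H.264 High/Main profile and AAC-LC.

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
video0.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
video1.m3u8
```

Players use the BANDWIDTH and CODECS attributes to pick a variant they can decode and sustain on the current connection.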

With this application, you can ingest *live* MPEG2 transport streams (containing H264/HEVC video and AAC audio) carried over UDP (Multicast or Unicast) for repackaging into HTTP Live Streaming (HLS) (both TS and MP4) and DASH output container formats. The application serves only as a repackaging solution and not as a full origin server or transcoder (at least for now!).

An OTT streaming platform typically has four key system components:
**1. Encoder/Transcoder (ffmpeg or commercial encoding solution)**
The encoder/transcoder converts the input stream into a set of output streams compatible with HLS: H.264 video and AAC audio carried in transport streams.
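As a sketch of this step, ffmpeg can encode to H.264/AAC and emit an MPEG transport stream over UDP that a packager can ingest. The input file name, bitrates, and destination address below are placeholders, not a tested production configuration:

```shell
# Hypothetical example: encode a source to H.264/AAC and send it as an
# MPEG-TS over UDP. Replace input.ts, the bitrates, and the destination
# address/port with values matching your deployment.
ffmpeg -re -i input.ts \
    -c:v libx264 -b:v 2500k -g 150 \
    -c:a aac -b:a 128k \
    -f mpegts "udp://127.0.0.1:4000?pkt_size=1316"
```

The `pkt_size=1316` option keeps each UDP datagram at seven 188-byte TS packets, which is the conventional packing for TS over UDP.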

**2. Packager (fillet - this application)**
The packager will be responsible for fragmenting/segmenting the source stream into smaller time/frame aligned chunks of content that can easily be served through a HTTP based web server for delivery to end devices.

**3. Origin Server (apache or nginx)**
This server is responsible for caching all of the content that the packager produces and makes it readily available for the edge. Archiving of content can be done here along with advanced PVR capabilities.

**4. Edge Server (apache or nginx)**
This server is responsible for caching and delivering all of the content to the end devices/players. Ad insertion is typically done here since it can be done on a per device/user level if needed. It is also possible to do ad insertion upstream as well, but it'll be less targeted/customized.

If you require an origin server for your deployment, examples of basic integration with the Nginx and Apache web servers will be provided, but full configuration of those servers is outside the scope of these instructions (at least for now). HLS and DASH are both HTTP based, and it is recommended that your service be deployed over HTTPS for best security. ffmpeg is a free and widely used encoder/transcoder that is more than capable of producing high quality streams for use with fillet. In addition, ffmpeg has some packaging capabilities built in and can be used on its own for HLS based streaming/segmenting.
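For reference, a minimal nginx server block for serving the packaged output over plain HTTP might look like the following. This is a sketch under assumptions: the paths match the `/var/www/html/hls` directory used elsewhere in this page, and a production deployment would add HTTPS, caching, and CORS configuration.

```nginx
server {
    listen 80;
    server_name example.com;   # placeholder hostname

    location /hls/ {
        root /var/www/html;    # serves /var/www/html/hls/...
        # MIME types for HLS playlists and TS segments
        types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
        }
        # Playlists change constantly in live streaming, so avoid
        # caching them; segments could be given a longer lifetime.
        add_header Cache-Control no-cache;
    }
}
```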

Current Status Update (as of 11/29/2018)

The fillet application is intended to package live content for HLS(TS), HLS(fMP4) and DASH(fMP4); however, support for all three combinations is not entirely finished. The most complete packaging implementation is the standard HLS(TS). I am still adding features and debugging the fragmented MP4 output for the HLS and DASH modes. I have enabled all packaging modes by default in the application, but will soon expose them as separate configuration options. You could then run several instances of the fillet application with different parameters and output stream combinations (i.e., have a mobile stream group and a set-top box stream group with some streams overlapping). If you do run multiple instances from the same source content, you will want to receive the streams from a multicast source instead of unicast.

The HLS(TS) mode is robust and is running in some production environments today. The fragmented MP4 output modes are not fully ready for production, but feel free to experiment and provide feedback. The HLS(TS) output mode is very reliable and can handle stream discontinuities quite well. I started developing this packager because there wasn't anything budget-friendly available that would reliably work in a production environment. A lot of people I work with don't have million dollar budgets. I am also hoping to make this as simple as possible to set up, use and deploy. I will work on statistics collection/reporting once I get closer to an official release.

Finally, the HLS(TS) mode that is implemented in the fillet application writes out separate video and audio transport stream files and signals them in separate manifests. I implemented the packaging of the audio and video in a file rotation so that the older files are replaced by newer files (this is so the drive doesn't fill up and doesn't have to rely on an outside script to purge old files). The rollover point is controlled in the code and is set to 128 files. One of the additional features of the fillet application is its resiliency to source stream discontinuities and/or signal interruptions. I take advantage of the #EXT-X-DISCONTINUITY flag in the manifest to deal with these types of situations so that the player doesn't have to restart or get into a strange state. I am also caching some of the context data in /var/tmp with a filename of hlsmux_state_#####, where ##### is the instance identity as you will see in the usage guidelines. This is so that if you stop/restart the application, it will come back up right where it left off. If you want to restart the sequence numbering, you can delete this state file in /var/tmp when the application is not running.
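The state-file reset described above can be scripted. Here is a small sketch that removes the cached state for one instance; the identity value 1000 is just an example and must match the `--identity` value you launch fillet with, and fillet must not be running when you do this:

```shell
#!/bin/sh
# Remove the cached hlsmux state for one fillet instance so that
# sequence numbering restarts on the next run. Run this only while
# the application is stopped.
IDENTITY=1000                                   # example identity
STATE_FILE="/var/tmp/hlsmux_state_${IDENTITY}"

if [ -f "$STATE_FILE" ]; then
    rm "$STATE_FILE"
    echo "removed $STATE_FILE"
else
    echo "no state file at $STATE_FILE"
fi
```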

An example of the manifest files that are written is shown below:
cannonbeach@insanitywave:$:/var/www/html/hls# more video0.m3u8
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-MEDIA-SEQUENCE:14740
#EXT-X-TARGETDURATION:5
#EXTINF:5.01,
video_stream0_20.ts
#EXTINF:5.01,
video_stream0_21.ts
#EXTINF:4.99,
video_stream0_22.ts
#EXT-X-DISCONTINUITY
#EXTINF:5.01,
video_stream0_23.ts
#EXTINF:5.01,
video_stream0_24.ts
#EXTINF:4.99,
video_stream0_25.ts

cannonbeach@insanitywave:$:/var/www/html/hls# more audio0_substream0.m3u8
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-MEDIA-SEQUENCE:14740
#EXT-X-TARGETDURATION:5
#EXTINF:5.01,
audio_stream0_substream_0_20.ts
#EXTINF:5.01,
audio_stream0_substream_0_21.ts
#EXTINF:4.99,
audio_stream0_substream_0_22.ts
#EXT-X-DISCONTINUITY
#EXTINF:5.01,
audio_stream0_substream_0_23.ts
#EXTINF:5.01,
audio_stream0_substream_0_24.ts
#EXTINF:4.99,
audio_stream0_substream_0_25.ts

At the present time, only command line execution is available.

Installation

The software install guide here is for Ubuntu 16.04 server only; however, you can run this on older/newer versions of Ubuntu as well as in Docker containers for AWS/Google cloud based deployments.
cannonbeach@insanitywave:$ sudo apt install git
cannonbeach@insanitywave:$ sudo apt install build-essential
cannonbeach@insanitywave:$ sudo apt install libz-dev
cannonbeach@insanitywave:$ git clone https://github.com/cannonbeach/ott-packager.git
cannonbeach@insanitywave:$ cd ott-packager
cannonbeach@insanitywave:$ make

The above steps will compile the application (it is named "fillet"). Please ensure that you already have a basic development environment setup.

The fillet application must be run as a user with root privileges; otherwise it will not work.

usage: fillet [options]

   --sources    [NUMBER OF ABR SOURCES - MUST BE >= 1 && <= 10]
   --ip         [IP:PORT,IP:PORT,etc.] (Please make sure this matches the number of sources)
   --interface  [SOURCE INTERFACE - lo,eth0,eth1,eth2,eth3]
                If multicast, make sure route is in place (see note below)
   --window     [WINDOW IN SEGMENTS FOR MANIFEST]
   --segment    [SEGMENT LENGTH IN SECONDS]
   --manifest   [MANIFEST DIRECTORY "/var/www/html/hls/"]
   --identity   [RUNTIME IDENTITY - any number, but must be unique across multiple instances of fillet]
   --background [RUN IN BACKGROUND - NOT YET FULLY SUPPORTED]

Example:

cannonbeach@insanitywave:$ sudo ./fillet --sources 2 --ip 127.0.0.1:4000,127.0.0.1:4200 --interface lo --window 5 --segment 5 --manifest /var/www/html/hls --identity 1000

This command line tells the application that there are two unicast sources that contain audio and video on the loopback interface. The manifests and output files will be placed into the /var/www/html/hls directory. If you are using multicast, please make sure you have multicast routes in place on the interface you are using, otherwise you will not receive the traffic.

cannonbeach@insanitywave:$ sudo route add -net 224.0.0.0 netmask 240.0.0.0 dev eth0
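Putting the route and the packager together, a hypothetical invocation receiving two multicast sources on eth0 would look like the following (the addresses and ports are taken from the tcpdump example later on this page, and the identity 2000 is just an arbitrary unique value):

```shell
sudo ./fillet --sources 2 --ip 239.192.1.1:4000,239.192.1.1:4200 \
     --interface eth0 --window 5 --segment 5 \
     --manifest /var/www/html/hls --identity 2000
```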

Troubleshooting

The following methods are useful to diagnose issues:

  1. Check for streaming network traffic
    You can use tcpdump (or wireshark) to see if content is coming in on the network.
    To see all of the UDP traffic on a specific network interface:

cannonbeach@insanitywave:$ sudo tcpdump -n udp -i eth0
17:45:26.293482 IP 10.0.0.5.46439 > 239.192.1.1.4400: UDP, length 1316
17:45:26.293987 IP 10.0.0.5.44706 > 239.192.1.1.4200: UDP, length 1316
17:45:26.294350 IP 10.0.0.5.44577 > 239.192.1.1.4000: UDP, length 1316
17:45:26.294738 IP 10.0.0.5.46439 > 239.192.1.1.4400: UDP, length 1316
If you are receiving traffic, tcpdump will display information about each packet on the interface. Press Ctrl-C to stop the capture.

  2. Multicast reception issues

Make sure a multicast route is in place (and add one if it is missing):
cannonbeach@insanitywave:$ sudo route add -net 224.0.0.0 netmask 240.0.0.0 dev eth0

Check to make sure your multicast traffic is not being rejected by the kernel:
In /etc/sysctl.conf, there are two entries that control reverse-path filtering. Depending on how your network is set up, you may have to disable reverse-path filtering. Older variations of Linux had it enabled by default, but it can cause issues with multicast coming from a different subnet.

net.ipv4.conf.default.rp_filter=0
net.ipv4.conf.all.rp_filter=0

You can then run:
cannonbeach@insanitywave:$ sudo sysctl -p
to reread the /etc/sysctl.conf file.

Features TODO List

MP4 output mode (HLS and DASH) - still finishing DASH interop
- DASH streaming now works on Roku, but I am still having trouble with Chrome and Edge using the DASHIF player. If anyone has some ideas on what the problem might be, that would be helpful. It appears to be related to timing (maybe the availabilityStartTime or publish time?).

HEVC repackaging
Background mode (run as Linux service)
REST API
Apache/Nginx integration guide (this is just mostly documentation)
Widevine integration
AC3 audio (combo AC3 and AAC)
WebVTT caption output
TTML caption output
Audio only mode
WebDAV publishing
Archiving mode
Docker container deployment instructions
Ad-insert
Blackout
Dynamic segment sizes
Restart on reboot
Stats collection/monitoring

I could also really use some test content with SCTE35. If anyone has any, please send me an email.

Commercial Usage

The application is free to use for commercial and/or private deployments.

I do offer fee based consulting so please send me an email if you are interested in retaining me for any support issues or feature development. I have several support models available and can provide more details upon request. You can reach me at: cannonbeachgoonie@gmail.com
