OSMF HLS Plugin

Having been involved in the Together project, I was assigned the task of enabling Apple HLS video playback on the Flash platform. Delivering video content in a single format (HLS in this case) greatly simplifies the pipeline and offers many benefits. For video processing, Flash has the open-source OSMF framework, which can easily be extended with plugins. But there is one problem: the framework is completely HLS-agnostic. Adobe promoted RTMP first, and only later offered HTTP Dynamic Streaming (HDS) as an alternative to Apple HLS. In this post, we'll cover a free HLS plugin that we developed to run HLS in OSMF-based video players.

Our HLS OSMF plugin has been published on GitHub. Let’s examine it more closely.

The plugin is based on Matthew's HLS plugin (alas, the download link is broken, but a fork is available on GitHub). I revised it to enable correct multi-bitrate streaming and added support for DVR streaming. The video stream processing part was fine, so I left it intact: it extracts H.264 video from the MPEG-TS stream and plays it back via NetStream appendBytes. The part responsible for handling m3u8 playlists, however, has been completely rewritten. The plugin mechanism is identical to HDS (a video stream manager plus an index file manager).
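
For readers unfamiliar with appendBytes: a NetStream can be fed raw FLV-wrapped bytes instead of a URL. Below is a minimal sketch of this "data generation mode"; flvSegmentBytes is a placeholder for the FLV data that the plugin produces from a TS chunk, and the code assumes it runs inside a Sprite:

import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;

var nc:NetConnection = new NetConnection();
nc.connect(null);                 // a null connection puts NetStream into data generation mode
var ns:NetStream = new NetStream(nc);
ns.client = { onMetaData: function(info:Object):void {} }; // stub metadata handler
ns.play(null);                    // tell the stream to expect appended bytes

var video:Video = new Video();
video.attachNetStream(ns);
addChild(video);

ns.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
ns.appendBytes(flvSegmentBytes);  // placeholder: FLV-remuxed H.264/AAC from a downloaded TS chunk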

First, let’s go through some examples of using the plugin.

Usage:
1. In the StrobeMediaPlayback video player:
– Connect HLSDynamicPlugin.swf as you would any other plugin. In the flashvars variable, enter:

flashvars = {
  …,
  hls_plugin: "url/to/HLSDynamicPlugin.swf"
}

– Statically attach HLSPlugin.swc to the StrobeMediaPlayback project, and in the

onChromeProviderComplete(event:Event)

function, after the line

factory = injector.getInstance(MediaFactory);

add the following code:

factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onLoadPlugin);
factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
factory.loadPlugin(new PluginInfoResource(new HLSPluginInfo()));

2. For your own OSMF player, you have two options:
– Statically attach HLSPlugin.swc to the project and load it using DefaultMediaFactory:

private function initPlayer():void {
  // onLoadPlugin and onError are assumed to be handler methods of the surrounding class
  var factory:DefaultMediaFactory = new DefaultMediaFactory();
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onLoadPlugin);
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
  factory.loadPlugin(new PluginInfoResource(new HLSPluginInfo()));
  var res:URLResource = new URLResource(HLS_VIDEO);
  var element:MediaElement = factory.createMediaElement(res);
  if (element == null) throw new Error('Unsupported media type!');
  // MediaPlayer drives the element; playback starts automatically (autoPlay defaults to true)
  var player:MediaPlayer = new MediaPlayer(element);
  var container:MediaContainer = new MediaContainer();
  container.addMediaElement(element);
  container.scaleX = .75;
  container.scaleY = .75;
  addChild(container);
}

– Use HLSDynamicPlugin.swf and load it dynamically using DefaultMediaFactory:

private function initPlayer():void {
  var factory:DefaultMediaFactory = new DefaultMediaFactory();
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onComplete);
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
  factory.loadPlugin(new URLResource(URL_TO_PLUGIN));
  function onComplete(e:MediaFactoryEvent):void {
    var res:URLResource = new URLResource(HLS_VIDEO);
    var element:MediaElement = factory.createMediaElement(res);
    if (element == null) throw new Error('Unsupported media type!');
    var player:MediaPlayer = new MediaPlayer(element);
    var container:MediaContainer = new MediaContainer();
    container.addMediaElement(element);
    container.scaleX = .75;
    container.scaleY = .75;
    addChild(container);
  }
  function onError(e:MediaFactoryEvent):void {
    trace("plugin load error!");
  }
}

Now, all you need to do is pass the player a link to your video stream (the m3u8 playlist). Enjoy your video!
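
With StrobeMediaPlayback, for instance, the stream URL goes into the src flashvar next to the plugin URL (the addresses below are placeholders):

flashvars = {
  src: "http://example.com/playlist.m3u8",
  hls_plugin: "url/to/HLSDynamicPlugin.swf"
}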

Now, let’s consider the technical features of the plugin.

What do we have "under the hood"?

First, let’s read the HLS Specification.

In the specs, you’ll find answers to various questions: What is an m3u8 playlist, and what goes into it? How do you properly create an HLS stream? How do you parse it correctly on the client side? So, if you ever need to add handling of some intricate tag to the plugin, you cannot do without the specs.
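
For reference, here is a minimal media playlist (segment names are placeholders), followed by a variant playlist that lists several bitrates of the same stream:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.8,
segment0.ts
#EXTINF:9.7,
segment1.ts
#EXT-X-ENDLIST

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,RESOLUTION=1280x720
high/index.m3u8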

Now that you know how to prepare videos for the plugin, let’s walk through the internal processing (a condensed code sketch follows the list):

1. An M3U8Element is created for the m3u8 playlist, based on the link retrieved from the URLResource. The M3U8Element uses M3U8Loader to download the required playlist.
2. The downloaded playlist (which is, essentially, a text file) is passed to M3U8PlaylistParser for processing. The output is an M3U8Playlist containing M3U8Items (or other M3U8Playlists, in the case of multi-bitrate streaming).
3. At this stage, we already have the playlist in parsed form. It is tempting to send it straight to processing, but first we need to create an HLSDynamicStreamingResource (inherited from DynamicStreamingResource) based on the parsed playlist data. Without this step, OSMF cannot handle a multi-bitrate playlist properly.
4. The HLSDynamicStreamingResource is then passed back to the player in place of the original URLResource.
5. Now we have the resource, so what are we waiting for? Let’s launch the video! "Stop, stop, stop," OSMF will say, and first transforms your playlist (i.e., the current resource) into an HTTPStreamingHLSIndexInfo via HTTPStreamingHLSFactory.
6. This IndexInfo is almost an OSMF-ready product. "Almost?!" you may protest. Yes, almost: it is just the data used by HTTPStreamingHLSIndexHandler to load the main content, i.e., the video stream itself.
7. And what happens now? Here’s what: HTTPStreamingHLSIndexHandler receives a command from OSMF: getNextFile(). This happens either on a timer or on a seek. In response, OSMF is asked to download the next chunk from the playlist. When the video chunk has been downloaded and HTTPStreamSource enters the READ state, it tells HTTPStreamingMP2TSFileHandler to process what it has downloaded via processFileSegment() (we’ll analyze this in more detail later).
8. If HTTPStreamingHLSIndexHandler has not yet reached the end of the playlist, it requests the next chunk of the video stream (back to the previous step), and so on until the playlist ends.
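
Here is a condensed sketch of steps 1 through 3. The class names are the plugin’s, but the calls below are illustrative guesses rather than its exact internal API (the real flow is asynchronous and event-driven):

// rawPlaylist stands for the m3u8 text that M3U8Loader has fetched (step 1).
var rawPlaylist:String = "...";

// Step 2: turn the playlist text into an object model.
var parser:M3U8PlaylistParser = new M3U8PlaylistParser();
var playlist:M3U8Playlist = parser.parse(rawPlaylist); // illustrative signature

// Step 3: wrap the parsed data so that OSMF sees the variants
// as switchable bitrates of a single stream.
var resource:HLSDynamicStreamingResource = new HLSDynamicStreamingResource(playlist);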

Seemingly, that’s the end of the story, but again it’s not :). An inquisitive reader has probably already wondered: what about live streaming, where the playlist never ends? No surprises there either: when the index of the current segment is greater than or equal to the length of the chunk list and the stream is live, HTTPStreamingHLSIndexHandler simply requests a playlist reload, putting OSMF into a "waiting…" state. One important nuance: for multi-bitrate streaming, the specs recommend reloading only the current playlist rather than all of them (and the plugin behaves accordingly). With that in mind, processing the updated playlist boils down to two steps:
– Parse it with M3U8PlaylistParser
– Update the current HTTPStreamingHLSIndexInfo

In Matthew’s original version of the plugin, playlist processing was the same in all cases, so without a DynamicStreamingResource OSMF could not correctly process multi-bitrate playlists.

And since we have touched upon live streaming, let’s discuss the difference between DVR and conventional live streaming.

According to the standard, they have just one thing in common: the absence of the #EXT-X-ENDLIST tag at the end. The only difference is how the playlist is updated (see the example after the list):

– In DVR, new chunks are appended to the playlist;
– In plain live streaming, the chunks are rotated: the playlist has a fixed length, but each time it arrives with new chunks, the number in the #EXT-X-MEDIA-SEQUENCE tag is increased by the number of chunks that were replaced.
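
For instance, two consecutive snapshots of a rotated live playlist might look like this (segment names are placeholders):

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:3
#EXTINF:10,
segment3.ts
#EXTINF:10,
segment4.ts
#EXTINF:10,
segment5.ts

After the next update, segment3.ts has been rotated out and the sequence number has grown by one:

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:4
#EXTINF:10,
segment4.ts
#EXTINF:10,
segment5.ts
#EXTINF:10,
segment6.ts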

Well, this is how the plugin plays back the HLS video.

Brief anatomy of a TS video stream

Well, TS segment parsing is described in the same specification (and the references cited therein). Here, I’m just going to show what the plugin handles (a header-parsing sketch follows the list):

1. The first thing we need is the 0x47 sync byte; the handler scans for it up to the last byte, and if it can’t find one, it has received something "wrong".
2. Then follow the remaining 187 bytes of the 188-byte packet, beginning with three more header bytes:
– The second byte contains the error indicator (0x80 bit), the payload unit start indicator (0x40 bit), and the transport priority indicator (0x20 bit)
– Then follows the packet ID (13 bits spanning the rest of the second byte and the third byte), which determines further data processing
– The fourth byte contains the scrambling control bits, the adaptation field bit, the payload present bit, and the continuity counter.
Next, if the adaptation field bit is set, its length is read, and that’s all: the plugin does not process this data, and we go directly to the payload itself (if present, which the payload bit signals):
Depending on the packet ID, the data differs:
– Packet ID == 0: if the payload unit start indicator is set, the PAT data is processed; otherwise an empty ByteArray is returned. The second option is of no interest to us, so let’s look at what the PAT contains:
– The first byte is not used
– The second byte contains the table ID
– Then follow 2 bytes of table length
– The next 5 bytes are skipped
– From the subsequent bytes, pmtPID is read and stored for further processing
– The last 4 bytes of CRC data are also skipped
– Packet ID == pmtPID (which we read previously): the conditions are the same as for the PAT. What’s inside:
– The first byte is not used
– tableID (must be equal to 0x02)
– 2 bytes of table length
– 7 bytes are skipped (version, reserved bytes, etc.)
– programInfo length
– programInfo (skipped)
– Data type byte (0x1b for H.264 video, 0x0f for AAC audio/ADTS; other data types are not processed by the plugin)
– The data packet PID
– Length of the remaining data (used to skip it)
– CRC bytes are not used
– Packet ID == audioPID (the PID from the PMT, if the type was audio): the audio stream is simply processed. You can familiarize yourself with the processing by looking at the processES function in the HTTPStreamingMP2PESAudio class
– Packet ID == videoPID (the same as for audio): video stream processing. As with the audio, I recommend looking at the processES function, this time in the HTTPStreamingMP2PESVideo class
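
To make the header layout concrete, here is a minimal sketch of reading the 4-byte TS packet header. It follows the MPEG-TS layout described above, not the plugin’s actual code:

import flash.utils.ByteArray;

// Parse the 4-byte header of a single 188-byte TS packet.
function parseTSHeader(packet:ByteArray):Object {
  packet.position = 0;
  if (packet.readUnsignedByte() != 0x47)        // sync byte
    throw new Error("Lost TS sync");
  var b1:uint = packet.readUnsignedByte();
  var b2:uint = packet.readUnsignedByte();
  var b3:uint = packet.readUnsignedByte();
  return {
    errorIndicator:   (b1 & 0x80) != 0,
    payloadUnitStart: (b1 & 0x40) != 0,         // a PSI section or PES packet starts here
    pid:              ((b1 & 0x1f) << 8) | b2,  // 13-bit packet ID
    hasAdaptation:    (b3 & 0x20) != 0,         // adaptation field present
    hasPayload:       (b3 & 0x10) != 0,         // payload present
    continuityCount:  b3 & 0x0f
  };
}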

Wishing seamless video playback to all your players! 🙂
