OSMF HLS Plugin

Having been involved in the Together project, I was assigned the task of enabling Apple HLS video playback on the Flash platform. Delivering video content in a single format (HLS in this case) keeps the pipeline simple and offers many benefits. For video processing, Flash has the open source OSMF framework, which can easily be extended with plugins. But there is one problem: the framework is completely HLS-agnostic. Adobe promoted RTMP first, and only later offered HTTP Dynamic Streaming (HDS) as an alternative to Apple HLS. In this post, we’ll cover a free HLS plugin that we have developed to run HLS in OSMF-based video players.

Our HLS OSMF plugin has been published on GitHub. Let’s examine it more closely.

The plugin is based on Matthew’s HLS plugin (alas, the original download link is broken, but a fork is available on GitHub). I have reworked the plugin to enable correct multi-bitrate streaming and added support for DVR streaming. The video stream processing part was fine, so I left it intact: it extracts H.264 video from the MPEG-TS stream and plays it back via NetStream.appendBytes(). The part responsible for handling m3u8 playlists, however, has been completely rewritten. The plugin mechanism is identical to HDS (a video stream handler plus an index file handler).
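
As a quick illustration of that playback mechanism (a minimal sketch, not the plugin’s code): the NetStream is put into “data generation mode” with play(null), and FLV-formatted bytes produced by the TS demuxer are pushed in via appendBytes(). The flvBytes variable below is just a placeholder for that output.

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;
import flash.utils.ByteArray;

var connection:NetConnection = new NetConnection();
connection.connect(null); // no media server: we feed the bytes ourselves

var stream:NetStream = new NetStream(connection);
stream.play(null); // switch the NetStream into data generation mode

// Signal that a fresh byte sequence begins (also used after a seek).
stream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);

// Placeholder: FLV tags produced from the demuxed TS segment.
var flvBytes:ByteArray = new ByteArray();
stream.appendBytes(flvBytes);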

First, let’s look at some examples of how to use the plugin.

Usage:
1. In the StrobeMediaPlayback video player:
- Connect HLSDynamicPlugin.swf like any other plugin. In the flashvars variable, specify:

flashvars = {
…,
hls_plugin: "url/to/HLSDynamicPlugin.swf"
}

- Or statically attach HLSPlugin.swc to the StrobeMediaPlayback project and, in the onChromeProviderComplete(event:Event) function, after the line

factory = injector.getInstance(MediaFactory);

add the following code:

factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onLoadPlugin);
factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
factory.loadPlugin(new PluginInfoResource(new HLSPluginInfo()));

2. For your own OSMF player you have two options:
- Statically attach HLSPlugin.swc to the project and load it using DefaultMediaFactory:

private function initPlayer():void {
  var factory:DefaultMediaFactory = new DefaultMediaFactory();
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onLoadPlugin);
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
  factory.loadPlugin(new PluginInfoResource(new HLSPluginInfo()));
  var res:URLResource = new URLResource(HLS_VIDEO);
  var element:MediaElement = factory.createMediaElement(res);
  if (element == null) throw new Error('Unsupported media type!');
  var player:MediaPlayer = new MediaPlayer(element);
  var container:MediaContainer = new MediaContainer();
  container.addMediaElement(element);
  container.scaleX = .75;
  container.scaleY = .75;
  addChild(container);
}

- Or use HLSDynamicPlugin.swf, loading it dynamically via DefaultMediaFactory:

private function initPlayer():void {
  var factory:DefaultMediaFactory = new DefaultMediaFactory();
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onComplete);
  factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onError);
  factory.loadPlugin(new URLResource(URL_TO_PLUGIN));
  function onComplete(e:MediaFactoryEvent):void {
    var res:URLResource = new URLResource(HLS_VIDEO);
    var element:MediaElement = factory.createMediaElement(res);
    if (element == null) throw new Error('Unsupported media type!');
    var player:MediaPlayer = new MediaPlayer(element);
    var container:MediaContainer = new MediaContainer();
    container.addMediaElement(element);
    container.scaleX = .75;
    container.scaleY = .75;
    addChild(container);
  }
  function onError(e:MediaFactoryEvent):void {
    trace("plugin load error!");
  }
}

Now, all you need to do is pass the player a link to your video stream (an m3u8 playlist). Enjoy your video!

Now, let’s consider the technical features of the plugin.

What do we have "under the hood"?

First, let’s read the HLS Specification.

In the specs, you’ll find answers to questions such as: What is an m3u8 playlist, and what goes into it? How do you properly create an HLS stream? How do you parse it correctly on the client side? So, if you need to add processing of some intricate tag to the plugin, you cannot do without the specs.
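
For reference, here is a minimal, purely illustrative pair of playlists (not taken from the plugin’s test data): a master playlist describing two bitrate variants, and the media playlist of one of them.

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2000000,RESOLUTION=1280x720
high/index.m3u8

And low/index.m3u8, a VOD media playlist (the #EXT-X-ENDLIST tag marks it as complete):

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST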

Now that you know how to prepare videos for the plugin, let’s discuss the internal processing:

1. Create an M3U8Element for the m3u8 playlist based on the link retrieved from URLResource. The M3U8Element uses M3U8Loader to download the playlist needed.
2. The playlist downloaded (which is, essentially, a text file) is passed to M3U8PlaylistParser for processing. The output is the M3U8Playlist containing M3U8Items (or other M3U8Playlists, in case of multi-bitrate streaming).
3. At this stage, we already have the playlist in a parsed, “binary” form. It is tempting to send it straight off for processing, but first we need to create an HLSDynamicStreamingResource (inherited from DynamicStreamingResource) based on the parsed playlist data; without this step, OSMF cannot properly handle a multi-bitrate playlist (see the sketch after this list).
4. Now, our HLSDynamicStreamingResource is passed back to the player, instead of the original URLResource.
5. Now we have the resource, so what are we waiting for? Let’s launch the video! “Stop, stop, stop,” OSMF will tell you, and first transforms your playlist (i.e., the current resource) into an HTTPStreamingHLSIndexInfo by means of HTTPStreamingHLSFactory.
6. And this IndexInfo is almost an OSMF-ready product. “Almost?!” you may exclaim. Yes, almost: it is just the data that HTTPStreamingHLSIndexHandler uses to load the main content, i.e., the video stream.
7. And what do we have now? Here’s what:
HTTPStreamingHLSIndexHandler receives the getNextFile() command from OSMF. This happens either on a timer or on a seek. In response, OSMF is asked to download the next chunk from the playlist. Once the video chunk has been downloaded and HTTPStreamSource enters the READ state, it tells HTTPStreamingMP2TSFileHandler: use processFileSegment() to process what I have downloaded (we’ll look at this in more detail later).
8. If HTTPStreamingHLSIndexHandler has not yet found the end of the playlist, it requests the next chunk of the video stream (back to step 7), and keeps doing so until the playlist ends.
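
As a rough illustration of step 3 (not the plugin’s actual code), here is how a parsed multi-bitrate playlist could be mapped onto OSMF’s DynamicStreamingResource so that the switching logic can see the individual bitrates; the variants array stands in for whatever M3U8PlaylistParser produces.

import org.osmf.net.DynamicStreamingItem;
import org.osmf.net.DynamicStreamingResource;
import org.osmf.net.StreamType;

// Hypothetical parser output: one entry per variant playlist (bitrate in kbps).
var variants:Array = [
  { url: "http://example.com/low/index.m3u8",  bitrate: 800 },
  { url: "http://example.com/high/index.m3u8", bitrate: 2000 }
];

// A DynamicStreamingResource is what OSMF's multi-bitrate machinery expects.
var resource:DynamicStreamingResource =
  new DynamicStreamingResource("http://example.com/master.m3u8", StreamType.RECORDED);
var items:Vector.<DynamicStreamingItem> = new Vector.<DynamicStreamingItem>();
for each (var v:Object in variants) {
  // Each DynamicStreamingItem pairs a stream name/URL with its bitrate.
  items.push(new DynamicStreamingItem(v.url, v.bitrate));
}
resource.streamItems = items;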

Seemingly, that’s the end of the story, but again it’s not :) . An inquisitive reader has probably already wondered: “What about live streaming? After all, the playlist there never ends.” No surprises here either: when the index of the current segment is greater than or equal to the length of the chunk list and the stream is live, HTTPStreamingHLSIndexHandler simply requests a playlist reload, putting OSMF into the “waiting…” state (a small sketch of this check follows the list below). A very important nuance here: for multi-bitrate streaming, the spec recommends reloading only the current playlist rather than all playlists (and the plugin behaves the same way). With that nuance in mind, processing of the updated playlist comes down to two steps:
- Parse with M3U8PlaylistParser
- Update current HTTPStreamingHLSIndexInfo
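
A minimal sketch of that reload check, with placeholder names (segments, isLive, reloadCurrentPlaylist and so on are stand-ins for the corresponding pieces of HTTPStreamingHLSIndexHandler, not its real members):

// Called when OSMF asks for the next chunk via getNextFile().
private function nextSegment(currentIndex:int):void {
  if (currentIndex >= segments.length) {
    if (isLive) {
      // Live stream: no more known chunks yet, so re-fetch only the current
      // variant playlist and let OSMF sit in the "waiting" state meanwhile.
      reloadCurrentPlaylist();
    } else {
      // VOD: the playlist has ended, playback is complete.
      signalEndOfStream();
    }
    return;
  }
  requestSegment(segments[currentIndex]);
}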

In Matthew’s original version of the plugin, playlist processing was the same in all cases, and without a DynamicStreamingResource OSMF could not correctly handle multi-bitrate playlists.

And, once we have touched upon live streaming, let’s discuss the difference between DVR and conventional live streaming:

According to the standard, they have just one thing in common: the absence of the #EXT-X-ENDLIST tag at the end. The only difference is how the playlist is updated:

- In DVR, new chunks are simply appended to the playlist;
- In plain live streaming, the chunks are “rotated”: the playlist has a fixed length, but each time it is refreshed with new chunks, the number in the #EXT-X-MEDIA-SEQUENCE tag is increased by the number of chunks that were replaced (see the example below).
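
To make the rotation concrete, here are two consecutive refreshes of a hypothetical live playlist: one chunk has dropped off the front, one has been appended at the back, and #EXT-X-MEDIA-SEQUENCE has grown by one accordingly.

First refresh:

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:100
#EXTINF:10.0,
segment100.ts
#EXTINF:10.0,
segment101.ts
#EXTINF:10.0,
segment102.ts

Next refresh (segment100.ts dropped, segment103.ts appended):

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:101
#EXTINF:10.0,
segment101.ts
#EXTINF:10.0,
segment102.ts
#EXTINF:10.0,
segment103.ts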

Well, this is how the plugin plays back the HLS video.

Brief anatomy of a TS video stream

Well, TS segment parsing is described in the same specification (and the references cited therein). Here, I’m just going to show what the plugin handles (a minimal header-parsing sketch follows the list below):

1. The first thing we need is the 0x47 sync byte; the handler scans for it right up to the last byte, and if it fails to find one, it has received something “wrong”.
2. Then the remaining 187 bytes of the packet follow, of which:
- The second byte contains: the error indicator (the 0x80 bit), the payload unit start indicator (the 0x40 bit), and the transport priority indicator (the 0x20 bit)
- Then follows the packet ID, which determines how the rest of the data is processed
- The 4th byte contains: the scrambling control bits, the adaptation field bit, the payload present bit, as well as the continuity counter.
Next, if the adaptation field bit is set, its length is read, and that’s all… the plugin does not process this data, so we go straight to the payload itself (if it is present, which is signaled by the payload present bit):
Depending on the packet ID value, the data may differ:
- Packet ID == 0: if the payload unit start indicator is set, the PAT data is processed; otherwise an empty ByteArray is returned. The second case is not interesting to us, so let’s look at what is in the PAT:
- The first byte is not used
- The second byte contains the table ID
- Then follow 2 bytes of table length
- The next 5 bytes are skipped
- From the subsequent bytes, pmtPID is read and stored for further processing
- The last 4 bytes of CRC data are also skipped
- Packet ID == pmtPID (which we read previously): the conditions are the same as for the PAT. What’s inside:
- The first byte is not used
- tableID (must be equal to 0x02)
- 2 bytes of table length
- 7 bytes are skipped (version, reserved bytes, etc.)
- programInfo length
- programInfo (skipped)
- Stream type byte (0x1b – H.264 video, 0x0f – AAC audio / ADTS; other stream types are not processed by the plugin)
- data packet PID
- Length of the remaining data (used to skip it)
- CRC bytes are not used
- Packet ID == audioPID (the PID from the PMT, if the stream type was audio): the audio stream is processed here. You can get familiar with the processing by looking at the processES function in the HTTPStreamingMP2PESAudio class
- Packet ID == videoPID (same as for audio): video stream processing. As with the audio, I recommend looking at the processES function, but in the HTTPStreamingMP2PESVideo class
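
A minimal header-parsing sketch, assuming the TS segment has already been read into a ByteArray. This is only an illustration of the packet layout described above (the parsePacketHeader name is mine), not the plugin’s actual HTTPStreamingMP2TSFileHandler code:

import flash.utils.ByteArray;

// Parse the header of one 188-byte MPEG-TS packet at the current position.
function parsePacketHeader(ts:ByteArray):Object {
  // 1. Sync byte: every TS packet must start with 0x47.
  if (ts.readUnsignedByte() != 0x47) {
    throw new Error("Lost TS sync (0x47 not found)");
  }
  // 2. Bytes 2-3: flags and the 13-bit packet ID (PID).
  var b1:uint = ts.readUnsignedByte();
  var b2:uint = ts.readUnsignedByte();
  var errorIndicator:Boolean   = (b1 & 0x80) != 0;
  var payloadUnitStart:Boolean = (b1 & 0x40) != 0;
  var priority:Boolean         = (b1 & 0x20) != 0;
  var pid:uint = ((b1 & 0x1f) << 8) | b2;
  // 3. Byte 4: scrambling control, adaptation/payload flags, continuity counter.
  var b3:uint = ts.readUnsignedByte();
  var hasAdaptationField:Boolean = (b3 & 0x20) != 0;
  var hasPayload:Boolean         = (b3 & 0x10) != 0;
  var continuityCounter:uint     = b3 & 0x0f;
  // 4. Skip the adaptation field, if present; the plugin ignores its contents too.
  if (hasAdaptationField) {
    var adaptationLength:uint = ts.readUnsignedByte();
    ts.position += adaptationLength;
  }
  return { pid: pid, payloadUnitStart: payloadUnitStart, hasPayload: hasPayload,
           continuityCounter: continuityCounter, errorIndicator: errorIndicator,
           priority: priority };
}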

Wishing seamless video playback to all your players! :-)

22 thoughts on “OSMF HLS Plugin”

  1. Seems like it doesn’t support AES decryption yet. Does it have a priority?

  2. Hi, Great implementation!

    I have a few questions i hope you can answer.

    Would you able to integrate OVA for as3 into a custom osmf player?

    Would you be able to extend the plugin to add support for dropped and loss packets from the media server side?

    Can you please send me an email answering those questions and if you are available for freelance work?

    Thanks.

  3. Your plugin is good, but one thing that I’ve noticed, when it is a “live” stream and you seek backwards, it tries to “cache up” a whole lot of segments before continuing playing.

  4. nice work!!! with vod and live stream the plugin works fine…but with dvr stream fails when you rewind to any second. example: when you click on the timeline the second X the player start buffering and downloading segments from the seconds 0 until reach the second X then continue playing. excuse my english. regards.

  5. When do you plan to implement the standard HLS encryption mechanism (using AES and a method of secure key distribution) in your plugin ?

  6. Hello ! Thanks for the great work !!

    For me VOD and Live are working fine. But in DVR, I could not make seeking functionality with control bar. can anyone guide me ?

  7. Video stalls when it switches from lower bandwidth to higher for brief moment.

    Any advice ?
    Thanks in advance

  8. One more observation, When I pause the video players keeps loading segments until end. Its should stop like normal HDS at maxBufferTime usually 30 Sec.

  9. I am live broadcasting hls and play .m3u8 file on StrobeMediaPlayback.html
    So it is work but buffering and live acivity related problem.
    i change hls related parameter but nothing change.
    so how can i feel live streaming using this.
    so i required your help,
    Mine Email:dipen@shangyeguwen.com
    Thank in Advance…

    Regards,
    Dipen

  10. Hi Vick,
    Yes, you’re right, we don’t have buffer limit for chunks at the moment. Regarding not smooth quality change we need more details on how do you do that (automatic or manual switch).
    Thanks,
    Denis.

  11. We’ve never tested this feature. I think no. If you feel the feature to be very important then please submit an issue onto the GitHub project page.
