[foms] Adaptive streaming

Jeroen Wijering jeroen at longtailvideo.com
Fri Oct 29 07:43:50 PDT 2010


On Oct 27, 2010, at 5:51 PM, Mark Watson wrote:

>> Do files with a compacted index still play on devices that do not expect it? Think current mobile phones.
> 
> In DASH it's just a fragmented MP4 file with an extra box ("Segment Index") near the beginning. Existing players would ignore the new box.

Do you have an example video? I'm curious to see one and do a few tests. 


>> If WebM / Ogg by its nature already has much smaller headers, one of the big drawbacks of range-request streaming would not be valid for these containers. 
> 
> I need to read up on the WebM format (bit of a confession, being on this list, sorry!). But I understood you have the framing information with the samples and therefore smaller headers and probably no need for any kind of formal fragmentation. So your "fragment" would just be a notional concept of a group of samples spanning some time period. Your index would provide time and byte offsets for these fragments. If you already have something which provides time and byte offsets for Random Access Points (for seeking) you could probably re-use that. (Re-using Movie Fragment Random Access box in mp4 was discussed instead of creating the new Segment Index. Segment Index was chosen for some slightly obscure technical reasons).

Just did a quick test with a 15-minute WebM video (keyframes 2-6s):

http://content.bitsontherun.com/videos/a95zAVN1-710492.webm

The first Cluster header appears at the 44 kB mark. 

That's pretty nice compared to an MP4 conversion with the same keyframe interval, where the MOOV box is 480k:

http://content.bitsontherun.com/videos/a95zAVN1-600332.mp4

So it looks like my concern with range requests (startup delays due to big headers) is not valid for WebM files....
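
As a side note, a minimal sketch of that test from script, assuming the server honors Range requests and using fetch purely for illustration: the Cluster element ID 0x1F43B675 comes from the Matroska/WebM spec, while the function name and probe size are made up.

// Sketch: request the first chunk of a WebM file with an HTTP Range request
// and scan for the Cluster element ID (0x1F 0x43 0xB6 0x75) to see how far
// into the file the first Cluster starts.
async function findFirstCluster(url, probeBytes) {
  const res = await fetch(url, { headers: { Range: "bytes=0-" + (probeBytes - 1) } });
  const buf = new Uint8Array(await res.arrayBuffer());
  for (let i = 0; i + 3 < buf.length; i++) {
    if (buf[i] === 0x1F && buf[i + 1] === 0x43 && buf[i + 2] === 0xB6 && buf[i + 3] === 0x75) {
      return i; // byte offset of the first Cluster
    }
  }
  return -1; // not found within the probed bytes
}

// findFirstCluster("http://content.bitsontherun.com/videos/a95zAVN1-710492.webm", 65536)
//   .then(function (offset) { console.log("first Cluster at byte", offset); });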

----

If the videoElement exposed a callback with the valid byte ranges to JavaScript, the scripting layer would know which ranges exist within a quality level. For example, something like an "onSeekpoints" event that returns:

[
 {position: 0.000, start: 8839,  end: 9957},
 {position: 1.287, start: 9957,  end: 10994},
 {position: 3.071, start: 10995, end: 19840}
]
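
To illustrate how a script could consume such a list (a sketch only; the onSeekpoints data and field names above are hypothetical), mapping a seek time to the byte range of the enclosing fragment might look like:

function rangeForTime(seekpoints, time) {
  // Pick the last seekpoint at or before 'time', i.e. the keyframe to start from.
  var match = seekpoints[0];
  for (var i = 0; i < seekpoints.length; i++) {
    if (seekpoints[i].position <= time) { match = seekpoints[i]; } else { break; }
  }
  return { start: match.start, end: match.end };
}

// rangeForTime(seekpoints, 1.5) -> { start: 9957, end: 10994 }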

Next, this info could be used for adaptive streaming via an appendVideo() function:

videoElement.appendVideo("http://example.com/video_800.webm", 8839, 9957);

With this, plus some way to retrieve the bandwidth / bytesLoaded, it should be possible to build a fully functional WebM adaptive streaming demo in JavaScript, using existing WebM files. 
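
For example (purely a sketch: appendVideo() and the per-quality seekpoint lists are the proposed APIs from above, not something that exists today, and the extra quality URLs and the bandwidth figure are assumptions), the switching logic could be as simple as:

// Hypothetical quality levels; the seekpoints arrays would be filled from the
// proposed onSeekpoints callback for each file.
var qualities = [
  { url: "http://example.com/video_300.webm",  bitrate: 300000,  seekpoints: [] },
  { url: "http://example.com/video_800.webm",  bitrate: 800000,  seekpoints: [] },
  { url: "http://example.com/video_1500.webm", bitrate: 1500000, seekpoints: [] }
];

function pickQuality(bandwidth) {
  // Highest bitrate that fits within ~80% of the measured bandwidth (bits/s).
  var pick = qualities[0];
  for (var i = 0; i < qualities.length; i++) {
    if (qualities[i].bitrate < bandwidth * 0.8) { pick = qualities[i]; }
  }
  return pick;
}

function appendNextFragment(videoElement, fragmentIndex, bandwidth) {
  var quality = pickQuality(bandwidth);
  var point = quality.seekpoints[fragmentIndex];
  if (!point) { return; } // no more fragments
  videoElement.appendVideo(quality.url, point.start, point.end);
}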

Would this work? Are there oversights? What do the FOMS browser developers think of such a first step?

Kind regards,

Jeroen

