[foms] WebM Manifest
Philip Jägenstedt
philipj at opera.com
Thu Mar 17 02:14:18 PDT 2011
On Wed, 16 Mar 2011 17:03:13 +0100, Mark Watson <watsonm at netflix.com>
wrote:
> Hi Philip,
>
> A couple of comments below...
>
> On Mar 15, 2011, at 2:01 AM, Philip Jägenstedt wrote:
>
>> On Mon, 14 Mar 2011 22:56:23 +0100, Frank Galligan
>> <fgalligan at google.com>
>> wrote:
>>
>>> If the Web and TV IG chooses the DASH manifest as the baseline manifest
>>> format for adaptive streaming, would all of you be OK with implementing
>>> it in your products?
>>
>> In short, no.
>>
>> I've previously glanced over the DASH spec and have done so again today,
>> and I am not very enthusiastic about it at all.
>>
>> Since the spec is written for a non-browser context, it fails to make
>> any use of existing browser infrastructure. Everything is done
>> declaratively, whereas in a browser context one can leave many things
>> to be dealt with using scripts. I think we should aim for a solution
>> that doesn't strictly require fetching a manifest file over HTTP
>> repeatedly; we could just as well build a solution using WebSockets,
>> just to name one possibility.
>>
>
> This was deliberate, as it was recognized that HTML/JavaScript
> environments were not the only ones where adaptive streaming would be
> needed. We wanted to have a solution which was independent of the
> presentation framework, and I think it would be valuable for the
> industry to have such a single solution, rather than multiple solutions
> for different environments.
Would it be an acceptable outcome if one could deliver to browsers using a
DASH manifest with some JavaScript glue to parse the manifest and map it
onto the lower-level APIs that browsers provide?
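
Roughly, I imagine the glue looking something like the sketch below. The
element and attribute names (Representation, bandwidth, BaseURL) are from my
cursory reading of the DASH drafts and may not match the final spec, and the
lower layer that the parsed data would feed into is left abstract here:

// Sketch of the "JavaScript glue": parse a DASH manifest with DOMParser and
// pull out the available representations. Treat the element and attribute
// names as approximations of the DASH drafts, not as a standardized API.
interface Representation {
  bandwidth: number;  // advertised bitrate in bits per second
  baseUrl: string;    // where the media for this representation lives
}

function parseManifest(mpdText: string): Representation[] {
  const doc = new DOMParser().parseFromString(mpdText, "application/xml");
  const result: Representation[] = [];
  for (const rep of Array.from(doc.getElementsByTagName("Representation"))) {
    const baseUrl = rep.getElementsByTagName("BaseURL")[0]?.textContent ?? "";
    result.push({
      bandwidth: Number(rep.getAttribute("bandwidth") ?? 0),
      baseUrl,
    });
  }
  return result;
}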
> DASH does not require repeatedly fetching a manifest. The scenarios
> where repeated fetching is necessary are some specific live scenarios,
> but both live and on-demand can be implemented with a single manifest
> fetch.
OK, could you explain a bit which live scenarios require refetching and
which don't and how that works?
>> My position is that we should start from the bottom, implementing APIs
>> in browsers that make it possible to implement adaptive streaming using
>> scripts. If we succeed at that, then DASH could be implemented as a
>> JavaScript library. When we have that low-level API in place, I think we
>> should look at simplifying things for authors with a manifest format,
>> but I truly doubt taking DASH wholesale would be the best long-term
>> solution for either browser implementors or web authors.
>>
>
> When you say "low level API" do you mean the ability to provide
> information to the player about the various available streams ? Or do
> you mean even lower, where the API allows you to provide raw media data,
> or URLs for media chunks, with all the rate adaptation etc. implemented
> in the Javascript layer ?
The lowest level that I think we should provide for is fetching multiple
resources (chunks) using XMLHttpRequest and concatenating these into a
Stream object to which one can continuously append new chunks. Certainly
one could allow completely script-generated data to be spliced in between
chunks, if there's any utility in this. Bitrate switching logic would have
to be implemented in JavaScript based on current playback position,
download speed, etc.
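
To make that concrete, here is a rough sketch of what I mean. The ChunkSink
interface and its appendChunk() method are made up (nothing like it exists in
browsers today), and the switching heuristic is deliberately naive:

// Minimal sketch of the low-level approach: fetch chunks with XMLHttpRequest,
// measure throughput, pick a bitrate, and append the data to a hypothetical
// sink that splices it into the playing stream.
interface ChunkSink {
  appendChunk(data: ArrayBuffer): void;  // hypothetical: append one media chunk
}

// Fetch one media chunk over HTTP.
function fetchChunk(url: string): Promise<ArrayBuffer> {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open("GET", url);
    xhr.responseType = "arraybuffer";
    xhr.onload = () => resolve(xhr.response as ArrayBuffer);
    xhr.onerror = () => reject(new Error("chunk request failed: " + url));
    xhr.send();
  });
}

// Download chunks in order, measuring throughput and switching bitrate as we
// go. Assumes every bitrate has the same number of chunks.
async function pump(
  sink: ChunkSink,
  chunkUrlsByBitrate: Map<number, string[]>
): Promise<void> {
  const bitrates = [...chunkUrlsByBitrate.keys()].sort((a, b) => a - b);
  const chunkCount = chunkUrlsByBitrate.get(bitrates[0])!.length;
  let throughput = 0;  // bits per second, measured from the previous download
  for (let i = 0; i < chunkCount; i++) {
    // Pick the highest bitrate we think the connection can sustain, with margin.
    const candidates = bitrates.filter(b => b < throughput * 0.8);
    const bitrate = candidates.length > 0 ? candidates[candidates.length - 1] : bitrates[0];
    const start = Date.now();
    const data = await fetchChunk(chunkUrlsByBitrate.get(bitrate)![i]);
    const seconds = Math.max((Date.now() - start) / 1000, 0.001);
    throughput = (data.byteLength * 8) / seconds;
    sink.appendChunk(data);
  }
}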
When we have this done, I would suggest adding another layer to take away
the complexity of the switching logic. Something like an event that is
fired when it looks like we will run out of data might be enough.
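
For example, something along these lines, where the event name is invented
purely for illustration:

// Hypothetical higher layer: the browser fires a made-up "underrunwarning"
// event when it predicts the buffer will run dry; the script only has to
// decide which representation to fetch the next chunk from.
const video = document.querySelector("video")!;
video.addEventListener("underrunwarning", () => {
  // e.g. drop to a lower-bitrate representation and queue the next chunk request
});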
One can of course add more and more layers on top of this, up to and
including a manifest format that takes care of everything.
> The latter was discussed before on this list. It is very attractive from
> the point of view of enabling experimentation with rate adaptation
> algorithms, but my conclusion after the discussion was that practically
> it would be difficult to come up with an API that was rich enough to
> enable meaningful experimentation. A simple "switch up/down" method call
> is not sufficient to implement a working adaptation algorithm: you need
> to know about switch points, byte ranges within streams, stream VBR
> profiles, and a more detailed view of past throughput than just an
> instantaneous or fixed-window throughput measure.
We'll have to provide the buffering and playback statistics that are
needed to make it work. Browsers would have to have this information
internally to be able to implement adaptive streaming "natively", so it's
just a question of exposing it to JavaScript.
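
Roughly the kind of thing I mean (all property names below are invented):

// Illustration only: the sort of statistics a browser could expose to scripts
// so that adaptation logic can be written in JavaScript. None of these names
// exist in any specification.
interface PlaybackStatistics {
  readonly bufferedAheadSeconds: number;  // media buffered past the playhead
  readonly droppedFrames: number;         // frames decoded too late to be shown
  readonly recentDownloads: Array<{ bytes: number; milliseconds: number }>;  // throughput history
}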
> An alternative approach suggested at the Web & TV Interest Group was that
> players might support a number of different algorithms of varying
> maturity, much as OS kernels support a variety of TCP congestion control
> algorithms. There would need to be an API to discover and configure the
> various algorithms.
Yeah, that's not a bad idea, and is not in conflict with providing the
tools necessary to implement other algorithms completely in JavaScript.
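
For the sake of argument, such a discovery/configuration API could be as
small as this (every name here is invented):

// Purely illustrative: discover and select built-in adaptation algorithms,
// analogous to choosing a TCP congestion-control algorithm. No such API
// exists; every name below is made up.
interface AdaptationAlgorithm {
  readonly name: string;                            // e.g. "conservative", "aggressive"
  configure(options: Record<string, unknown>): void;
}

interface AdaptiveVideoElement extends HTMLVideoElement {
  readonly adaptationAlgorithms: AdaptationAlgorithm[];  // algorithms the UA ships with
  selectAdaptationAlgorithm(name: string): void;         // pick one by name
}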
>> I think that those browser vendors that are interested in a streaming
>> solution using open formats should get together and start to discuss a
>> technical solution. This list or the WHATWG would be sensible venues for
>> that, IMO.
>>
>
> When you say "browser vendors", who do you mean ? If you mean
> Opera/Mozilla/Google/Apple/Microsoft then I think the stakeholders for
> this topic include a much wider range of companies. Web technologies are
> finding application in many new environments where adaptive streaming is
> important (TVs and TV-connected devices being the most interesting for
> my company). The W3C recently set up the Web & TV Interest group and
> that might also be a good venue to get involvement from more
> stakeholders.
Yes, I mean those web browser vendors. I've already joined the Web & TV IG
and am following that discussion too. The work needs to get done; I'm not
fussed about the venue.
--
Philip Jägenstedt
Core Developer
Opera Software