[foms] What data is needed for adaptive stream switching?

Frank Galligan fgalligan at google.com
Mon Nov 29 08:35:56 PST 2010


On Wed, Nov 24, 2010 at 4:20 AM, Jeroen Wijering
<jeroen at longtailvideo.com> wrote:

>
> On Nov 23, 2010, at 10:24 PM, Chris Pearce wrote:
>
> > Thanks for explaining Mark. Much appreciated.
> >
> > On 24/11/2010 6:51 a.m., Mark Watson wrote:
> >> This is where there is scope for experimentation. What I think would be
> great is to define an API which can indicate these decision points, provide
> the two data sets (past incoming bandwidth) and (future bitrate of each
> stream) at some sufficient level of generality and indicate back the
> decision. Then we can experiment with more and less complex input data and
> more and less complex decision algorithms.
> >
> > So in terms of what changes to browsers we'd need to make to start
> > experimenting, we'd need to resurrect the @bufferedBytes attribute, add
> > a @currentOffset attribute, and add some way for JS to access the
> > RAP/keyframe index? Maybe we should add the @bufferedBytes data into
> > @buffered, so you can easily map buffered time ranges to byte ranges? I
> > guess these would have to be per-stream if we're playing multiple
> > independent streams.
> >
> > Or would you prefer an explicit download bandwidth and a per stream
> > bitrate measure, calculated by the browser, over a specified time window?
>
> A per-stream @bufferedBytes / @currentOffset would be perfect. No need for
> global bandwidth IMO.
>
> As to framedrops / CPU load: I think a @framesDropped / @framesDecoded
> attribute pair still makes most sense.
>
I don't think @framesDecoded would be very useful by itself because of VFR
material. Adding @framesRendered over the same interval would be useful: the
player could compare how many frames were decoded against how many were
actually rendered to see whether there was a CPU issue. I also think adding
@avgJitter over the same interval could be useful. If @framesRendered ==
@framesDecoded but @avgJitter > 0, that could still signify a CPU issue.

This still doesn't address the hidden-frames issue that Andy brought up. We
could add @framesHidden, or, if the video is hidden, just have the media
pipeline increment both @framesDecoded and @framesRendered for each frame
decoded, since I don't think the player really cares whether a frame was
truly rendered while the video is hidden.
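
On the earlier @bufferedBytes / @currentOffset point, a rough sketch of how a
player could poll those proposed per-stream attributes to estimate incoming
bandwidth and notice when a chunk has finished loading. Again, the attributes
and the pollStats() helper are hypothetical, just following the names used in
this thread:

    // Poll the proposed per-stream byte counters to estimate download
    // bandwidth and detect the end of a chunk. All names are proposed only.
    var last = null;

    function pollStats(video, chunkEndOffset) {
      var now   = Date.now();
      var bytes = video.bufferedBytes;   // bytes buffered so far (proposed)
      var bps   = 0;
      if (last) {
        // bits received since the previous poll, over elapsed seconds
        bps = (bytes - last.bytes) * 8 / ((now - last.time) / 1000);
      }
      last = { bytes: bytes, time: now };

      // @currentOffset tells how far the download has progressed, so
      // polling it is enough to notice that a chunk has finished loading,
      // without needing a dedicated event.
      var chunkLoaded = video.currentOffset >= chunkEndOffset;

      return { bandwidthBps: bps, chunkLoaded: chunkLoaded };
    }

    // e.g. setInterval(function () { pollStats(video, end); }, 1000);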

Frank


>
> > In terms of an API which can indicate decision points, maybe an event
> > which fires when playback enters a new chunk? Or fires when the download
> > of a chunk finishes? Be aware we can't guarantee that any DOM events
> > which we fire will arrive in a particularly timely manner, there could
> > be any number of other things going on with the event queue.
>
> Depending upon the way the chunks are loaded into the videoElement
> (<stream>s with range-requests? Stream() API with chunks?), there might
> already be a way to derive whether a chunk has loaded (polling
> @bufferedBytes). An event would be nice, but not needed at present I think.
>
> Likewise, @currentTime of a videoElement should be enough to determine
> whether a new chunk is already playing or not, for now.
>
>
> Kind regards,
>
> Jeroen
> _______________________________________________
> foms mailing list
> foms at lists.annodex.net
> http://lists.annodex.net/cgi-bin/mailman/listinfo/foms
>