[theora-dev] Theora integration question

Timothy B. Terriberry tterribe at xiph.org
Tue Oct 16 20:25:27 PDT 2012


Engineering wrote:
> For example, I'm not quite sure what 'dct_tokens' is, but mallocing that in
> oc_dec_init() seems to be putting me over the edge. I notice that the size
> of those mallocs looks similar to what I'd expect the RGB pixel data to take
> for each movie.

The total size of dct_tokens is targeted at the worst case, with the 
expectation that the OS supports overcommit semantics, so that a much 
smaller number of pages actually gets allocated. Since you know which 
movies you're playing back, you could potentially instrument things and 
use a much smaller buffer, but that buffer might need to be updated as 
you add more movies, and this would also require more extensive code 
changes to make sure you're using the right portions of the buffer (it 
is split into 64 pieces, one for each zig-zag index).
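
If you do go that route, a minimal instrumentation sketch might look 
like the following (the tokens_used[] counts here are hypothetical; you 
would have to derive them from the decoder's actual per-index write 
positions):

  #include <stddef.h>
  #include <stdio.h>

  /*High-water mark of bytes consumed per zig-zag index, accumulated
     across every frame of every movie in your test set.*/
  static size_t max_used[64];

  static void log_token_usage(const size_t tokens_used[64]){
    int zzi;
    for(zzi=0;zzi<64;zzi++){
      if(tokens_used[zzi]>max_used[zzi])max_used[zzi]=tokens_used[zzi];
    }
  }

  static void dump_token_usage(void){
    int zzi;
    for(zzi=0;zzi<64;zzi++){
      printf("zzi %2i: %zu bytes\n",zzi,max_used[zzi]);
    }
  }

Calling log_token_usage() once per decoded frame and dump_token_usage() 
at exit would give you per-index sizes to allocate against, instead of 
the worst case.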

> Changing
>
>   //  _dec->dct_tokens=(unsigned char *)_ogg_malloc((64+64+1)*
>   //   _dec->state.nfrags*sizeof(_dec->dct_tokens[0]));
>
> To
>
>   unsigned char sambuf[MB(8)]; // global
>   _dec->dct_tokens=(unsigned char *)&sambuf[0];
>
> 'seems' to work, but not knowing the internals of Theora makes me
> nervous that I have broken something.

This will work fine. dct_tokens is only used while decoding a single 
frame. Its contents do not need to be remembered from frame to frame, so 
as long as you are not decoding two frames simultaneously using the same 
"sambuf" buffer, you should have no problem.

That said, if it were me, I would check that MB(8) is at least as large 
as (64+64+1)*_dec->state.nfrags*sizeof(_dec->dct_tokens[0]).
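
For instance, with the declarations from the quoted code in scope, a 
guard at the point where sambuf is substituted could be as simple as 
(a sketch; MB() is your macro from the quoted code):

  #include <assert.h>

  /*Fail loudly if the static buffer is smaller than the worst-case
     allocation it replaces.*/
  assert(sizeof(sambuf)>=
   (64+64+1)*(size_t)_dec->state.nfrags*sizeof(_dec->dct_tokens[0]));
  _dec->dct_tokens=(unsigned char *)&sambuf[0];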

