Hi, I've been debugging some seeking issues with my implementation of Ogg Vorbis and found something curious in how the buffer handling seems to work. Specifically, if I seek with page granularity and land on the first page, Ogg decides it needs to allocate a fresh batch of data buffers and references rather than take them from the pool. Whether the old ones are still available is a bit of a mystery to me (I find the Ogg code pretty confusing, honestly). This is a serious issue for me because I'm changing Ogg Vorbis to use static rather than dynamic allocation, and if someone repeatedly seeks to the beginning of the file we'll overflow those buffers. It should also be a concern for anyone running with dynamic allocation, since the heap disappears quickly: Ogg doesn't seem to try very hard to reclaim memory.
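In case it helps to reproduce, here's roughly the kind of loop that shows the problem for me. This is just a sketch, not my actual test code: the file name is a placeholder, and the include/ov_open details may differ depending on which tree you're building against (ivorbisfile.h here vs. <vorbis/vorbisfile.h> in desktop libvorbis).

    #include <stdio.h>
    #include "ivorbisfile.h"   /* or <vorbis/vorbisfile.h>, depending on the tree */

    /* sketch only: repeatedly page-seek back to the start of the stream,
       which is the pattern that seems to hit the fallback path quoted below */
    int main(void)
    {
      OggVorbis_File vf;
      FILE *f=fopen("test.ogg","rb");   /* placeholder file name */
      int i;

      if(!f || ov_open(f,&vf,NULL,0)<0)
      {
        fprintf(stderr,"couldn't open file as Vorbis\n");
        return 1;
      }

      for(i=0;i<100;i++)
      {
        /* PCM position 0 lands on the first page every time */
        if(ov_pcm_seek_page(&vf,0)!=0)
          fprintf(stderr,"seek %d failed\n",i);
        /* this is where I watch oy->bufferpool->outstanding in the
           debugger; it climbs every iteration instead of staying flat */
      }

      ov_clear(&vf);   /* also closes the FILE* opened above */
      return 0;
    }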
Anyways, the culprit appears to be the following code inside ov_pcm_seek_page():

    result=ogg_stream_packetpeek(vf,vf->os,&op);
    if(result==0)
    {
      /*
       * !!! the packet finishing this page originated on a
       * preceeding page. Keep fetching previous pages until we
       * get one with a granulepos or without the 'continued' flag
       * set. Then just use raw_seek for simplicity.
       */

      _seek_helper(vf,best);

      while(1)
      {
        result=_get_prev_page(vf,&og);
        if(result<0)
        {
          goto seek_error;
        }
        if(ogg_page_granulepos(&og)>-1 ||
           !ogg_page_continued(&og))
        {
          return ov_raw_seek(vf,result);
        }
        vf->offset=result;
      }
    }

More specifically, the problem is the call to ov_raw_seek, which eventually calls _get_next_page(), where a bunch of new buffers and references get created (if you watch oy->bufferpool->outstanding, it grows very quickly during this call). I don't think I ever hit this code path unless the seek lands on the first page. Anyways, my inelegant and ignorant solution is to just comment this part of the code out. I ran some seeking tests and it doesn't appear to be necessary (100 random seeks to all sorts of locations and I'm still bit-exact with the reference code), but I really don't understand the comment above, so I can't say whether removing it will cause serious problems.
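For what it's worth, this is roughly the shape of the check I ran: a fixed seed so both builds see the same seek targets, a short decode dumped to a file after each seek, then a diff of the dump from the stock library against the dump from my patched one. The file names, the seed value, and the 4-argument Tremor-style ov_read are assumptions for the sketch (stock libvorbis ov_read takes extra endian/word/sign arguments), so treat it as an outline rather than my exact harness.

    #include <stdio.h>
    #include <stdlib.h>
    #include "ivorbisfile.h"   /* or <vorbis/vorbisfile.h> */

    /* sketch: 100 random page seeks, decode a little after each one and
       dump the PCM; run once per library build and diff the dumps */
    int main(void)
    {
      OggVorbis_File vf;
      FILE *in=fopen("test.ogg","rb");        /* placeholder */
      FILE *dump=fopen("seek_dump.raw","wb"); /* placeholder */
      char pcm[4096];
      int i,bs=0;
      ogg_int64_t total;

      if(!in || !dump || ov_open(in,&vf,NULL,0)<0)
        return 1;
      total=ov_pcm_total(&vf,-1);

      srand(12345);   /* fixed seed: same seek targets on every run */
      for(i=0;i<100;i++)
      {
        ogg_int64_t target=(ogg_int64_t)(((double)rand()/((double)RAND_MAX+1.0))*total);
        long n;

        if(ov_pcm_seek_page(&vf,target)!=0)
          return 1;
        /* Tremor-style ov_read; stock libvorbis adds endian/word/sign args */
        n=ov_read(&vf,pcm,sizeof(pcm),&bs);
        if(n>0)
          fwrite(pcm,1,n,dump);
      }

      ov_clear(&vf);
      fclose(dump);
      return 0;
    }

It only checks a short window of audio after each seek, though, so it could easily miss something that goes wrong further into the stream.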
Does anyone out there have some additional information on this? Can I safely remove this code? If I lose some seeking accuracy, that's OK; if I'm going to make the algorithm crash in some really subtle way, that's definitely not OK.
Thanks!

Ethan