[Theora-dev] decoder init/clear
conrad at metadecks.org
Fri Aug 5 02:18:42 PDT 2005
Enabling the following call in tests/noop.c causes a crash:

    theora_decode_init (&th, &ti);
The cause is that theora_decode_init() expects a theora_info structure
which was previously initialized by passing actual bitstream data
through theora_decode_header(), as documented in theora.h:
/**
 * Initialize a theora_state handle for decoding.
 * \param th The theora_state handle to initialize.
 * \param c  A theora_info struct filled with the desired decoding parameters.
 *           This is of course usually obtained from a previous call to
 *           theora_decode_header().
 * \retval 0 Success
 */
int theora_decode_init(theora_state *th, theora_info *c);
Ok, so the test is buggy: it isn't providing the expected data.
Wait a minute. libtheora, and thus applications linking to it, shouldn't
crash if they are passed bad data -- or in this case, no data at all.
Perhaps the sequence of functions in noop_test_decode() is reasonable for
an application that sets up a decoder, loses its data connection then
tries to clean up.
So, I think a baseline test like this noop one is useful in order to
check that setup and teardown of the data structures works reliably. Once
we know that the setup works reliably, we can narrow down problems in the
decoding itself.
Thoughts? Do we fix the test or the library? ;-)
btw. The crash is in theora_decode_init(). Commenting out the call to
theora_info_init() adds a sequence of awesome calls like this to the
output of valgrind:
==28684== Warning: silly arg (-1645612232) to malloc()