[theora-dev] pre 1.0 API issues.
Timothy B. Terriberry
tterribe at email.unc.edu
Thu Nov 19 11:26:26 PST 2009
Romain Beauxis wrote:
> Since the transcoding example in the binding works well, I am suspecting some
> sort of issue in the encoding parameters...
Well, one thing I notice is that create_encoder initializes an encoder
of size video_x × video_y (rounded up to a multiple of 16, as required),
but RGB.create_yuv is passed width and height (not rounded up to a
multiple of 16), which is also what theora_yuv is initialized with. The
old pre-1.0 API required a complete buffer, padded out to a multiple of
16 by the caller. The 1.0.2 encoder validated the luma plane's size, and
1.1.1 validates all of the planes' sizes, so I don't think this is the
problem (assuming the input video in your example really was 320×240),
but it is _a_ problem.
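
For reference, with the 1.x th_* API the real picture size and the padded
frame size are carried separately in th_info, so only the frame dimensions
need the rounding. A rough sketch (the helper name and the zero picture
offset are my own choices, not anything in the binding):

#include <theora/theoraenc.h>

/* Sketch: describe a w x h picture to the 1.x encoder.  Only the *frame*
   dimensions must be multiples of 16; the *picture* dimensions keep the
   real size.  A zero picture offset is assumed here. */
static th_enc_ctx *create_encoder_1x(int w,int h,int fps_num,int fps_den){
  th_info ti;
  th_info_init(&ti);
  ti.frame_width =(w+15)&~15;    /* rounded up to a multiple of 16 */
  ti.frame_height=(h+15)&~15;
  ti.pic_width   =w;             /* the real picture size          */
  ti.pic_height  =h;
  ti.pic_x       =0;
  ti.pic_y       =0;
  ti.fps_numerator  =fps_num;
  ti.fps_denominator=fps_den;
  ti.pixel_fmt      =TH_PF_420;
  /* ti.quality/ti.target_bitrate are left at their defaults here; see
     the note about those fields further down. */
  return th_encode_alloc(&ti);   /* NULL if th_info is inconsistent */
}
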
As an aside, the new API guarantees it will never read outside of the
picture region, so you don't really need to pad the buffer, but you do
need to play pointer games as _if_ you had a padded buffer (unfortunate,
but it made it easier for existing code to be ported to the new API).
See examples/encoder_example.c in the libtheora source for details.
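Roughly, those pointer games come down to something like the following
sketch (assuming the 1.x th_* API, 4:2:0 data, a zero picture offset,
tightly packed rows, and caller-supplied plane pointers; the helper name
is mine):

#include <theora/theoraenc.h>

/* Sketch: submit a tightly packed, unpadded 4:2:0 picture.  The plane
   descriptors claim the full padded frame size, but because the 1.x
   encoder never reads outside the picture region, the padding never has
   to exist in memory. */
static int submit_unpadded_frame(th_enc_ctx *enc,const th_info *ti,
                                 unsigned char *y,
                                 unsigned char *cb,unsigned char *cr){
  th_ycbcr_buffer ycbcr;
  ycbcr[0].width  =ti->frame_width;       /* padded dimensions...          */
  ycbcr[0].height =ti->frame_height;
  ycbcr[0].stride =ti->pic_width;         /* ...but unpadded row stride    */
  ycbcr[0].data   =y;                     /* valid because pic_x==pic_y==0 */
  ycbcr[1].width  =ti->frame_width>>1;    /* 4:2:0 chroma planes           */
  ycbcr[1].height =ti->frame_height>>1;
  ycbcr[1].stride =(ti->pic_width+1)>>1;
  ycbcr[1].data   =cb;
  ycbcr[2]        =ycbcr[1];
  ycbcr[2].data   =cr;
  return th_encode_ycbcr_in(enc,ycbcr);   /* 0 on success */
}
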
I should also point out that the method used to end the stream in
ocaml_theora_encode_eos() actually inserts an extra frame into the
stream (which gets rendered as a duplicate of the previous frame). In
addition, the granule position assigned to it will be wrong if you
happened to hit the maximum keyframe interval, producing an invalid
stream (it may play straight through, but will certainly have its length
mis-reported by anything that tries to calculate it, and seeking near
the end of the stream may behave oddly). The _correct_ way to end a stream
after you've already encoded all of the packets is to add a page with no
packets, but with the e_o_s bit set on the page. It should use the same
granule position as the last packet in the stream. This uses a few more
bytes than simply setting the e_o_s bit on the last packet itself,
when it is generated.
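libogg has no call that produces a packet-less page directly, so one way
to do it is to build the 27-byte page header by hand and let
ogg_page_checksum_set() fill in the CRC. A sketch only (the helper and its
arguments are my own, and it assumes every earlier page has already been
flushed out of the ogg_stream_state):

#include <stdio.h>
#include <string.h>
#include <ogg/ogg.h>

/* Sketch: append an empty page with the e_o_s flag set, carrying the
   granule position of the last real packet in the stream. */
static void write_eos_page(FILE *out,const ogg_stream_state *os,
                           ogg_int64_t last_granulepos){
  unsigned char header[27];  /* smallest possible page: zero segments */
  ogg_page og;
  int i;
  memcpy(header,"OggS",4);
  header[4]=0;               /* stream structure version        */
  header[5]=0x04;            /* header type: e_o_s only         */
  for(i=0;i<8;i++)           /* granule position, little endian */
    header[6+i]=(unsigned char)(last_granulepos>>(8*i));
  for(i=0;i<4;i++)           /* bitstream serial number         */
    header[14+i]=(unsigned char)(os->serialno>>(8*i));
  for(i=0;i<4;i++)           /* next page sequence number       */
    header[18+i]=(unsigned char)(os->pageno>>(8*i));
  memset(header+22,0,4);     /* CRC, filled in below            */
  header[26]=0;              /* zero segments: no packets here  */
  og.header=header;
  og.header_len=27;
  og.body=NULL;
  og.body_len=0;
  ogg_page_checksum_set(&og);
  fwrite(og.header,1,og.header_len,out);
}
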
Now, I don't know OCaml, but the one line that does strike me as very
suspicious is:
http://savonet.rastageeks.org/browser/trunk/liquidsoap/src/ogg_formats/theora_encoder.ml?rev=7022#L48
quality = quality
The transcoder example instead has:
http://savonet.rastageeks.org/browser/trunk/ocaml-theora/examples/thtranscode.ml#L114
quality = !quality
which I assume means it takes the second quality from an outer scope. If
the former is really assigning the variable to itself, then it will be
the default (zero), and if you're not specifying a rate, you could very
well get the behavior you described from the 1.1.1 encoder. I don't
recall what the 1.0.2 encoder would do if you set both quality and
bitrate to zero, but I suspect that it would have continued to produce a
recognizable picture (albeit of fairly poor quality). The new encoder,
on the other hand, will drop frames and generally produce something
horrible, in an attempt to give you what you asked for.
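
For comparison, at the C level the choice comes down to which of two
th_info fields is non-zero before th_encode_alloc(); a minimal sketch
(the helper name is mine):

#include <theora/theoraenc.h>

/* Sketch: constant-quality (no rate target) setup.  quality ranges from
   0 (worst) to 63 (best); leaving both fields at zero is the degenerate
   case described above. */
static th_enc_ctx *alloc_vbr_encoder(th_info *ti,int quality){
  ti->target_bitrate=0;        /* no bitrate target        */
  ti->quality       =quality;  /* e.g. 48 for good quality */
  return th_encode_alloc(ti);
}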