[theora] Bitrate expectations for encoding or decoding
jonas at jonasechterhoff.com
Tue Jan 23 02:34:21 PST 2007
I'm currently working on implementing Theora video playback for the
Unity engine (http://www.unity3d.com). So far everything works fine,
but I have a small question:
Is there a reliable way to predict what bitrate a stream will have
when encoding or decoding?
Of course, there's the target_bitrate parameter in the Theora info
structure, but somehow I always get quite different bitrate results
for my encodings (typically about half the target). Is the quality
parameter ignored when target_bitrate is non-zero? I'd like to have
some idea of what I will get, so I can present the user with a file
size estimate before starting the encoding. What would your
suggestion be on how to do this?
Also, we support streaming Ogg files from web sources. I'd like to
know whether we have enough data to start playing the movie, given
the current download percentage and download rate. The problem is
that I have no idea how to predict the actual playing time of a
movie. Unless I overlooked something, there doesn't seem to be any
relevant information in the Ogg headers (as it's a streaming format).
Since I know the file size, I could estimate the total playing
duration if I knew the bitrate, but as stated above, the
target_bitrate field is not really meaningful (and it may be zero, if
quality is used).
So, my question is: is there any reliable way to predict what bitrate
a Theora stream will have before encoding or decoding?