[Theora-dev] Encoding parameters...

David Kuehling dvdkhlng
Sat Jul 3 09:09:14 PDT 2004

>>>>> "illiminable" == illiminable  <ogg at illiminable.com> writes:

> These are the defaults i'm using...

> mTheoraInfo.target_bitrate=400000;
> mTheoraInfo.quality=30;
> mTheoraInfo.dropframes_p=0;
> mTheoraInfo.quick_p=1;
> mTheoraInfo.keyframe_auto_p=1;
> mTheoraInfo.keyframe_frequency=64;
> mTheoraInfo.keyframe_frequency_force=64;
> mTheoraInfo.keyframe_data_target_bitrate=mTheoraInfo.target_bitrate*1.5;
> mTheoraInfo.keyframe_auto_threshold=80;
> mTheoraInfo.keyframe_mindistance=8;
> mTheoraInfo.noise_sensitivity=1;

There's another parameter that's also not used in encoder_example:
sharpness.  I played around with it a little (by hacking
encoder_example.c) and its effect seems to be reversed.  Valid values
are 0, 1 and 2.  If set to zero (the default), images are sharpest,
but there are artifacts (ringing?) around sharp edges, which is a
problem for encoding high-quality anime (with lots of fine, sharp
lines).

I set this to `2' now, and the result for "Chihiro" looks much better.
I get an artifact-free video with resolution 640x448 at quality `-v 5'.
After some tests, the average bitrate seems to be less than 550 kbps.

Note that I also set keyframe_frequency_force to 512.  BTW from a look
at `toplevel.c' it is somewhat unclear to me how `keyframe_frequency'
is interpreted when `keyframe_auto_p' is 1.  In that case
`keyframe_frequency_force' is used for keyframe generation, and
`keyframe_frequency' is only used for some kind of bitrate estimation.

Note that for a given quality, high values of `sharpness' also reduce
the bitrate considerably, so for the same target bitrate a higher
quality setting can be selected.

GnuPG public key: http://user.cs.tu-berlin.de/~dvdkhlng/dk.gpg
Fingerprint: B17A DC95 D293 657B 4205  D016 7DEF 5323 C174 7D40
