[CELT-dev] right settings for highest quality
gmaxwell at gmail.com
Fri Aug 20 04:04:53 PDT 2010
On Fri, Aug 20, 2010 at 6:59 AM, Jochen Kilian <jochen.kilian at gmail.com> wrote:
> I am trying to evaluate the quality of the CELT codec by using the 0.8.0
> testcelt tool to encode and decode the input.
> I want to test different bitrates and selected the parameters below for 64,
> 96, 128, 196, 256 kbps:
> ./celt-0.8.0/libcelt/testcelt.exe 44100 2 256 46 $1.sw $1-64kb.sw
> ./celt-0.8.0/libcelt/testcelt.exe 44100 2 192 46 $1.sw $1-96kb.sw
> ./celt-0.8.0/libcelt/testcelt.exe 44100 2 128 46 $1.sw $1-128kb.sw
> ./celt-0.8.0/libcelt/testcelt.exe 44100 2 96 46 $1.sw $1-196kb.sw
> ./celt-0.8.0/libcelt/testcelt.exe 44100 2 64 46 $1.sw $1-256kb.sw
> Can someone comment on the chosen parameters? I am not sure if these are
> the right settings for frame size and bytes per packet. Is there a better
> way to do this?
> Thank you very much in advance.
The celtenc/celtdec tools in the tools directory are a little more
convenient to use for this.
What you're changing in your commands above is the frame size (in
samples), i.e. the frame duration. This is probably not what you want.
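For reference (this arithmetic is not in the original mail): with testcelt's argument order, each frame of frame_size samples produces one packet of bytes_per_packet bytes, so the effective bitrate is bytes_per_packet * 8 * (sample_rate / frame_size). A quick sketch of what the quoted commands actually request, assuming those semantics:

```python
# Sketch of the effective bitrate implied by
# "testcelt <rate> <channels> <frame size> <bytes per packet> <in> <out>",
# assuming one packet is produced per frame of <frame size> samples.

def effective_bitrate(bytes_per_packet, sample_rate, frame_size):
    """Bits per second when every frame_size-sample frame yields one packet."""
    return bytes_per_packet * 8 * sample_rate / frame_size

# The quoted commands keep 46 bytes/packet fixed and vary the frame size:
for frame_size in (256, 192, 128, 96, 64):
    kbps = effective_bitrate(46, 44100, frame_size) / 1000
    print(f"frame size {frame_size}: ~{kbps:.1f} kbps")
```

So shrinking the frame does raise the bitrate (roughly 63 to 254 kbps across the quoted commands), but it does so by changing the frame duration, which also changes latency and coding efficiency.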
If you don't care about latency, set the frame-size to 960 and use
bytes per packet to control your bit-rate. Set bytes per packet to
bits_per_second/(44100/960*8). ... Or just use celtenc which will do
this calculation for you.
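The bytes-per-packet formula above can be sketched as follows, assuming 44100 Hz and 960-sample frames as in the mail:

```python
# Sketch: packet size for a target bitrate, per the formula
# bytes_per_packet = bits_per_second / (sample_rate / frame_size * 8).

def bytes_per_packet(bits_per_second, sample_rate=44100, frame_size=960):
    """Packet size in bytes for a target bitrate."""
    frames_per_second = sample_rate / frame_size  # ~45.94 frames/s here
    return int(bits_per_second / (frames_per_second * 8))

for kbps in (64, 96, 128, 192, 256):
    print(f"{kbps} kbps -> {bytes_per_packet(kbps * 1000)} bytes/packet")
```

For example, a target of 128 kbps works out to about 348 bytes per packet, i.e. something like: ./testcelt 44100 2 960 348 in.sw out.sw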