[vorbis-dev] Monty on holiday
Willmore, David (VS Central)
WILD4 at aerial1.com
Tue Jun 6 13:18:50 PDT 2000
> On Tue, 6 Jun 2000, Willmore, David (VS Central) wrote:
>
> > Okay, before I say anything too stupid. Could someone point me to a
> > reference for how the current VQ scheme works. (*mumble* or maybe what VQ
> > even means.....)
>
Gregory Maxwell replied:
> Quantization (at least in this context) would be the process of
> representing information more coarsely than the original. I.e. 1.435 could
> be quantized to 1. In this process, information is typically lost.
>
Quantization for the sake of compression, then. Since the values are
already quantized to some extent by their very nature of existing in a
computer, you want to further quantize them. Okay.
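(To make that concrete to myself, a tiny sketch of what "coarser" means;
`step` is just a name I made up, nothing from the Vorbis source:)

#include <math.h>

/* Coarsen a value to the nearest multiple of `step`; with step = 1.0,
   1.435 quantizes to index 1 and only that index needs to be stored. */
int quantize(float x, float step)     { return (int)floorf(x / step + 0.5f); }
float dequantize(int idx, float step) { return idx * step; /* lossy */ }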
> Vector quantization is similar, but instead of one value, we operate on
> a group of values.
>
> Quantization can be performed using a codebook. This is where you have a
> table of values, and you return an index to the closest match. I.e.
>
> If you have a dataset of
>
> 1.5 2.5
> 2.5 6.0
> 2.0 6.0
> 2.0 2.5
>
> and a codebook of
>
> 1.75 2.5
> 2.25 6.0
>
>
> You would then code the data
>
> 0
> 1
> 1
> 0
>
What is your 'closeness' function? Linear, log, closest-without-going-over?
At first I didn't understand your example: you take a 4x2 table and make it
4x1 *and* quantize it. So you have to find the row in the codebook that most
closely matches the row in the dataset (closeness as defined above)? Okay, I
follow.
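To check my own reading, here's roughly what I picture the encoder doing;
I'm assuming a plain sum-of-squares 'closeness', and the names are mine,
not from the source:

#include <float.h>

/* Return the index of the codebook row with the least squared error
   against the input vector.  dim = vector width, entries = book size. */
static int closest_entry(const float *vec, const float *book,
                         int entries, int dim)
{
    int   best  = 0;
    float bestd = FLT_MAX;
    for (int i = 0; i < entries; i++) {
        float d = 0.f;
        for (int j = 0; j < dim; j++) {
            float e = vec[j] - book[i * dim + j];
            d += e * e;
        }
        if (d < bestd) { bestd = d; best = i; }
    }
    return best;
}

Run against your 2-wide book {1.75 2.5, 2.25 6.0}, that does give 0 1 1 0
for the four dataset rows, so I think I have it.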
> In vorbis, we also have the ability to 'cascade books', i.e. take the
> error from one set and emit more correction words (either multiplicative
> or additive).
>
Good, this allows us to partition our data flexibly between A, B, C, etc.
... bits.
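Just so I'm sure I have the cascade straight, an additive two-stage pass
might look something like this (my own sketch, reusing closest_entry() from
above; the multiplicative case would scale instead of subtract):

/* Two-stage additive cascade: code the vector with book A, then code
   whatever error is left with correction book B. */
void cascade_encode(const float *vec, int dim,
                    const float *bookA, int entriesA,
                    const float *bookB, int entriesB,
                    int *idxA, int *idxB)
{
    float err[dim];                                   /* C99 VLA; fine for a sketch */
    *idxA = closest_entry(vec, bookA, entriesA, dim);
    for (int j = 0; j < dim; j++)
        err[j] = vec[j] - bookA[*idxA * dim + j];     /* stage-1 residual */
    *idxB = closest_entry(err, bookB, entriesB, dim); /* correction word */
}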
> The codebooks are packed at the beginning of the Ogg file, and could differ
> for different songs.
>
Okay, that can be coped with. Make the codebooks for each section of bits
encode with the same relative priority as the bits (plus a bonus for being
unique).
> The codeword length is variable, and is the result of a Huffman tree
> created by counting hits against a test set (i.e. if all codebook entries
> are equally likely, they will have equal codeword lengths).
>
Is the goal to minimize the global error? Why, then, don't we just histogram
the whole file (if we have it a priori), break it into equal-probability
bins, and quantize into those (wouldn't want nearest match, then)? Oww, my
brain just started hurting; how do you do a histogram on a vector? Owwww...
Maybe one of the volumetric methods used for RGB quantization would help.
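Actually, if 'counting hits' means what I think, the histogram is over
codebook indices rather than over the raw vectors, which would answer my own
question. Something like (my sketch again, not the real trainer):

/* Count how often each codebook entry wins over a training set; the
   Huffman tree -- and hence the variable codeword lengths -- is then
   built from these counts, so frequently hit entries get short codes. */
void count_hits(const float *train, int nvecs, int dim,
                const float *book, int entries, long *hits)
{
    for (int i = 0; i < entries; i++) hits[i] = 0;
    for (int n = 0; n < nvecs; n++)
        hits[closest_entry(train + n * dim, book, entries, dim)]++;
}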
> The codebook used to store the noise floor LSPs is trained to minimize
> global error while the residue books are trained with a much simpler
> scheme (because the other method did not preserve uncommon features).
>
Uncommon features?
> The LSP output looks like
>
> 0.012 0.234 0.543 0.7328 0.9243 1.0234 1.235 1.5234
> (Always increasing; It's a property of LSP)
>
> The current vector codebooks are four wide (I believe), while long-block
> LSPs are 32 values wide. The LSP is broken into subvectors for encoding,
> each the same length as the vector codebook input.
>
> The value of the last entry is subtracted from all the members of the next
> word. I.e. the above becomes
>
> 0.012 0.234 0.543 0.7328
> .1915 .2906 .5022 .7906
>
> This is what the codebook is trained against and encodes. This allows you
> to keep the codebook size small but still accurately represent the data.
>
Can you characterize the LSP vectors for me a bit more? I don't know much
about the properties they signify. Are they a curve, or why are they always
increasing?
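And just to confirm I'm reading the subtraction step right, here's how I'd
write it down (the names are mine):

/* Split an LSP vector into subvectors of width `dim`, subtracting the
   last value of the previous subvector from every member of the next,
   so 0.9243 1.0234 1.235 1.5234 becomes .1915 .2906 .5022 .7906. */
void lsp_split(const float *lsp, int n, int dim, float *out)
{
    float prev_last = 0.f;
    for (int i = 0; i < n; i += dim) {
        for (int j = 0; j < dim; j++)
            out[i + j] = lsp[i + j] - prev_last;
        prev_last = lsp[i + dim - 1];   /* last *original* value of this block */
    }
}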
> The residue codebook is a bit different. It uses a creative
> amplitude & entropy metric to segment the residue into 64-entry groups which
> are encoded by different books depending on their amplitude and entropy.
>
All of this is hardcoded into the decoder logic?
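Here's my guess at the shape of that segmentation, with a completely made-up
metric and thresholds, since the above doesn't spell them out:

#include <math.h>

/* Chop the residue into 64-entry groups, compute a rough amplitude and
   energy figure for each, and pick one of a few books.  The actual
   amplitude/entropy metric and thresholds in the encoder are guesses here. */
int classify_group(const float *res)
{
    float amp = 0.f, energy = 0.f;
    for (int j = 0; j < 64; j++) {
        float a = fabsf(res[j]);
        if (a > amp) amp = a;
        energy += res[j] * res[j];
    }
    return amp > 1.f ? 2 : (energy > 1.f ? 1 : 0);   /* index into a table of books */
}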
> Enough information? :P
>
(imitates plant from Little Shop of Horrors) Feed me, Seymour!
Thanks for the information. Is there a good document describing this, or
would that 'document' be the code itself? :)
Cheers,
David
--- >8 ----
List archives: http://www.xiph.org/archives/
Ogg project homepage: http://www.xiph.org/ogg/