[Flac] repost of correct response... flac -- exhaustive model search vs. -A <*>?
Linda A. Walsh
flac at tlinx.org
Wed Jun 13 15:16:38 PDT 2012
I just noticed that a bad copy of this went to the list (bad in that it
didn't include my comments to Brian... *oops*) -- rectified below.
Brian Willoughby wrote:
> On Jun 10, 2012, at 21:32, Linda A. Walsh wrote:
>
>> What does the exhaustive model search do?
>>
>> Does it try all of the functions listed under "-A" to find the 'best',
>> i.e. bartlett, bartlett_hann, blackman, blackman_harris_4term_92db,
>> connes, flattop, gauss(STDDEV), hamming, hann, kaiser_bessel,
>> nuttall, rectangle, triangle, tukey(P), welch?
>>
> A better question might be: "What do the -A options do?"
>
> All of those windowing functions are lossy, and are used for
> frequency domain transforms. I'm not sure how they would be used in
> a lossless encoder. Then again, I have not yet studied the entire
> mathematics behind FLAC.
>
-----------------
?!?! Really?
Lossy?... I can't see how that would fit into the FLAC format (logically).
[I mean, how can one use those and still call it a 'lossless' format?]
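(My current reading of the docs, which could be wrong: the -A windows only
shape the analysis flac uses to pick its predictor coefficients; the
residual is still computed against the untouched samples, so the decode
stays bit-exact. Easy enough to sanity-check with an A/B encode -- the
filenames here are just for illustration:

    flac -8 -A "tukey(0.5)" -o tukey.flac input.wav
    flac -8 -A hann -o hann.flac input.wav
    # sizes may differ, but both should decode to identical audio:
    ls -l tukey.flac hann.flac

Both files should also pass `flac -t`.)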
> As for your question, I've never used the exhaustive model search
> option, but I had originally assumed that it meant a search among the
> various compression level options. For example, -l -b -r -M and -m
> are all preset when using -0 through -8 and --fast and --best as if
> they were macros, but you can also manually set those options in
> different combinations. I initially thought that the exhaustive
> search went through the -l -b -r -M/-m options to find an adaptive
> "best" compression rather than a preset one.
>
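----
(A concrete illustration of Brian's point that the presets are macros over
the individual knobs -- the flag values and filenames below are only
illustrative, and I'm not claiming the two commands are exactly equivalent:

    flac -8 -o preset.flac input.wav
    flac -l 12 -b 4096 -m -r 6 -o manual.flac input.wav

Any of -l/-b/-m/-r can be mixed and matched by hand.)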
----
Well, anyone who hasn't tried the exhaustive model search should -- it's
darn fast. I can't imagine any of the higher-numbered options being much
faster. [Collections (an album/CD, or several of them) are VERY often
perfect for parallel encoding, making most albums encodeable in well under
a minute, if not under 10-15 seconds, on a 2-physical-CPU system (8-12
cores total); see the sketch just below.]
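(A sketch of what I mean by parallel encoding: flac encodes one file per
process, so GNU xargs -- assumed available -- can fan an album's tracks
out across the cores; adjust -P to your core count.)

    # one flac process per core, eight at a time:
    find . -name '*.wav' -print0 | xargs -0 -n1 -P8 flac -8 -e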
> However, now that you've made me think about this in more detail, I
> tend to assume that the exhaustive model search has more to do with
> the LPC (linear predictive coding). The key to lossless compression
> is to find a model that predicts each sample from the previous
> samples, and the better the model the smaller the file. An
> exhaustive model search must go through all possible LPC models
> rather than the quicker default list.
>
> Anyway, my apologies for responding without an actual answer, but a
> conversation might be slightly more interesting than a quick answer
> (I hope).
>
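----
(For the record, the LPC idea Brian describes, as I understand it: predict
each sample from the previous ones and store only the prediction error,
e.g. for an order-2 model

    predicted[n] = a1*x[n-1] + a2*x[n-2]
    residual[n]  = x[n] - predicted[n]

The decoder knows a1/a2 and reverses this exactly, so nothing is lost;
a better model just means smaller residuals and a smaller file.)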
---
In the absence of hard facts, theories always tend to come first...
It's more scientific-sounding than the ones with hordes of daemons
exhaustively trying every answer... ;-)
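(Of course, the daemons can also be put to work directly -- an easy check
of what -e actually buys, with a hypothetical input file:

    # preset -8 alone vs. preset -8 plus exhaustive model search
    time flac -8 -f -o best.flac input.wav
    time flac -8 -e -f -o best_e.flac input.wav
    ls -l best.flac best_e.flac

If the LPC theory above is right, -e should trade some encode time for a
slightly smaller file.)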