[Flac-dev] Is there any way to tell what level the flac was encoded at?
Pierre-Yves Thoulon
pierre-yves.thoulon at centraliens.net
Wed Mar 23 22:44:19 PDT 2011
I would also add that the "level" is only relevant to the reference encoder,
as it is an alias to quickly set a number of parameters. Other encoders
might make different choices.
Also, I haven't looked in detail at how the encoder works, but I think it
uses heuristics to decide which compression method is best for a given
set of samples, and I'm not certain it would produce exactly the same
results twice when encoding the same file, although the results would be
statistically similar...
--
Pierre-Yves Thoulon
On Wed, Mar 23, 2011 at 22:33, Brian Willoughby <brianw at sounds.wa.com> wrote:
> Hjalmar,
>
> I recall that many hardware decoders have a limit on the level that
> they can handle. At the very least, hardware encoders may only
> support a subset of the compression levels. As for the decoders, I
> cannot remember specifically whether they are all capable of decoding
> all compression levels (it certainly would be more convenient).
>
> My memory from discussions on this mailing list is that the file
> format does not have anywhere to store the compression level. The
> reason is that compression levels are just human conveniences, and
> the actual compression settings are far more complex than a single-digit
> number can convey.
>
> Look at the command-line options for flac and you'll see that each
> compression level is synonymous with a group of up to five separate
> parameters: maximum LPC order, blocksize, (adaptive) mid-side
> coding, Rice partition order, and an exhaustive model search. I
> would assume that the factor affecting your decoder MIPS is the
> LPC model order, more than anything else that may be contributing. But you
> might have some luck varying those individual parameters instead of
> the basic compression level to see if any of them have a direct
> effect on your decoding complexity. If one of them does have a
> direct effect, then it might be easier to correlate something in the
> file headers with your clock speed.
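>
> For illustration, here is a rough, untested sketch of how those
> individual parameters map onto the libFLAC stream encoder API
> (assuming you encode with libFLAC; the setter names are from
> stream_encoder.h, and the values below are just examples, not any
> particular compression level):
>
>   #include <stdbool.h>
>   #include <FLAC/stream_encoder.h>
>
>   /* Set the individual knobs instead of a compression level, so each
>      one can be varied on its own to see which drives decode cost. */
>   static void configure_encoder(FLAC__StreamEncoder *enc)
>   {
>       FLAC__stream_encoder_set_blocksize(enc, 4096);
>       FLAC__stream_encoder_set_max_lpc_order(enc, 8);
>       FLAC__stream_encoder_set_do_mid_side_stereo(enc, true);
>       FLAC__stream_encoder_set_loose_mid_side_stereo(enc, false);
>       FLAC__stream_encoder_set_max_residual_partition_order(enc, 4);
>       FLAC__stream_encoder_set_do_exhaustive_model_search(enc, false);
>   }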
>
> Perhaps you could just monitor your CPU usage and adapt the clock
> speed. Starting at the highest clock speed (in order to guarantee
> real-time decoding) you could measure percent CPU usage and drop the
> clock speed any time you're using less than half the available time.
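>
> A very rough sketch of that idea (C-flavoured pseudocode; the
> set_cpu_clock_hz() hook and the clock steps are hypothetical and
> entirely platform-specific):
>
>   #define MIN_CLOCK_HZ 40000000UL  /* lowest step allowed (example value) */
>
>   extern void set_cpu_clock_hz(unsigned long hz);  /* hypothetical platform hook */
>
>   /* Step the clock down whenever the decoder used less than half of
>      the available real time over the last measurement window. */
>   static void maybe_lower_clock(double busy_fraction, unsigned long *clock_hz)
>   {
>       if (busy_fraction < 0.5 && *clock_hz > MIN_CLOCK_HZ) {
>           *clock_hz /= 2;           /* coarse step, per the divider caveat below */
>           set_cpu_clock_hz(*clock_hz);
>       }
>   }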
>
> Another possibility is that your clock speed may depend almost
> entirely upon the sample rate, not the compression level. It's
> difficult for me to tell for certain from your cited examples, but
> while it seems clear that the sample rate has an effect on the clock
> speed needed, I'm not nearly so sure that different compression
> levels for the same sample rate would vary so widely. Perhaps you
> can just look in the FLAC headers for the sample rate, and then set
> your clock speed based on that, unless you're saying that a 44.1 kHz
> level 8 file takes more processing than a 96 kHz level 1 file, or
> that a 96 kHz level 8 file takes more processing than a 192 kHz level
> 1 file. You did imply that 192 kHz takes almost twice the CPU at
> level 8 versus level 1, but I would point out that sometimes you can
> only adjust your clock speed in large increments, so it may not
> actually be possible to drop your clock speed by less than half,
> especially not at the highest clock speed (with the way clock
> dividers work).
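>
> If you do go the header route, the level-0 metadata API in libFLAC can
> hand you the STREAMINFO block directly, provided your platform has a
> file system it can read from. An untested sketch using
> FLAC__metadata_get_streaminfo() from metadata.h:
>
>   #include <stdio.h>
>   #include <FLAC/metadata.h>
>
>   /* Read sample rate and bit depth from STREAMINFO so a clock speed
>      can be chosen before decoding starts. */
>   int report_streaminfo(const char *path)
>   {
>       FLAC__StreamMetadata streaminfo;
>
>       if (!FLAC__metadata_get_streaminfo(path, &streaminfo))
>           return -1;  /* not a FLAC file, or no STREAMINFO block */
>
>       printf("%u Hz, %u bits, %u channels, max blocksize %u\n",
>              streaminfo.data.stream_info.sample_rate,
>              streaminfo.data.stream_info.bits_per_sample,
>              streaminfo.data.stream_info.channels,
>              streaminfo.data.stream_info.max_blocksize);
>       return 0;
>   }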
>
> Brian Willoughby
> Sound Consulting
>
> P.S. What is your embedded platform environment?
>
>
> On Mar 23, 2011, at 10:10, hjalmar nilsson wrote:
> > I'm developing a flac decoder in an embedded environment. I have it
> > fully up and running but I am trying to optimise the performance
> > vs. the power it takes to decode the stream.
> >
> > Using the reference decoder code with a few optimisations for the
> > hw I'm on, I see quite a difference in MIPS depending on the
> > level.
> > For instance, on a 192 kHz/24-bit file it takes almost twice as many
> > MIPS to decode at level 8 vs. level 1.
> >
> > Unfortunately I can't tell our customer that they need to use a
> > certain compression level, BUT I do want to minimize the clock
> > frequency the processor runs at.
> > Since I have no problem reading a 44.1 kHz/16-bit file from a memory
> > stick and decoding it at 40 MHz, but I need almost 240 MHz for a
> > 192 kHz/24-bit file, I would like to dynamically change the frequency
> > depending on the file I'm about to decode.
> >
> > Now, since you managed to plough through all this, I would really
> > appreciate it if you could answer one simple question:
> > Is there any way I can read out, from the file, the compression
> > level that was used when it was encoded?