[vorbis-dev] Parallelism

Gregory Maxwell greg at linuxpower.cx
Fri Aug 18 18:04:09 PDT 2000



On Fri, 18 Aug 2000, Jeff Squyres wrote:

> I disagree (and we may well have to agree to disagree).

Agreed. Now, go ahead and write the code. 
 
> Most users are lazy.  I'd be willing to bet that the majority of users
> would rather have to do as little as possible to get the results faster
> and better.  More power! [Tim Allen grunts]  If a user can find a
> multi-threaded or otherwise parallel program that will encode their CDs
> faster, they'll use that rather than having to kludge up pmake and/or some
> scripts, or have multiple windows open on a screen.

I don't see the reason to expend this much effort (i.e. the
discussion) over something that less than 0.001% of the user base will be
able to use (how many mass vorbis encoders have an MPI cluster?)

> And what about the encode-a-single-CD scenario?  If you could encode the
> entire CD while spanning multiple CPUs with single program (no scripts, no
> pmake, no multiple windows, etc.), wouldn't you prefer that?  Most users
> are not developers -- they want a minimum of fuss, and most wouldn't even
> understand how to do any of that stuff anyway.

It sounds like it would be a nice feature for a flashy 'grip' like
program. Wait a sec, grip already does it (for normal CDs, with track
granularity). Again, if more is needed I'm sure someone will just code
it. Until then it's a waste to talk about it.
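
To show how little there is to it, here's a rough sketch (Python, purely
hypothetical; not grip's actual code, and 'oggenc' is just a placeholder
for whatever encoder binary a front end would drive) of track-granularity
parallelism: one encoder process per track, capped at the number of CPUs:

    import glob
    import os
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def encode(wav):
        # Each track is an independent job; nothing is shared between jobs.
        subprocess.run(["oggenc", wav], check=True)   # placeholder encoder CLI

    tracks = sorted(glob.glob("*.wav"))
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        for _ in pool.map(encode, tracks):
            pass   # iterating re-raises any failure from a worker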

> No one has even mentioned the PR value of all this.  Consider how fast a
> multi-threaded encoder will be if you run it on a 4 way SMP as compared to
> any serial MP3 encoder out there (yes, apples and oranges, but since when

I've heard that someone already has a custom ASIC that runs vorbis. If
that's the case, it would probably make much better press, and be
just as inapplicable to most users.

> has the press cared? ;-).  Or mixing another buzzword -- Beowulf -- with
> vorbis.  Vorbis has a lot to compete with in MP3, and a lot of that has to
> do with public perception (regardless of the facts).  If vorbis has
> *blazing fast* encoders (and who cares why or how they work), that's a big
> PR checkmark IMHO.

If you want to get into buzzwords and PR garbage, get Mark Taylor (of
the LAME project) to get some CPU time on ASCI White to crunch out the
production codebooks. (VORBIS, tuned by the fastest computer ever made)

> CDDB entries are not a limiting factor -- bad entries can even be fixed
> after the fact (write a 10 line perl script).  Indeed, the categorization

Wow. That's some awesome AI in 10 lines. Let's see: it would have to be
able to see the cover/liner notes of a disc long after ripping, OCR the
text, determine which information is useful (album, track, etc.), and
correctly attach it to the right tracks.

That would be quite an amazing piece of code. I guess you should pursue
the parallel vorbis; it should be trivial compared to that AI CDDB name
changer.

> of the resulting encoded files is orthogonal to the actual encoding
> process.  It can be as time consuming as the user wants (if you want
> perfection, it will take a lot of time, if you take just the raw CDDB
> output -- buggy as it may be -- it takes no time at all).  So saying that
> categorizing your encoded files won't be sped up with parallelism is
> meaningless.

Producing a finished product can be viewed as a single process. It's not
done if the tracks aren't named. If naming the tracks is the slowest step,
then it's the limiting factor for the whole process.

> Ripping can be *much* faster than encoding.  Logically speaking, you have
> a producer (or producers) and a consumer (or consumers).  The ripper(s)
> produce the input, and the encoder(s) consume the input.  Since one side
> is clearly faster than the other, I don't see why trying to speed up the
> other side is a Bad Thing.

I disagree. The fastest commercially available ripping drive is the
Plextor UltraPlex. On my desktop I can encode (into MP3; there is no
strong reason to expect that vorbis won't someday be comparable to lame
in speed) faster than I can rip on that drive.
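
To make the producer/consumer shape concrete, a rough sketch (Python,
hypothetical; 'cdparanoia' and 'oggenc' just stand in for whatever ripper
and encoder you'd actually drive): one ripper thread feeds a bounded
queue, and a second encoder only earns its keep if encoding really is
the slower side:

    import queue
    import subprocess
    import threading

    NUM_ENCODERS = 2                       # >1 helps only if encoding lags ripping
    track_queue = queue.Queue(maxsize=2)   # bounded: the ripper can't run far ahead

    def ripper(track_numbers):
        # Producer: the drive is a serial device, so rip one track at a time.
        for n in track_numbers:
            wav = "track%02d.wav" % n
            subprocess.run(["cdparanoia", str(n), wav], check=True)
            track_queue.put(wav)
        for _ in range(NUM_ENCODERS):
            track_queue.put(None)          # one shutdown sentinel per encoder

    def encoder():
        # Consumer: encode tracks as they arrive until a sentinel shows up.
        while True:
            wav = track_queue.get()
            if wav is None:
                break
            subprocess.run(["oggenc", wav], check=True)

    threads = [threading.Thread(target=ripper, args=(range(1, 13),))]
    threads += [threading.Thread(target=encoder) for _ in range(NUM_ENCODERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()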

 
> Sidenote: one of the reasons that parallel programming was created was for
> big number crunching.  And that's what encoding is.  Hence, encoding can
> benefit from parallelism.

The reason that jet engines became widely used in planes is that
internal combustion engines have detonation-limited power. Turbine engines
have no issues with knock.

Modern economy cars could operate much more efficiently if there were no
potential for knock at high compression ratios (more compression = more
efficiency).

Does it follow that an economy car should be equipped with a turbine engine?

> Yes, you can do "manual" parallelism, but a) I
> still maintain it's not as efficient, and b) it's clearly more
> user-intensive that way (more blinken-lights to monitor, etc.).

Actually, coarse-granularity encoding is *MORE* efficient, since there is
virtually no state to pass between nodes. It might not be as fast if you
are encoding a few tracks on many nodes, but it would always be more
efficient.
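
Toy numbers of my own, just to keep 'fast' and 'efficient' separate: say
four equal tracks on eight nodes, and suppose a hypothetical fine-grained
encoder loses 25% of each node to shuffling window/block state around:

    tracks, nodes, t = 4, 8, 60.0             # four equal 60-second tracks

    # Coarse grain: one whole track per node, zero communication.
    coarse_wall = t                           # 60 s wall clock; 4 nodes sit idle
    coarse_cpu = tracks * t                   # 240 s of CPU, all of it useful work

    # Fine grain: all 8 nodes busy, but 25% of each node's time is overhead.
    overhead = 0.25
    fine_wall = tracks * t / (nodes * (1 - overhead))   # 40 s wall clock
    fine_cpu = fine_wall * nodes                        # 320 s of CPU burned

    print(coarse_wall, fine_wall)   # fine grain finishes sooner here...
    print(coarse_cpu, fine_cpu)     # ...but coarse grain wastes no cycles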

It's not any more user-intensive if implemented quietly in a front end.

It would certainly be a lot less developer intensive.

Remember that Vorbis supports different block sizes. With all the windowing
concerns, a fast cluster version would be considerably harder to write and
maintain than you seem to believe.

> If you look at my previous posts (and one was a reply to you), I
> volunteered to look into this as well as contribute code (I was actually
> diving into vorbis code when your post arrived).  The whole point of an
> active development community is to discuss and talk about these things;
> why stifle creativity?

I don't intend to stifle it; I'm just pointing out my perspective.

--- >8 ----
List archives:  http://www.xiph.org/archives/
Ogg project homepage: http://www.xiph.org/ogg/


