[ogg-dev] [PATCH] Fix a couple of problems for compilers where int is 16-bits

Michael Crawford mdcrawford at gmail.com
Thu Feb 25 04:37:19 PST 2010


On Fri, Dec 18, 2009 at 3:20 AM, Jon Beniston <jon at beniston.com> wrote:
> The attached patch changes occurrences of serialno to use the type
> ogg_int32_t, rather than int, as int can be too small on targets where it is
> only 16 bits.
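
A minimal, self-contained sketch of the failure mode - the int16_t
cast below merely stands in for a genuine 16-bit int, and none of this
is from the attached patch:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        int32_t serialno = 0x12345678;  /* a full 32-bit serial number */
        /* What a 16-bit int would keep; the result of the narrowing
           cast is implementation-defined, but commonly the low 16 bits. */
        int16_t truncated = (int16_t)serialno;
        printf("original:  0x%08" PRIX32 "\n", serialno);
        printf("truncated: 0x%04X\n", (unsigned)(uint16_t)truncated);
        return 0;
    }

With ogg_int32_t the serial number keeps all 32 bits no matter how
wide the target's native int happens to be.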

In the early days of the Classic Macintosh, there was much wailing and
gnashing of teeth, because Apple's Macintosh Programmer's Workshop had
32-bit ints, while Lightspeed C - later THINK C - had 16-bit ints.

Both compilers built code for the exact same Mac OS API; that worked
only because the Mac System API did not use int anywhere.

Apple's argument was that the 68000 had 32-bit registers.  Think's
argument was that while the 68k did indeed have 32-bit registers, it
only had a 16-bit data bus, so two memory access cycles were required
to transfer an Apple int to or from main memory, but only one for a
Think int.

The early Mac was profoundly memory-constrained - it crammed a full
graphical GUI into 128 KB of RAM - and so was filled with all manner
of hacks that enabled more efficient use of what little memory was
available.  So Think added a further argument: a program that kept
many ints in memory needed only half as much space for Think ints as
for Apple ints.

To this day, I still meet old-time Mac programmers who sternly warn
one never, ever to use an int, not for any purpose whatsoever.
Instead they advise that every integral variable use a typedef with
an explicit number of bits.
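
In modern C that advice amounts to the fixed-width typedefs of
<stdint.h>, which is essentially what libogg's os_types.h provides
with ogg_int32_t and friends.  A small sketch (serialno_t is an
illustrative name, not anything from libogg):

    #include <stdint.h>
    #include <stdio.h>

    /* An explicit-width typedef in the old-timers' spirit. */
    typedef int32_t serialno_t;  /* exactly 32 bits on every target */

    int main(void) {
        serialno_t s = 0x12345678;  /* safe even where int is 16 bits */
        printf("sizeof(serialno_t) = %zu bytes\n", sizeof s);
        return 0;
    }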

Mike
-- 
Michael David Crawford
mdcrawford at gmail dot com

   GoingWare's Bag of Programming Tricks
      http://www.goingware.com/tips/
