
Re: questions about bits and bytes [NOISE]



At 5:20 AM 4/12/96, [email protected] wrote:
>In a message dated 96-04-11 20:26:44 EDT, [email protected] writes:
>
>>[I told myself I was going to stay out of this, but Jim Bell's dogmatic
>>stance irks me... ]  Here's a citation from "Portability of C Programs
>>and the Unix System" by S.C. Johnson and D.M. Ritchie (yes, that Ritchie)
>>in the Bell System Technical Journal volume 57, Number 6, July-August 1978.
>
>Citing sources from 1978 in the computing field is a little like using
>dictionaries from the 1800's to dictate modern English usage.  My desktop
...

I've been ignoring most of these quibbles about the definition of "byte"
and when it came about, etc., but the debate never seems to end.

I went to the Jargon File (aka The Hacker's Dictionary), where a nice
online version resides at http://beast.cc.emory.edu/Jargon30/JARGON.HTML


This is what I found:


byte


: /bi:t/ [techspeak] n. A unit of memory or data equal to the amount used
to represent one character; on modern architectures
this is usually 8 bits, but may be 9 on 36-bit machines. Some older
architectures used `byte' for quantities of 6 or 7 bits, and
the PDP-10 supported `bytes' that were actually bitfields of 1 to 36 bits!
These usages are now obsolete, and even 9-bit bytes
have become rare in the general trend toward power-of-2 word sizes.

Historical note: The term was coined by Werner Buchholz in 1956 during the
early design phase for the IBM Stretch computer;
originally it was described as 1 to 6 bits (typical I/O equipment of the
period used 6-bit chunks of information). The move to an
8-bit byte happened in late 1956, and this size was later adopted and
promulgated as a standard by the System/360. The word
was coined by mutating the word `bite' so it would not be accidentally
misspelled as bit. See also nybble.