
myths of software "standards" (long)

this is an essay on some ideas that have been swimming around in
my head about "software standards" and the various pervasive
myths associated with them, unleashed on this motley crowd
for your viewing pleasure or distaste.

there is a tremendous amount of angst and anguish spent on software
standards by designers. I'm going to try to point out some of the ways that
software standards are fundamentally different from hardware standards
and the implications this has for their use.

with a hardware standard, we are talking about "atom configurations".
if a given computer is manufactured in a given way, there are zillions of
standards that are implicit in the design. there is a standard in
the way that cards interface to the bus, in the ways that chips fit
in sockets, in screw sizes, component sizes (such as the power supply), etc.

obviously it is virtually impossible to refit a component that does
not adhere to some "physical configuration standard". if the power
supply does not have the right footprint, good luck filing down the
edges to the point that it fits. <g>

another point to make about physical components is that it is possible
to "own" them. in our system, we are even allowed ownership of the
abstract standards used to design their atomic configurations; we
call these things "patents". 

however, notice that software standards are sometimes thought of in
the same way as hardware standards. I want to make a point about how
fundamentally different they really are.

==

software standards ultimately govern the "configuration of bits". to
borrow Negroponte's lovely distinction, bits are far different from
atoms. foremost among the differences is what might be called
the fundamental "malleability" or "fluidity" of bits in contrast to
atoms. atoms are expensive to move around and to manipulate. bits
can be moved around and manipulated at such an infinitesimal cost
as to be almost free.

here is the chief myth that I want to address. I'm going to
borrow web concepts especially, because this is the area where the
public most often misapplies the concepts of hardware and software
standards.

now suppose that Netscape comes up with their own unique HTML
"extensions"-- which in fact they have done. if you want to think of this
"standard" in the
old paradigm of atoms that are expensive to move around etc., then
what netscape has done is pretty outrageous. one might assert, as
the press and public tend to do, that Netscape is trying to "own" 
the standard and "impose" it on the rest of the world, so that they
can "control" it.

but what is this "imposition"? if they were saying that particular
hardware components had to be designed in their way, indeed this would
be an onerous and suspicious demand. it would be reminiscent of 
industry outrages like IBM's "microchannel" architecture.

but the fundamental distinction here is that Netscape is *not* designing
a standard that refers to atoms, but one that refers to bits. and
standards that refer to bits have fundamentally different properties.
first of all, if they are good standards, then it should be easy
to manipulate the bits between the different standards, and the cost
of doing so should be close to negligible. we are typically talking
about straightforward algorithms easily implemented even by 1st-year
CS students.

when you think about it, the concept that a company can "own" a 
software standard in the way that hardware configurations are "owned"
is pretty obtuse and incongruous. because bits are so readily converted
and manipulated, it actually becomes the case that companies that
create bit standards are almost doing a public service in devising
orderly systems of bit arrangements not previously established. if
bits are interchangeable, then the key is to get them into an orderly
form first, and then just twiddle them into the format that you want.
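to make the "twiddling" concrete, here is a minimal sketch (my own
illustration, not from the essay) of converting between two rival "bit
configuration" standards for the very same data: 32-bit integers stored
big-endian versus little-endian. the conversion is exactly the kind of
trivial, nearly-free bit manipulation described above:

```python
import struct

# two "bit configuration" standards for the same 32-bit integers:
# big-endian byte order vs. little-endian byte order.
def big_to_little(data: bytes) -> bytes:
    """convert a buffer of big-endian 32-bit ints to little-endian."""
    count = len(data) // 4
    values = struct.unpack(">%dI" % count, data)  # read as big-endian
    return struct.pack("<%dI" % count, *values)   # write as little-endian

be = struct.pack(">2I", 1, 258)      # the "atoms" are identical bytes...
le = big_to_little(be)               # ...only their arrangement changes
```

the point is that the two "standards" encode identical information, and
moving between them costs almost nothing.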

==

all this sounds a bit vague and nebulous 
but is extremely significant. it demonstrates
how radically different the information revolution is from the
industrial revolution.  in the information realm, "interchangeable
parts" takes on a whole new meaning. all that is necessary is that
the bits be in some standard form to start, and then they can easily be
transformed into some other form. 

a very important point to make is this: what becomes valuable with bits
is *not* that everyone pick and agree on *the*same*standard*. that is
applying "atom"-type prejudice to a new problem. if everyone wants to
have compatible hardware, then indeed we need to have the kinds of standards I
described. but software (bit) standards work differently. you only want
to have *any* kind of a standard that is well designed. you want standards
that are not necessarily *universal* as with atoms, but instead are *orderly*.
if they are *orderly* or "well designed", then it should be easy to convert
any "bit configuration" standard to any other standard on the fly with
algorithms.
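as a toy illustration of "orderly" standards being convertible on the
fly (the two record formats here are made up for the example), notice
how little code the conversion takes when both formats are well
designed:

```python
# two hypothetical "orderly" record formats for the same data:
#   format A: one "key=value" per line
#   format B: one "key: value" per line
# because both are orderly, the converter is a few lines of code.

def a_to_b(text: str) -> str:
    out = []
    for line in text.strip().splitlines():
        key, _, value = line.partition("=")
        out.append("%s: %s" % (key.strip(), value.strip()))
    return "\n".join(out)

record = "title=myths of standards\nauthor=anon"
converted = a_to_b(record)   # same information, different standard
```

neither format needs to be *universal*; each only needs to be regular
enough that the translation is mechanical.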

there is tremendous ranting and raving in the Web world about how
the HTML standard is fragmenting because of Netscape etc., and there
is so much angst about trying to devise a *single* cohesive, unified
standard that "everyone" follows. people talk as if Netscape is
trying to "hijack" the standard, when in my opinion they are performing
a valuable public service of trying to hammer the bits into useful
form. none of what they have proposed could be handled by the
earlier standards-- and if it could have been, chances are they would
have used that standard.

a unified standard in software realms is a total fantasy,
and in my opinion a dramatically wrong & specious goal.

==

instead, taking into account the above ideas, what we need are a
*variety* of different standards, all of them in themselves cohesive
and fully functional, which can be *translated* readily between each
other. the key goal is not *unified* standards that try to entail
"everything", but instead collections of complementary standards
that are in themselves nice unified "pieces" of the whole. (somewhat
like original Unix design philosophy).

the various document and image formats such as DVI, postscript, TIFF,
etc. are an example of this. they are all decent standards for what they
attempt to standardize, and it is silly to lament that there isn't
a single image standard-- it misses the point.
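in that spirit, here is a sketch of converting between two simple,
self-contained image standards-- ASCII PPM (color) to ASCII PGM
(grayscale). these stand in for the TIFF/postscript conversions above;
the averaging rule for gray is a deliberate simplification:

```python
# convert an ASCII PPM (P3, color) image into an ASCII PGM (P2,
# grayscale) image. both formats are orderly, so the converter is tiny.

def ppm_to_pgm(ppm: str) -> str:
    tokens = ppm.split()
    assert tokens[0] == "P3", "expected ASCII PPM input"
    width, height, maxval = map(int, tokens[1:4])
    pixels = list(map(int, tokens[4:]))
    grays = []
    for i in range(0, len(pixels), 3):
        r, g, b = pixels[i:i + 3]
        grays.append((r + g + b) // 3)   # crude average for luminance
    header = "P2\n%d %d\n%d\n" % (width, height, maxval)
    return header + " ".join(map(str, grays))

tiny = "P3 1 1 255  255 0 0"   # a one-pixel red image
gray = ppm_to_pgm(tiny)
```

each format is complete in itself; the glue between them is a page of
code, not a standards committee.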

(one tricky thing with bit standards is the goal of trying to
go "backward" in converting a very complex format into a simpler
format. trying to have text-based web browsers with all the complex
images and formatting out there is an example of this.)
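the "backward" case can be sketched too: stripping a complex format
down to a simpler one, the way a text-based browser must. this toy
extractor (my own illustration) keeps only the text and falls back to
an image's alt text, deliberately losing what the simpler format cannot
express:

```python
from html.parser import HTMLParser

# going "backward": reduce HTML to plain text. tags that carry no
# text are dropped; images degrade to their alt text if present.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == "img":                      # lossy: keep only alt text
            alt = dict(attrs).get("alt")
            if alt:
                self.parts.append("[%s]" % alt)

p = TextExtractor()
p.feed('<b>hello</b> <img src="x.gif" alt="a cat"> world')
text = "".join(p.parts)
```

the conversion is easy in the mechanical sense; the tricky part is
deciding what to throw away.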

in other words, some people seem to imagine that in the future some
massive HTML language is going to be devised that all browsers support.
many web design discussions seem to implicitly talk as if this is
the goal.

instead what I imagine is that many different substandards will 
be devised, and will *continue* to be devised-- the point when there
is a global, unified "web formatting language" will *never* come; this
is an illusory, impossible, and *unnecessary* goal. what
we need are browsers that are extremely flexible and can support
on-the-fly translation between different formats, and which try to 
support the capability that at any time in the future, someone may come up
with a new language that could drive browser formatting and display
characteristics. the idea of having different layers over the network, 
such as "conversion servers" which might convert between all the 
more common formats and requests, is another interesting idea to pursue.
netscape 2.0 "plugins" are a first step in this direction.
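the "conversion server" idea can be sketched as a registry of pairwise
converters dispatched by (source, target) format name. everything here--
the format names and the converter itself-- is a hypothetical
placeholder, just to show the shape of the dispatch:

```python
# a registry mapping (source, target) format pairs to converter
# functions, in the spirit of a "conversion server" or plugin system.
converters = {}

def register(src, dst):
    """decorator: file a converter under its (src, dst) pair."""
    def wrap(fn):
        converters[(src, dst)] = fn
        return fn
    return wrap

@register("starred-text", "shouted-text")    # made-up format names
def star_to_shout(text):
    return text.replace("*", "").upper()

def convert(text, src, dst):
    try:
        return converters[(src, dst)](text)
    except KeyError:
        raise ValueError("no converter from %s to %s" % (src, dst))

result = convert("*hello*", "starred-text", "shouted-text")
```

new formats slot in by registering new converters-- no existing format
has to change, which is exactly the plugin property described above.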

imho, the web of the future is going to have not one but a *zillion* different
languages describing all of the data that is out there. the goal should
not be unification under a single standard, but of ease of conversion
between existing standards that are modular and complete in themselves.
in this view, the complementarity (not competition) of different formatting
languages is glorious and to be encouraged, not something to be dreaded,
avoided, and stamped out. the diversity and "complementarity" 
is the key to the power.

==

I've been making all my points relative to the Web, but I think the
ideas apply equally well to *computer*languages*. there are all kinds
of silly holy wars fought over what the *best* computer
languages are, and everyone who designs a new language seems to be
implicitly trying to incorporate the features of every other language
in existence and then some, i.e. a new "unified" or "complete" or
"ultimate" language (I recall a long flamewar out in the newsgroups
between Stallman, espousing Lisp, and the Perl fanatics in Wall's camp).
to me this is all ridiculous, because in the
future the goal will be the ability to *convert* between languages
in automated ways, such that the same problem can be automatically
reformulated in another form to gain its particular idiosyncrasies.
imho, new computer languages are going to be invented as
long as human beings exist-- because what they really are is a
"component library".

for example, C is very low level but fast-- why can't I just convert
my Perl code directly into C whenever I want to? or vice versa?
in fact that is exactly what a compiler does, and I am suggesting
that the compilers of the future will allow conversions between
all kinds of languages, not merely a high level language to machine
code. in this sense the idea of fighting over different languages
as "ultimate" is as ridiculous as the religious wars over who is the
"one true god"!! all algorithms are in principle interchangeable, and
I believe this theoretical concept will be increasingly applied
directly in the future.
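a toy instance of converting between languages-- appropriately enough,
from an infix Perl/C-style arithmetic expression into Lisp-style prefix
notation-- goes through a common intermediate form (a syntax tree), the
same trick real compilers use. this is my own illustrative sketch,
handling only a tiny arithmetic subset:

```python
import ast

# translate infix arithmetic (parsed with Python's ast module as a
# stand-in front end) into Lisp-style prefix notation.
OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def to_lisp(node):
    if isinstance(node, ast.Expression):
        return to_lisp(node.body)
    if isinstance(node, ast.BinOp):
        return "(%s %s %s)" % (OPS[type(node.op)],
                               to_lisp(node.left), to_lisp(node.right))
    if isinstance(node, ast.Constant):
        return str(node.value)
    if isinstance(node, ast.Name):
        return node.id
    raise ValueError("unsupported node: %r" % node)

lisp = to_lisp(ast.parse("a * (b + 3)", mode="eval"))
```

the source and target notations differ, but the tree in the middle is
the shared "orderly form" that makes the translation mechanical.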

==

anyway, this is my contribution-of-the-moment in trying to dispel
some of the "standards myths" that are extremely persistent out
there esp. in regard to Web software and language extensions.