
java: vending machine software (long)





Java seems to be catching on in a big way (only a few months ago,
the hook-line-and-sinker interest by MS would not have been conceivable)
and seems to be leading to some radical new programming and
cyberspace paradigms. I thought I would try to anticipate some
of these future developments. what does it mean that "the network
is the computer"?

there are three very important aspects of Java in my view that make
it much more than your everyday programming language, and all of
these could be classed as "revolutionary" if they stay airtight
and hold up to the rigors of worldwide keyboard banging. not all of these are 
recognized for their importance or potential future significance
at the moment. 

1. it can be easily translated across platforms. this translation
is far more robust and reliable than, say, the supposed portability
of "C", which is in fact not very portable and whose "portability" seems
largely an illusion at times. complicated makefiles with a zillion
special cases are far from seamless. Java has reached the level
of almost "foolproof portability". of course there are some minor
sacrifices (such as, possibly, efficiency). however Java has always
been designed with high efficiency in mind.
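to make the contrast with "C" concrete: the Java language spec pins down the size and behavior of every primitive type, so the same arithmetic gives the same answer on every machine. a minimal sketch (the class name is my own):

```java
// a minimal demonstration that Java arithmetic is fixed by the
// language spec: int is always 32-bit two's complement, so overflow
// wraps the same way on every platform.
public class Portability {
    public static void main(String[] args) {
        int wrapped = Integer.MAX_VALUE + 1;              // overflows identically everywhere
        System.out.println(wrapped == Integer.MIN_VALUE); // true
        System.out.println(Integer.SIZE);                 // 32, on every platform
    }
}
```

in C, by contrast, `sizeof(int)` is whatever the compiler chooses, so the very same expression can behave differently from machine to machine.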

2. the security aspects are extremely significant. the design goal
of course is that the end user should never have to worry about whatever
software he downloads-- he can run it all knowing it is theoretically
impossible for it to screw up his system. all other schemes, such
as a "software certification authority" which e.g. MS has been
pursuing, seem inferior to some degree. 
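the sandbox idea can be sketched in miniature: before downloaded code touches anything, the runtime checks the request against a policy. the real enforcement happens inside the JVM itself; this standalone allowlist check (all names invented for illustration) only shows the principle:

```java
import java.nio.file.Path;

// hypothetical sketch of the sandbox principle: every file access a
// downloaded applet requests is checked against a policy before it is
// allowed. the real Java runtime enforces this inside the VM; this
// standalone check merely illustrates the idea.
public class Sandbox {
    private final Path allowedRoot;

    public Sandbox(Path allowedRoot) {
        this.allowedRoot = allowedRoot.toAbsolutePath().normalize();
    }

    // deny any path that escapes the allowed directory, including
    // attempts that use ".." to climb out of it
    public boolean mayRead(Path requested) {
        Path resolved = allowedRoot.resolve(requested).normalize();
        return resolved.startsWith(allowedRoot);
    }

    public static void main(String[] args) {
        Sandbox box = new Sandbox(Path.of("/tmp/applet"));
        System.out.println(box.mayRead(Path.of("data.txt")));         // true
        System.out.println(box.mayRead(Path.of("../../etc/passwd"))); // false
    }
}
```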

3. the whole concept of an "applet" is again very critical to the
Java design. the name connotes something that is trivial to run, and
may be tiny. i.e. one might be running tiny little programs all day
instead of monster applications that we have right now. the philosophy
is of "lean and mean" vs. the massive dinosaur. one way to state it
would be to say that the applet is a program with "small granularity".

==

now, all the above elements are present in various languages and
platforms, but Java has focused on refining them to the utmost degree,
to the point that every nonprogrammer can use a Java application without
worrying about compilation or whatever. this goal of "increased sophistication
for the less sophisticated" has always been foremost in software
development, and alas has largely been lost in the minds of many 
designers. (look how many years it took MS to come up with the
concept of "plug and play" when it would seem like an obvious
feature from the very beginning of computing). the process of
complex program translation/porting
has been reduced to the minimum of clicking on a hypertext button.

 look how many countless
man-lifetimes have been wasted by beginners trying to hack through
their autoexec.bat, config.sys, and whatever other silly configuration
files the software requires them to tweak. look at the lifetimes wasted
on bad IRQ settings. what utter shamefulness, to think that all this
could have been averted by a few farsighted designers who took the
time to "do it right" the first time around and develop standards
that didn't require a burden on the human, but instead put it where
it belonged: on the designer!! those who argue, "yes, but as soon
as you learn this stuff it's not a big deal" are completely missing
the point.

the point that I am getting to is that I think we are entering a new
era of "seamlessness" everywhere in cyberspace, and java is going to
be a big part in helping that goal be achieved. we are going to see
our machines rarely ever ask us for ridiculously arcane or abstruse
information such as IRQ settings or IP addresses, or require
us to maintain them--things we should not have to care about as humans.
it's really amazing how often designers simply replace one set of 
problems with another set, and these people are increasingly going to
have to get their act together in the future as people simply don't
tolerate poorly designed software that requires them to do things
they shouldn't have to do.

==

Java reminds me in many ways of the Unix operating system. now I'm
not claiming a parallel in the "use" of the language, but the
design goals are somewhat similar. Unix was broken down into very
fine "pieces" of code that could be interchanged and plugged into
each other in the same way that subroutines or classes can be 
shared today. the obvious goal is to have the ability to string
pieces of code together, no matter how small or how large. this
goal has largely eluded software programmers. it seems that after
code gets to a certain size it becomes far less usable
as a module or subroutine by other code. it seems one has to
constantly invent new languages to string all this code together
(machine language, C, then shell scripts, then an operating system
on top of that, then a network).

 I believe we are moving toward an environment in which code
becomes incredibly interchangeable. the entire cyberspace will
be seen in the same way that we see our own local computer system
today. cyberspace will be thought of as an enormous code library
that one can "link to" in any way one likes with one's own code.
the distinctions between operating system, computer languages,
shell scripts etc. are all going to blur into one massive, 
unified algorithmic structure.

"object oriented programming" is slowly moving in this direction.
it does seem that the basic unit of software is not a
"subroutine" or an "application" but an "object" and that this
paradigm can be utilized in almost any situation, at any level.

Unix has what I describe as "fine granularity" and was designed
purposely and almost fanatically to have this feature. in short it
is the philosophy that no code is an island and no matter what 
hierarchy of code one is talking about, from subroutines all the
way up to complex applications, standards should exist that let
it all "interoperate" in automated fashions. 
 a shell script is merely a way of treating
large pieces of code as subroutines. (one can get their exit
status, pass parameters on a "command line" etc.)
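the same "program as subroutine" move can be made from Java itself: launch a whole program, pass parameters on its command line, and read its exit status back as the return value. a sketch, calling the one external program guaranteed to be present (the JVM running this very code):

```java
import java.io.File;

// sketch: invoke a whole program as if it were a subroutine. the
// "parameters" go in on the command line; the "return value" comes
// back as the exit status, exactly as a shell script would see it.
public class CallProgram {
    static int callAsSubroutine(String... commandLine) throws Exception {
        Process p = new ProcessBuilder(commandLine).start();
        return p.waitFor();               // the exit status is the result
    }

    public static void main(String[] args) throws Exception {
        // call the JVM itself, since it is guaranteed to be installed here
        String javaExe = System.getProperty("java.home")
                + File.separator + "bin" + File.separator + "java";
        int status = callAsSubroutine(javaExe, "-version");
        System.out.println("exit status: " + status); // 0 on success
    }
}
```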

Unix succeeded in bringing the "interchangeable code" concept up
higher in the hierarchy to shell scripts and OS utilities etc.--
but it had to keep inventing new languages at every level to
do so.

to say that Excel "interoperates" with some other software seems
deceptive, if one is using the term in the same way it was
used in Unix. the user has to click around in menus to accomplish
what they want; whereas the situation of making the software so
that it can be called as a subroutine from code requires an entirely 
different design. when Excel began allowing all its features to be
called from Visual Basic functions, such that a series of mouse
clicks and operations implied a particular Visual Basic program,
that was a move in the direction I am referring to.
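a hedged sketch of that design direction (all names here are invented): build the application so that every user-interface action is also an ordinary method call, and a recorded series of clicks is then literally a program:

```java
import java.util.HashMap;
import java.util.Map;

// hypothetical sketch: an application designed so that every UI action
// is also an ordinary method call. a "macro" is then just the same
// sequence of actions written down as code.
public class Spreadsheet {
    private final Map<String, Double> cells = new HashMap<>();

    // the same method the user interface would invoke when a cell is edited
    public void setCell(String name, double value) { cells.put(name, value); }

    public double sum(String... names) {
        double total = 0;
        for (String n : names) total += cells.getOrDefault(n, 0.0);
        return total;
    }

    public static void main(String[] args) {
        Spreadsheet sheet = new Spreadsheet();
        // this "macro" is exactly a series of mouse clicks, as code:
        sheet.setCell("A1", 2.0);
        sheet.setCell("A2", 3.0);
        System.out.println(sheet.sum("A1", "A2")); // 5.0
    }
}
```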

today we see code as something hiding behind user interfaces. but 
increasingly in the future, we are going to be able to see the
code itself and view the user interface as simply a kind of
"grafting" on top of it. it is a sort of "handle" that lets one
access the code. there are other "handles" that can be created to
use the same code, such as a specialized language, subroutines that
name the code, or objects. generally objects are going to win out
in the future as the "thing" that describes all code. the object
paradigm does come very close to the goal of interchangeable software
parts. increasingly the objects that hide behind complex applications
are going to become "visible" to the end user who can combine them in
novel ways. the analogy I like to use is that the "hood of the car"
can be opened to let people tinker with the engine.

==

when one thinks about this, I think it becomes clear that we are going
to see many, many new standards for code communication in the future.
if we don't want all these java applets to be isolated, we are likely
to see many standards emerge that allow people to write applets
that "fit into" various places in a sort of "plug and play" 
approach. in Unix, the standards that were devised were shell scripts
and command line parameters. much effort went into trying to deal
with compatibility of command line arguments and that kind of thing.

these standards will tend to define things like "the standard methods
that [x] java widgets must support, and in what ways". I expect
to see a lot of these standards be developed and proliferate.
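such a standard might amount to little more than an interface spelling out the methods every conforming widget must support; a host that knows only the interface can then accept any applet that implements it. a hypothetical sketch:

```java
// hypothetical sketch of a widget standard: an interface naming the
// methods every conforming widget must support. a host written against
// the interface can "plug in" any implementation, sight unseen.
interface StandardWidget {
    String name();
    String render();          // produce the widget's display
}

public class WidgetHost {
    public static String plugIn(StandardWidget w) {
        return w.name() + ": " + w.render();
    }

    public static void main(String[] args) {
        // any applet author anywhere could supply this implementation
        StandardWidget clock = new StandardWidget() {
            public String name() { return "clock"; }
            public String render() { return "12:00"; }
        };
        System.out.println(plugIn(clock)); // clock: 12:00
    }
}
```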

in fact, musing on all this reminds me that it seems to me the
heart of computing involves creating standards and interfaces, 
and that very few computer languages address this aspect of
computation. I'm toying with the idea of inventing a computer language
that actually manages standards. (future posts on this will probably
go to coderpunks)

==

now a few words about something I talked about in the title, or 
"vending machine software". I imagine we are going to see a whole
new paradigm for software use in the future that is going to absolutely
baffle companies used to the old paradigm, and who built their
kingdoms on it, the most obvious being Microsoft.

the key concept is to combine cyberspace, digital cash, microcurrency,
applets, and interchangeability into a new complex holographic recipe. 
imagine in the future that massive single, "circumscribed"
applications such as Excel become more rare, and instead what
develops is an incredible variety and diversity of applets around
the world.

 I suspect that in the
future, people will use software something like the way vending 
machines work. you look around, pick the exact thing you want in the moment,
and pay a pretty small fee. you may come back later and pick something
else out. eventually I think cyberspace is going to look like one massive
application that one can click anywhere to do anything. anyone will
be able to put their own "code" into this massive vending machine.
sophisticated methods of organizing the hierarchies to aid finding
what you want quickly will evolve, just as Yahoo and all the other
search services are now proliferating.

the point is that you no longer "buy" an application-- you pay for
each individual use of pieces that exist all over cyberspace, i.e.
every time you "call" a subroutine, so to speak. the cost per
use is so low that you don't mind this, and in fact you probably save
a lot of money in the long run, because you only pay for what you use.
furthermore, the software is very specialized and you can get applications
that are very much tailored to what you want them to do-- they require
less and less configuration. entire companies will specialize in delivering
what you want very quickly if what they are selling is not exactly what
you want that moment.
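the billing side of this could be as simple as a wrapper that meters each call and accumulates a micro-charge, rather than charging for the whole application up front. a toy sketch (the fee, the unit of currency, and the "computation" are all stand-ins):

```java
// hypothetical sketch of pay-per-call software: a wrapper meters every
// invocation and accumulates a tiny charge in some microcurrency unit,
// instead of the user buying the whole application. all numbers are
// invented for illustration.
public class MeteredFunction {
    private final double feePerCall;   // in some microcurrency unit
    private double owed = 0.0;

    public MeteredFunction(double feePerCall) { this.feePerCall = feePerCall; }

    // "call the subroutine": do the work, record the micro-charge
    public int call(int x) {
        owed += feePerCall;
        return x * x;                  // stand-in for the real computation
    }

    public double owed() { return owed; }

    public static void main(String[] args) {
        MeteredFunction square = new MeteredFunction(0.001);
        square.call(3);
        square.call(4);
        System.out.println(square.owed()); // two calls' worth of micro-fees
    }
}
```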

people will in fact create massive applications that are strings of
subroutines of software written elsewhere all over the world. I think
that rapid network speeds will actually allow software to be written
that doesn't reside entirely on a local computer--software in which some
subroutine calls happen over a network!! the parameters and return
data are passed over the network, and the called code never runs on the
local computer.
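such a call can be sketched today with nothing but sockets: the parameter and the return value cross the network, and the code doing the work never runs in the caller. both ends run in one process below purely to keep the example self-contained; the server could be any reachable machine.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// sketch of a subroutine call whose parameter and return value travel
// over the network. the "remote" end runs in a second thread of this
// process only so the example is self-contained.
public class RemoteCall {
    // the subroutine that lives on the "remote" machine
    static int square(int x) { return x * x; }

    static int callOverNetwork(int param) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {   // any free port
            Thread remote = new Thread(() -> {
                try (Socket s = server.accept()) {
                    DataInputStream in = new DataInputStream(s.getInputStream());
                    DataOutputStream out = new DataOutputStream(s.getOutputStream());
                    out.writeInt(square(in.readInt()));     // read parameter, send result
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            remote.start();

            // the caller: write the parameter, read back the return value
            try (Socket s = new Socket("localhost", server.getLocalPort())) {
                DataOutputStream out = new DataOutputStream(s.getOutputStream());
                DataInputStream in = new DataInputStream(s.getInputStream());
                out.writeInt(param);
                int result = in.readInt();
                remote.join();
                return result;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(callOverNetwork(7)); // 49
    }
}
```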

notice the "boundaries" of such an application seem to shift 
dramatically. you cannot "circumscribe" such an application as easily;
it is not one "thing" that runs isolated on your own computer. it
is a sort of holographic element in a massive algorithmic universe
that calls on all kinds of other elements in the universe.

it may be possible to build in the same kinds of "resistance to
errors" in this kind of computation that we now have in TCP/IP
protocols. i.e. if a certain module fails, the system may automatically
call up other modules that work.
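a minimal sketch of that failover idea: keep a list of interchangeable modules and route around any that fail, returning the first answer that comes back (the "modules" below are plain suppliers, for illustration):

```java
import java.util.List;
import java.util.function.Supplier;

// sketch of module-level failover: try each interchangeable module in
// turn, and if one fails, automatically route around it and call the
// next -- the same spirit as TCP/IP routing around a dead link.
public class Failover {
    static <T> T firstWorking(List<Supplier<T>> modules) {
        RuntimeException last = null;
        for (Supplier<T> module : modules) {
            try {
                return module.get();      // this module answered
            } catch (RuntimeException e) {
                last = e;                 // this module failed; try the next
            }
        }
        throw last;                       // every module failed
    }

    public static void main(String[] args) {
        List<Supplier<String>> modules = List.of(
            () -> { throw new RuntimeException("module A is down"); },
            () -> "module B answered"
        );
        System.out.println(firstWorking(modules)); // module B answered
    }
}
```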

all of this requires a reliability and complexity we do not have
at the moment, but I see major hints of it in our 
current system, and I believe we are
rapidly evolving towards the above scenarios.

the above cannot really be realized so long as people insist on
selling their code as if it is a massive product that has to be purchased
and shipped somewhere in shrinkwrapped packaging. as we begin to move
away from the concept of, "you are buying the right to use this program
whenever you want for a lot of money" to "you are buying this particular
computation or use of the algorithm for a small fraction of the development 
cost", the above system will begin to proliferate and blossom.

==

what does this all mean to existing (software) 
companies? increasingly, the value
of a company of people, or some kind of structure, will be how well
it can coordinate people and resources to accomplish some particular
goal. but the company will increasingly have to compete with other
structures that may be more able to coordinate resources efficiently.
if an incredible groupware program evolved, for example, that let 
people coordinate themselves and others as "efficiently" or more
so than a company does today, companies in the modern sense would
tend to die out. a "company" becomes a virtual collection of people
and resources to accomplish a common goal, but the geographic
localization/focus characteristic of modern companies will be seen
as something of an anachronism.

a company that is drowning in inefficient bureaucracy will tend to
find that people will simply go elsewhere and find more efficient
methods of "plugging in" their value, because they are better paid
by some structure that does not waste their energies.

I am not saying above that bureaucracy is evil-- we are going to find
out what kinds of bureaucracy (or "coordination")
are really necessary in the future, and
what kinds are superfluous, burdensome, and inefficient, as people
increasingly move out of/away from structures running amok in the latter.