
Re: Bandwidth limitations



>Clark Reynard <[email protected]> said:
>Perry writes:
>
>[Elegant refutation of all examples I give in original article.]
>
>Perhaps some true bandwidth stretchers:
>
>Complete maps of all the known universe, with spectrographic assays,

It occurs to me that Perry will refute my attempted refutation of his
refutation by pointing out that even a factor of 30,000 in video
won't saturate theoretical fiber limits, and that he may consider
your examples too fanciful.

But part of what we're talking about is just timing; even Perry said so.
We cannot yet modulate fiber at its theoretical limits. To do that we'll
need frequency modulation of sub-bands at optical frequencies, and that
hasn't even been demonstrated in the lab yet. (FM-tunable dye lasers
can't be modulated at optical rates; semiconductor "variable frequency"
modulated lasers can only switch between discrete frequencies.)
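
Just to pin down what "theoretical limit" means here, a few lines of
Python for a back-of-envelope estimate. Both inputs are my own
assumptions (a ~25 THz usable low-loss window, a conservative 1 bit/s
per Hz), so treat this as a sketch, not gospel:

    # Rough sketch of the "theoretical limit". Both inputs are my
    # assumptions, not measured facts.
    usable_bandwidth_hz = 25e12   # ~25 THz usable low-loss window (assumption)
    spectral_efficiency = 1.0     # bits/s per Hz, conservative (assumption)

    capacity_bps = usable_bandwidth_hz * spectral_efficiency
    print("theoretical fiber capacity: %.0f Tbit/s" % (capacity_bps / 1e12))
    # -> theoretical fiber capacity: 25 Tbit/s

Plug in your own window width and spectral efficiency; the arithmetic
further down reuses this 25 Tbit/s figure.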

Some think it will take twenty years to achieve this, but I'm optimistic
and hoping for 5 to 10. (Commercial deployment is harder to call, but
let's assume it's fast and can use existing fibers, if not existing
trunk equipment.)

So if you're "realistic" about when we'll achieve fiber-saturating
modulation, you're looking far enough into the future that it gets
easier to see completely novel demands on information transmission
appearing by then, while existing demands keep straining existing
fiber technology in the meantime.

Conversely, if one is optimistic about achieving the theoretical limits
of fiber, then the fine points of the argument start to matter. That
factor of 30,000 for video won't be enough to fill the fiber. Receiving
every global TV and Internet (ultra high quality) video transmission
simultaneously (to record and allow later channel switching) might do
it, but I have to admit that it seems chancy.
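
Running the arithmetic, with my own guessed inputs (a ~5 Mbit/s base
video stream, and, for the record-everything case, say 20,000 channels
worldwide at ~1 Gbit/s each):

    # Back-of-envelope for the factor-of-30,000 claim. Every input
    # here is a guess of mine; swap in your preferred numbers.
    fiber_capacity_bps = 25e12   # from the sketch above (assumption)
    base_video_bps = 5e6         # one compressed video stream (assumption)

    scaled = base_video_bps * 30000
    print("30,000x video: %.0f Gbit/s (%.1f%% of one fiber)"
          % (scaled / 1e9, 100 * scaled / fiber_capacity_bps))
    # -> 150 Gbit/s, about 0.6% of one fiber

    # "Everything at once": guess 20,000 channels worldwide, each
    # at ultra high quality (~1 Gbit/s).
    total = 20000 * 1e9
    print("all channels: %.0f Tbit/s (%.0f%% of one fiber)"
          % (total / 1e12, 100 * total / fiber_capacity_bps))
    # -> 20 Tbit/s, about 80% of one fiber

So scaled-up video alone is well under 1% of one fiber, while the
record-everything scenario lands around 80% -- which is exactly why
it feels chancy.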

So it all comes down to the time frame in which the theoretical limits
are achieved.

Unless one gets speculative... for instance, nanotechnology
scan-transmit-and-rebuild could easily more than saturate even a large
number of fibers.
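
To see how badly, here's the same kind of arithmetic with loudly
hypothetical inputs (~7e27 atoms in a human body, ~100 bits to encode
each atom's species and position):

    # Scan-transmit-rebuild arithmetic, hypothetical inputs: ~7e27
    # atoms in a human body, ~100 bits per atom for species and
    # position (both assumptions).
    atoms = 7e27
    bits_per_atom = 100
    fiber_bps = 25e12            # one fiber at its theoretical limit

    seconds = atoms * bits_per_atom / fiber_bps
    print("one body over one fiber: %.1e years" % (seconds / 3.15e7))
    # -> ~8.9e+08 years

Even a million parallel fibers would still need centuries per body,
so "more than saturate" is an understatement.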

Or, slightly less blue-sky: suppose your computer is an array of 10,000
optical computers, each operating at 100 gigahertz, doing a distributed
computation with other systems over the net. (In that case the network
is *always* the bottleneck.)
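
The arithmetic here is striking even under very modest assumptions.
Suppose (hypothetically) each processor ships just one bit per clock
cycle to some other machine:

    # 10,000 optical processors at 100 GHz (numbers from above).
    # Assume, very modestly, one bit per clock cycle shipped to some
    # other machine on the net (my assumption).
    processors = 10000
    clock_hz = 100e9
    bits_per_cycle = 1

    demand_bps = processors * clock_hz * bits_per_cycle
    fibers_needed = demand_bps / 25e12
    print("aggregate demand: %.0f Tbit/s = %.0f saturated fibers"
          % (demand_bps / 1e12, fibers_needed))
    # -> 1000 Tbit/s, i.e. 40 fibers each at the theoretical limit

Forty fully saturated fibers for a single machine room, before anyone
watches any video at all.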

Anyway, the whole subject seems debatable: a matter of which numbers
one cares to predict for which future year. But we all agree that it's
merely a question of *when* fiber runs out of steam, not whether.
	Doug