
From: "gordon jacobson" gordon.jacobson@channel1.com Date: 16-Mar-93 00:46:12 Subj: into the fibersphere

Newsgroups: alt.dcom.telecom
Organization: Channel 1 Communications

   I have seen references to the following article in this and other newsgroups. I contacted the author and Forbes and, as the preface below indicates, obtained permission to post it on the Internet. Please note that the preface must be included when cross-posting this article to another newsgroup.

The following was received directly from George Gilder on Saturday March 6.


Date: Sat Mar 06, 1993 2:58 pm GMT
From: George Gilder / MCI ID: 409-1174

TO: Gordon Jacobson
Subject: PLEASE UPLOAD TO INTERNET

   The following article, INTO THE FIBERSPHERE, was first published in slightly different and shorter form in Forbes ASAP, December 7, 1992. It is a portion of my book, Telecosm, which will be published next year by Simon & Schuster, as a sequel to Microcosm, published in 1989, and Life After Television, published by Norton in 1992. Subsequent chapters of Telecosm will be serialized in Forbes ASAP, beginning with the March issue containing a theory of wireless communications.

                PLEASE POST "FIBERSPHERE" TO ANY USENET
                NEWSGROUPS THAT MAY BE DEEMED SUITABLE.

                     THE COMING OF THE FIBERSPHERE
               In a world of dumb terminals and telephones,
               networks had to be smart.  But in a world of
                smart terminals, networks have to be dumb.
                                  BY
                             GEORGE GILDER
   Philip Hope, divisional vice president for engineering systems of EDS, has an IQ problem. His chief client and owner, General Motors, wants to interconnect thousands of 3-D graphics and computer-aided engineering (CAE) workstations with mainframes and supercomputers at Headquarters, with automated assembly equipment at factories in Lordstown, Ohio, in Indiana, and in Detroit, with other powerful processors at their technical center in Warren, Michigan, with their Opel plant in Ruesselsheim, Germany, and with their design center outside San Diego. On behalf of another client, Hope wants to link multimedia stations for remote diagnostics, X-ray analysis, and pharmaceutical modeling in hospitals and universities across the country.

   Any function involving 3-D graphics, CAE, supercomputer visualization,
lossless diagnostic imaging, and advanced medical simulations demands large bandwidth or communications power. Graphics workstations often operate screens with a million picture elements (pixels), and use progressive scanning at 60 frames or images a second. Each pixel may entail 24 bits of color. That adds up fast to billions of bits (gigabits) a second. And that's for last year's technology in a computer industry that is doubling its powers and cost effectiveness every year.
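
   The arithmetic is easy to check. The short Python sketch below multiplies out only the figures quoted in the paragraph above (a million pixels, 24 bits of color per pixel, 60 frames a second); it is an illustration of the rough estimate, not a measurement of any particular workstation.

    # Rough bandwidth estimate for an uncompressed workstation display,
    # using only the figures quoted above.
    pixels_per_frame = 1_000_000   # about a million picture elements
    bits_per_pixel   = 24          # 24-bit color
    frames_per_sec   = 60          # progressive scanning at 60 frames a second

    bits_per_second = pixels_per_frame * bits_per_pixel * frames_per_sec
    print(f"Raw display stream: {bits_per_second / 1e9:.2f} gigabits per second")
    # -> about 1.44 Gbit/s for a single uncompressed screen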

   What Hope needs is bandwidth and connections. The leading bandwidth and connections people have always been the telephone companies. But when Hope goes to the telephone companies, they want to tell him about intelligence: their Advanced Intelligent Network, which will be coming on line over the next decade or so and will solve all his problems. For now, they have what they call DS-3 services available in many areas, operating T-3 lines at 45 megabits (million bits) a second. These facilities are ample for most computer uses, and, working together with several different Regional Bell Operating Companies (RBOCs), Hope should be able to acquire these services in time for a General Motors takeover by Toyota.

   Hope has been through this before. In the early 1980s, he actually wanted DS-3 services. Then he was interconnecting facilities in southeastern Michigan with plants in Indiana and Ohio. But Michigan Bell could not supply the lines in time. EDS had to build a network of microwave towers to bear the 45 megabit traffic. Later in the decade, the phone companies even offered him higher capacity fiber optic lines, but with the requirement that the optical bits be slowed down and run periodically through an electronic interface so the telco could count the number of "equivalent channels" being used.

   What Hope and others in the systems integration business need is not
intelligent networks tomorrow but dumb bandwidth that they can deliver to their customers flexibly, cheaply, and now. To prepare for future demand, they want the network to use fiber optics. It so happens that America's telephone companies have some two million miles of mostly unused fiber lines in the ground today, kept as redundant capacity for future needs. Hope would like to be able to tap into this "dark fiber" for his own customers.

   As a leader in the rapidly expanding field of computer services, EDS
epitomizes the needs of an information economy. With a backlog of 22 billion dollars in already contracted business, EDS is currently a seven billion dollar company growing revenues at an annual rate of 15 percent, some three times as fast as the phone companies. EDS will add a billion dollars or so in new sales in 1992 alone. If the company is to continue to supply leading edge services to its customers, it must command leading edge communications. To EDS, that means dumb and dark networks.

                         THE "DARK FIBER" CASE
   That need has driven EDS into an active role as an ex parte pleader in
Federal Case 911416, currently bogging down in the District of Columbia Federal Court of Appeals as the so-called "dark fiber" case. On the surface, the case, known as Southwestern Bell et al versus the Federal Communications Commission and the U.S. Justice Department, pits four Regional Bell telephone companies against the FCC. But the legal maneuvers actually reflect a rising conflict between the Bells and several large corporate clients over the future of communications.

   Beyond all the legal posturing, the question at issue is whether fiber
networks should be dumb and dark, and cheap, the way EDS and other customers like them. Or whether they should be bright and smart, and "strategically" priced, the way the telephone companies want them.

   On the side of intelligence and light are the phone companies: Southwestern Bell, U.S. West, Bell South, and Bell Atlantic. The forces of darkness include key officials at the FCC and such companies as Shell Oil, the information services arm of McDonnell Douglas, and long distance network provider WilTel, as well as EDS.

   For most of the four year course of the struggle, it has passed
unnoticed by the media. In summary, the issue may not seem portentous. The large corporate customers want dark fiber; the FCC mandates that it be supplied; the Bells want out of the business. But for all their obscurity, the proceedings raise what for the next twenty years will be the central issue in communications law and technology. The issue, if not the possible trial itself, will shape the future of both the computer and telephone industries during a period when they are merging to form the spearhead of a new information economy.

   "Dark fiber" is simply a glass fiber optic thread with nothing

attached to it, (ie. no light being sent through it). In this "unlit" condition, it is available for use without the intermediation of phone company electronics or intelligent services.

   In the mid-1980s, the Bells leased some of their dark fiber lines to
several large corporations on an individual case basis. These companies learned to love dark fiber. But when they tried to renew their leases with the Bells, the Bells clanged no! Why don't you leave the interconnections and protocols to us? Why don't you use our marvellous smart network with all the acronyms and intelligent services? Why don't you let us meter your use of the fiber and send you a convenient monthly bill for each packet of bits you send?

   EDS and the other firms rejected the offer; they preferred that dumb
fiber to the intelligent network. When the Bells persisted in an effort to deny new leases, the companies went to the FCC to require the Bells, as regulated "common carrier" telephone companies, to continue supplying dark fiber.

   In the fall of 1990, the FCC ruled that the phone companies would have to offer dark fiber to all comers under the rules of common carriage. Rather than accept this new burden, the phone companies petitioned to withdraw from the business entirely under what is called a Section 214 application. Since the FCC has not acted on this petition, the Bells are preparing to go to court to force the issue. Their corporate customers are ready to litigate as well.

   It is safe to say that none of the participants fully comprehend the
significance of their courthouse confrontation. To the Bells, after all is said and done, the key problem is probably the price. Under the existing tariff, they are required to offer this service to anyone who wants it for an average price of approximately $150 per strand of fiber per month. As an offering that competes with their T-3 45 megabit (millions of bits) a second lines and other forthcoming marvels, dark fiber threatens to gobble up their future as vendors of broadband communications to offices, even as cable TV preempts them as broadband providers to homes. Since the Bells' profits on data are growing some 10 times as fast as their profits on voice telephony, they see dark fiber as a menace to their most promising markets.

   The technological portents, however, are far more significant even
than the legal and business issues. The coming triumph of dark fiber will mean not only the end of the telephone industry as we know it but also the end of the telephone industry as they plan it: a vast intelligent fabric of sophisticated information services. It also could mean a thoroughgoing restructuring of a computer industry increasingly dedicated to supplying "smart networks." Indeed, for most of the world's communications companies, professors of communications theory, and designers of new computer networks, the triumph of dark and dumb means "back to the drawing board," if not back to the dark ages.

   But the new dark ages cannot be held back.
   Springing out of the depths of IBM's huge Watson Laboratories is a powerful new invention: the all optical network, which will soon relegate all bright and smart executives to the Troglodyte file and make dumb and dark the winning rule in communications.

                          THE WRINGER EFFECT
   From time to time, the structure of nations and economies goes through
a technological wringer. A new invention radically reduces the price of a key factor of production and precipitates an industrial revolution. Before long, every competitive business in the economy must wring out the residue of the old costs and customs from all its products and practices.

   The steam engine, for example, drastically reduced the price of physical force. Power once wrought at great expense from human and animal muscle pulsed cheaply and tirelessly from machines burning coal and oil. Throughout the world, dominance inexorably shifted to businesses and nations that reorganized themselves to exploit the suddenly cheap resource. Eventually every human industry and activity, from agriculture and sea transport to printing and war, had to centralize and capitalize itself to take advantage of the new technology.

   Putting the world through the technological wringer over the last three decades has been the integrated circuit, the IC. Invented by Jack Kilby of Texas Instruments in 1958 and by Robert Noyce, later a founder of Intel, at Fairchild Semiconductor in 1959, the IC put entire systems of tiny transistor switches, capacitors, resistors, diodes, and other once costly electronic devices on one tiny microchip. Made chiefly of silicon, aluminum, and oxygen, three of the most common substances on earth, the microchip eventually reduced the price of electronic circuitry by a factor of a million.

   As industry guru Andrew Rappaport has pointed out, electronic designers now treat transistors as virtually free. Indeed, on memory chips, they cost some 400 millionths of a cent. To waste time or battery power or radio frequencies may be culpable acts, but to waste transistors is the essence of thrift. Today you use millions of them to enhance your TV picture slightly, or to play a game of solitaire, or to fax Doonesbury to Grandma. If you do not use transistors in your cars, your offices, your telephone systems, your design centers, your factories, your farm gear, or your missiles, you go out of business. If you don't waste transistors, your cost structure will cripple you. Your product will be either too expensive, too slow, too late, or too low in quality.

   Endowing every information age engineer or PC hacker with the creative
potential of a factory owner of the industrial age, the microchip reversed the centralizing thrust of the previous era. All nations and businesses had to adapt to the centrifugal law of the microcosm, flattening hierarchies, outsourcing services, liberating engineers, shedding middle management. If you did not adapt your business systems to the new regime, you would no longer be a factor in the world balance of economic and military power.

   During the next decade or so, industry will go through a new
technology wringer and submit to a new law: the law of the telecosm. The new wringer, the new integrated circuit, is called the all optical network. It is a communications system that runs entirely in glass. Unlike existing fiber optic networks, which convert light signals to electronic form in order to amplify or switch them, the all optical network is entirely photonic. From the first conversion of the signal from your phone or computer to the final conversion to voice or data at the destination, your message flies through glass on wings of light.

   Just as the old integrated circuit put entire electronic systems on
single slivers of silicon, the new IC will put entire communications systems on seamless webs of silica. Wrought in threads as thin as a human hair, this silica is so pure that you could see through a window of it seventy miles thick. But what makes the new wringer roll with all the force of the microchip revolution before it is not the purity but the price. Just as the old IC made transistor power virtually free, the new IC, the all optical network, will make communications power virtually free.

   Another word for communications power is bandwidth.  Just as the
entire world had to learn to waste transistors, the entire world will now have to learn how to waste bandwidth. In the 1990s and beyond, every industry and economy will go through the wringer again.

   The impact on the organization of companies and economies, however,
has yet to become clear. What is the law of the telecosm? Will the new technology reverse the centrifugal force of the microchip revolution…or consummate it? To understand the message of the new regime, we must follow the rule of microcosmic prophet Carver Mead of Caltech: "Listen to the technology…and find out what it is telling us."

                      THE SHANNON-SHOCKLEY REGIME
   The father of the all optical network, the man who coined the phrase,
built the first fully functional system, and wrote the definitive book on the subject, is Paul E. Green, Jr. of Watson Laboratory at IBM. Now standing directly in the path of Green's wringer is Robert Lucky, who some seven years ago at a conference at Cornell first gave Green the idea that an all optical network might be possible.

   The leading intellectual in telephony, Lucky recently shocked the
industry by shifting from AT&T's Bell Labs, where he was executive director of research, to Bellcore, the laboratory of the Regional Bell Operating Companies (RBOCs). There he will soon have to confront the implications of Green's innovation.

   Contemplating the new technology, Lucky recalls a course on data networks that he used to teach many years ago with Green. As a computer man, Green relished the contrast between the onrushing efficiencies in his technology and the relative dormancy in communications. Indeed, for some twenty-five years, while computer powers rose a millionfold, network capacities increased about a thousandfold. It was not until the late 1980s that most long distance data networks much surpassed the Pentagon's "ARPANET," which had run at 50 kilobits (thousands of bits) per second since the late 1960s.

   This was the era dominated by the powerful mathematical visions and theories of Claude Shannon of MIT and Bell Labs. Shannon was the reclusive genius who invented Information Theory to ascertain the absolute carrying capacity of any communications channel.

   Whether wire or air, channels were assumed to be narrow and noisy, the
way God made them (sometimes with help from AT&T). Typical were the copper phone lines that still link every household to the telephone network and the air waves that still bear radio and television signals and static.

   The all-purpose remedy for these narrow, noisy channels was powerful
electronics. Invented at Bell Laboratories by a team headed by William Shockley and then developed by Robert Noyce and other Shockley proteges in Silicon Valley, silicon transistors and integrated circuits engendered a constant exponential upsurge of computing power.

   Throwing ever more millions of ever faster and cheaper transistors at every problem, engineers created fast computers, multiplexors, and switches that seemed to surmount and outsmart every limit of bandwidth or restriction of wire. This process continues today with heroic new compression tools that allow the creation of full video conferences over 64 kilobit telephone connections. Scientists at Bellcore are now even proposing new ways of using the Moving Picture Experts Group (MPEG) compression standard to send full motion movies at 1.5 megabits a second over the 4 kilohertz twisted pair copper wires to the home. Using ever faster computers, the telephone company is saying it can give you pay-per-view movies without installing fiber, or even coaxial cable, to the home.
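
   A sense of how much squeezing that 1.5 megabit figure implies can be had from a simple comparison with an uncompressed picture. In the sketch below, the source resolution, color depth, and frame rate are assumptions chosen for illustration; only the 1.5 megabit per second rate comes from the text.

    # Illustrative compression ratio for movies sent at 1.5 Mbit/s over
    # a copper pair. Source-format figures are assumptions, not data
    # from the article.
    width, height  = 640, 480      # assumed broadcast-style resolution
    bits_per_pixel = 24
    frames_per_sec = 30
    raw_rate  = width * height * bits_per_pixel * frames_per_sec  # bits/second
    mpeg_rate = 1.5e6                                             # bits/second (from the text)

    print(f"Uncompressed source: {raw_rate / 1e6:.0f} Mbit/s")
    print(f"Compression needed : roughly {raw_rate / mpeg_rate:.0f} to 1")
    # -> about 221 Mbit/s raw, i.e. a compression ratio on the order of 150:1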

   In the Shannon-Shockley era, the communications might be noisy and
error prone, but smart electronics could encode and decode messages in complex ways that allowed efficient identification and correction of all errors. The Shannon channel might be narrow, but fast multiplexors allowed it to be divided into time slots accommodating a large number of simultaneous users in a system called time division multiplexing. The channel might clog up when large numbers of users attempted to communicate with each other at once, but collision detectors or token passers could sort it all out in nanoseconds. Graphics and video might impose immense floods of bits on the system, but compression technology could reduce the floods to a manageable trickle with little or no loss of picture quality.
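
   Time division multiplexing is, at bottom, a very simple trick: interleave many slow streams into fixed, repeating time slots on one fast channel, then pull them apart again by slot position. The sketch below is a minimal illustration of that idea, with invented byte streams; it is not any carrier's actual framing format.

    # Minimal illustration of time division multiplexing.
    def tdm_multiplex(streams):
        """Round-robin one byte per stream per frame; pad short streams with 0."""
        frame_len = max(len(s) for s in streams)
        channel = []
        for i in range(frame_len):
            for s in streams:
                channel.append(s[i] if i < len(s) else 0)   # 0 = idle slot
        return channel

    def tdm_demultiplex(channel, n_streams):
        """Recover stream k by taking every n-th slot, offset by k."""
        return [channel[k::n_streams] for k in range(n_streams)]

    users = [list(b"alpha"), list(b"beta"), list(b"gamma")]
    line = tdm_multiplex(users)
    recovered = tdm_demultiplex(line, len(users))
    assert [bytes(r).rstrip(b"\x00") for r in recovered] == [bytes(u) for u in users]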

   If all else failed, powerful electronic switches could compensate for
almost any bandwidth limitations. Switching could make up for the inadequate bandwidth at the terminals by relieving the network of the need to broadcast all signals to every destination. Instead, the central switch could receive all signals and then route them to their appropriate addresses.

   To this day, this is the essential strategy of the telephone
companies: compensate for narrow noisy bandwidth with ever more powerful and intelligent digital electronics. Their "core competence," the Bells hasten to tell you, is switching. They make up for the shortcomings of copper wires by providing smart, powerful digital switches.

   Their vision for the future is to join the computer business all the
way, making these switches the entering wedge for ever more elaborate information services. Switches will grow smarter and more sophisticated until they provide an ever growing cornucopia of intelligent voice and fax features, from caller ID and voice mail to personal communications systems that follow you and your number around the world from your car commute to your vacation beach hideaway. In the end, these intelligent networks could supply virtually all the world's information needs, from movies, games and traffic updates to data libraries, financial services, news programs, and weather reports, all climaxing with yellow pages that exfoliate into a gigantic global mall of full motion video where your fingers walk (or your voice commands echo) from Harrods, to Jardines, to Akihabara, to Century 21 without you leaving the couch.

   At the time when Green and Lucky taught their course, this strategy
for the future was only a glimmer in the minds of telephone visionaries.
But the essence of it was already in place. As Green pointed out, telephone companies' response to sluggishness in communications was to enter the computer industry, where progress was faster. The creativity of digital electronics would save the telephone industry from technical stagnation.

   Lucky, however, protested to Green that it was unjust to compare the
two fields. Computers and telecom, as Lucky explained it, operate on entirely different scales. Computers work in the microscale world of the IC, putting ever more thousands of wires and switches on single slivers of silicon.

   By contrast, telecommunications functions in the macroworld, laying
out wires and switches across mostly silicon landscapes and seabeds. It necessarily entails a continental, or even intercontinental stretch of cables, microwave towers, switches, and poles. "How was it possible," Lucky asked, "to make such a large scale system inexpensive?" Inherent in the structure and even the physics of computers and telecommunications, so it seemed to Lucky two decades ago, was a communications bottleneck.

   As Lucky remembers it, Green was never satisfied with Lucky's point.
Green believed that someday communications could achieve miracles comparable to the integrated circuit in computing….

                         THE BANDWIDTH SCANDAL
   Today, as Lucky was the first to announce, fiber optics has utterly
overthrown the previous relationship between fast computers and slow wires. Now it is computer technology that imposes the bottleneck on the vast vistas of dark fiber.

   A silicon transistor can change its state some 2.5 billion times a second in response to light pulses (bundles of photons) hitting a photodetector. Since it would take a human being a thousand years or so of 10 hour workdays even to count to two billion, two billion cycles in a single second (two gigahertz) might seem a sprightly pace. But in the world of fiber optics running at the speed and frequencies of light, even a rate of two billion cycles a second is a humbling bow to the slothful pace of electronics. Since optical signals still have to be routed to their destinations through computer switches, communications now suffers from what is known as the "electronic bottleneck."

   It is this electronic bottleneck, the entire Bell edifice of Shannon
and Shockley, that Paul Green plans to blow away with his all optical networks. Green is targeting what is a secret scandal of modern telecommunications: the huge gap between the real capacity of fiber optics and the actual speed of telephone communications.

   In communications systems, the number of waves per second (or hertz) represents a rough measure of the system's potential bandwidth or ultimate carrying capacity. The bandwidth of a radio system, for example, is determined by the frequency of each station or channel and by the number of stations that can fit within the band. Your AM dial, for example, runs from around 535 thousand hertz (kilohertz) to 1705 kilohertz, and each station uses some 10 kilohertz. With an ideal receiver, the AM passband might carry 117 stations.

   By contrast, the intrinsic bandwidth of one strand of dark fiber is
some 25 thousand gigahertz in each of three groups of frequencies (three passbands) through which fiber can transmit light over long distances. At a gigahertz per terminal, this bandwidth might accommodate some 25,000 supercomputer "stations" (or 2.5 billion AM stations). Using what is called dispersion shifted fiber, it may be possible to use two of these passbands at once: a total of some 40 or 50 thousand gigahertz. For comparison, consider all the radio frequencies currently used in the air for radio, television, microwave, and satellite communications and multiply by two thousand. The bandwidth of one fiber thread could carry more than two thousand times as much information as all these radio and microwave frequencies that currently comprise the "air." One fiber thread could bear twice the traffic on the phone network during the peak hour of Mothers' Day in the U.S. (the heaviest load currently managed by the phone system).
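
   The station counts in these comparisons follow directly from division; the sketch below reproduces them using only the figures quoted above (the AM dial limits, a 10 kilohertz AM station, a 25,000 gigahertz passband, and a gigahertz per terminal).

    # Rough capacity comparisons, using only the figures quoted above.
    am_band_khz    = 1705 - 535           # width of the AM dial, kilohertz
    am_station_khz = 10                   # bandwidth per AM station
    print("AM stations on the dial:", am_band_khz // am_station_khz)   # -> 117

    fiber_passband_ghz = 25_000           # one low-loss passband of a fiber strand
    per_terminal_ghz   = 1                # a gigahertz per supercomputer "station"
    print("Supercomputer stations per passband:",
          fiber_passband_ghz // per_terminal_ghz)                      # -> 25,000

    am_station_ghz = am_station_khz / 1e6 # 10 kHz expressed in gigahertz
    print("AM-sized stations per passband: about %.1e" %
          (fiber_passband_ghz / am_station_ghz))                       # -> 2.5e+09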

   Yet even for point-to-point long distance links, let alone connections to homes, telephone and computer network engineers now turn their backs on this immense capacity and use perhaps one or two fifty-thousandths of it. Deferring to the electronic bottleneck, the telephone industry uses fiber merely as a superior replacement for the copper wires, coaxial cables, satellite links, and microwave towers that connected the local central office switches to one another for long distance calls.

   Over the last 15 years, the Bell Laboratory record for fiber optics
communication has run from 10 megabits per second over a one kilometer span to some 10 gigabits per second over nearly one thousand kilometers. But all the heroic advances in point-to-point links between central offices continued to use essentially one frequency on a fiber thread, while ignoring its intrinsic power to accommodate thousands of useful frequencies.

   In a world of all optical networks, this strategy is bankrupt.  No
longer will it be possible to throw more transistors, however cheap and fast, at the switching problem. Electronic speeds have become an insuperable bottleneck obstructing the vast vistas of dark fiber beyond.

   So called gigabit networks planned by the telephone and computer
companies will not do. What is needed is not a gigabit spread among many terminals, but a large network functioning at a gigabit per second per terminal.

   The demands of EDS offer a hint of the most urgent business needs.
Added to them will be consumer demands. True high definition television, comparable to movies in resolution, requires close to gigabit-a-second bandwidth, particularly if the program is dispatched to the viewer in burst mode all at once in a few seconds down the fiber, or if the user is given a chance to shape the picture, choose a vantage point, window several images at once, or experience three dimensions. When true broadband channels become available, there will be a flood of new applications comparable to the thousands of new uses of the IC.
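
   The "close to gigabit-a-second" claim for true high definition pictures can be sanity-checked with the same pixel arithmetic used earlier. The resolution and frame rate below are illustrative assumptions, not a broadcast standard cited by the article.

    # Uncompressed bandwidth for a hypothetical movie-resolution picture.
    # Format figures are assumptions chosen for illustration.
    width, height  = 1920, 1080
    bits_per_pixel = 24
    frames_per_sec = 24                    # film-style frame rate

    hdtv_rate = width * height * bits_per_pixel * frames_per_sec
    print(f"Uncompressed high definition stream: {hdtv_rate / 1e9:.2f} Gbit/s")
    # -> roughly 1.2 Gbit/s before any compression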

   No foreseeable progress in electronics can overcome the electronic
bottleneck. To do that, we need an entirely new communications regime. In the form of the all optical network, this regime is now at hand.

            LAW OF THE TELECOSM:  NETWORKS DUMB AS A STONE

The new regime will use fiber not as a replacement for copper wires but as a new form of far more capacious and error-free air. Through a system called wavelength division multiplexing and access, computers and telephones will tune into desired messages in the fibersphere the same way radios now tune into desired signals in the atmosphere. The fibersphere will be intrinsically as dumb and dark as the atmosphere.

   The new regime overcomes the electronic bottleneck by altogether banishing electronics from the network. But, ask the telcos in unison, what about the switches? As long as the network is switched, it must be partly electronic. Unless the network is switched, it is not a true any-to-any network. It is a broadcast system. It may offer a cornucopia of services. But it cannot serve as a common carrier like the phone network allowing any party to reach any other. Without intelligent switching it cannot provide personal communications nets that can follow you wherever you go. Without intelligent switching, the all optical network, so they say, is just a glorified cable system.

   These critics fail to grasp a central rule of the telecosm:  bandwidth
is a nearly perfect substitute for switching. With sufficient physical bandwidth, it is possible to simulate any kind of logical switch whatsoever. Bandwidth allows creation of virtual switches that to the user seem to function exactly the way physical switches do. You can send all messages everywhere in the network, include all needed codes and instructions for correcting, decrypting, and reading them, and allow each terminal to tune into its own messages on its own wavelength, just like a two-way radio. When the terminals are smart enough and the bandwidth great enough, your all optical network can be as dumb as a stone.
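
   The broadcast-and-select idea behind the "dumb as a stone" network can be captured in a few lines of Python. In the toy model below, the shared fiber simply carries every message on every wavelength to every terminal, and a terminal "switches" by tuning its receiver; the wavelengths, station names, and messages are invented for illustration, and real systems involve optics rather than lists.

    # Toy model of a broadcast-and-select all optical network.
    fibersphere = []    # the shared medium: everything is broadcast here

    def transmit(wavelength_nm, payload):
        """A fixed-wavelength transmitter drops its signal onto the fiber."""
        fibersphere.append((wavelength_nm, payload))

    def tune_and_receive(wavelength_nm):
        """A tunable receiver selects only the signals on its wavelength."""
        return [p for (w, p) in fibersphere if w == wavelength_nm]

    transmit(1546.0, "CAD model, workstation A -> supercomputer")
    transmit(1548.0, "X-ray image, hospital -> radiology archive")
    transmit(1546.0, "render job, workstation B -> supercomputer")

    # The supercomputer tunes to 1546.0 nm and sees only its own traffic;
    # no switch in the network ever examined the bits.
    print(tune_and_receive(1546.0))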

   Over the last several years, all optical network experiments have been conducted around the world, from Bellcore in New Jersey to NTT at Yokosuka, Japan. British Telecom has used wavelength division multiplexing to link four telephone central offices in London. Columbia's Telecom Center has launched a "Teranet" that lacks tunable lasers or receivers but can logically simulate them. Bell Laboratories has generated most of the technology but, as a long distance specialist, has focused on the project of sending gigabits of information thousands of miles without amplifiers. But the only fully functional system is the Rainbow created by Paul Green at IBM.

   As happens so often in this world of technical disciplines sliced into arbitrary fortes and fields, the large advances come from the integrators. Paul Green is neither a laser physicist, nor an optical engineer, nor a telecommunications theorist. At IBM, his work has ranged from overseeing speech recognition projects at Watson Labs to shaping company strategy at corporate headquarters in Armonk. His most recent success was supervising development of the new APPN (Advanced Peer-to-Peer Networking) protocol. According to an IBM announcement in March, APPN will replace the venerable SNA (systems network architecture) that has been synonymous with IBM networking for more than a decade.

   Green took some pride in this announcement, but by that time, the
project was long in his past. He was finishing the copy editing on his magisterial tome on Fiber Optic Networks (published this summer by Prentice Hall). And he was moving on to more advanced versions of the Rainbow which he and his team had introduced in December 1991 at the Telecom 91 Conference in Geneva and which has been installed between the various branches of Watson Laboratories in Westchester County, N.Y.

   As Peter Drucker points out, a new technology cannot displace an old
one unless it is proven at least 10 times better. Otherwise the billions of dollars worth of installed base and thousands of engineers committed to improving the old technology will suffice to block the new one. The job of Paul Green's 15 man team at IBM is to meet that tenfold test.

   Green's all optical network creates a fibersphere as neutral and passive as the atmosphere. It can be addressed by computers the same way radios and television sets connect to the air. Consisting entirely of unpowered glass and passive splitters and couplers, the fibersphere is dark and dumb. Any variety of terminals can interconnect across it at the same time using any protocols they choose.

   Just as radios do in the atmosphere, computer receivers connected to the fibersphere do not find a series of bits in a message; they tune into a wavelength or frequency. Because available Fabry-Perot tunable filters today have larger bandwidth than tunable lasers, Green chose to locate Rainbow's tuning at the receiver and have each transmitter operate at a fixed wavelength. But future networks can use any combination of tunable equipment at either end.

   When Green began the project in 1987, the industry stood in the same general position as the pioneers of radio in the early years of that industry. They had seemingly unlimited bandwidth before them, but lacked transmitters and receivers powerful enough to use it effectively. Radio transmitters suffered "splitting losses" as they broadcast their signals across the countryside. Green's optical messages lose power every time they are split off to be sent to another terminal or are tapped by a receiver.

   The radio industry solved this problem by the development of the
audion triode amplifier. Green needed an all optical amplifier to replace the optoelectronic repeaters that now constitute the most widespread electronic bottleneck in fiber. Amplifiers in current fiber networks first convert the optical signal to an electronic signal, enhance it, and then convert it back to photons.

   Like the pioneers of radio, Green soon had his amplifier in hand. Following concepts pioneered by David Payne at the University of Southampton in England, a Bell Laboratories group led by Emmanuel Desurvire and Randy Giles developed a workable all optical device. They showed that a short stretch of fiber doped with erbium, a rare earth element, and excited by a cheap laser diode, can function as a powerful amplifier over the entire wavelength range of a 25,000 gigahertz system. Today such photonic amplifiers enhance signals in a working system of links between Naples and Pomezia on the west coast of Italy. Manufactured in packages between two and three cubic inches in size, these amplifiers fit anywhere in an optical network to enhance signals without electronics.

   This invention overcame the most fundamental disadvantage of optical
networks compared to electronic networks. You can tap into an electronic network as often as desired without weakening the voltage signal. Although resistance and capacitance will weaken the current, there are no splitting losses in a voltage divider. Photonic signals, by contrast, suffer splitting losses every time they are tapped; they lose photons until eventually there are none left. The cheap and compact all optical amplifier solves this problem.
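
   The splitting-loss problem, and the way an in-line optical amplifier solves it, can be put in rough numbers: each 50/50 passive split costs about 3 dB, so taps multiply losses until the photons run out. The launch power, receiver sensitivity, and amplifier gain in the sketch below are illustrative assumptions, not figures from the Rainbow system.

    # Illustrative power budget for a passively split optical signal.
    # Each 1x2 splitter halves the power (about 3 dB); an erbium-doped
    # fiber amplifier (EDFA) adds optical gain with no electronics.
    # All dBm/dB figures are assumptions for illustration.
    launch_power_dbm   = 0.0     # assumed transmitter launch power
    receiver_floor_dbm = -30.0   # assumed receiver sensitivity
    split_loss_db      = 3.0     # per 1x2 passive split
    edfa_gain_db       = 20.0    # assumed amplifier gain

    def power_after(n_splits, amplifiers=0):
        return launch_power_dbm - n_splits * split_loss_db + amplifiers * edfa_gain_db

    print(power_after(10))                 # -> -30.0 dBm, right at the floor
    print(power_after(12))                 # -> -36.0 dBm, below the floor: no signal
    print(power_after(12, amplifiers=1))   # -> -16.0 dBm, one EDFA restores the margin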

   Not only did Green and his IBM colleagues create working all optical
networks, they also reduced the interface optoelectronics to a single microchannel plug-in card that can fit in any IBM PS/2 level personal computer or R6000 workstation. Using off-the-shelf components costing a total of $16,000 per station, Rainbow achieved a capacity more than 90 times greater than FDDI at an initial cost merely four times as much.
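
   Taken at face value, those two ratios imply a bandwidth-per-dollar advantage of better than twenty to one over FDDI; the sketch below simply divides the figures quoted above.

    # Bandwidth per dollar, taken directly from the ratios quoted above:
    # roughly 90 times FDDI's capacity at about four times the per-station cost.
    capacity_ratio = 90
    cost_ratio = 4
    print("Advantage over FDDI: about %.0fx more bandwidth per dollar"
          % (capacity_ratio / cost_ratio))      # -> about 22x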

   Just as Jack Kilby's first ICs were not better than previous adders and oscillators, the Rainbow I is in some respects no better than rival networks based on electronics. At present it connects only 32 computers at a speed of some 300 megabits per second, for a total bandwidth of 9.6 gigabits per second. This rate is huge compared to most other networks, but it is still well below the target of a system that provides gigabit rates for every terminal.

   A more serious limitation is the lack of packet switching.  Rather
than communicating down a dedicated connection between two parties, like phones do, computer networks send data in small batches, called packets, each bearing its own address. This requires switching back and forth between packets millions of times a second. Neither the current Rainbow's lasers nor its filters can tune from one message to another more than thousands of times a second. This limitation is a serious problem for links to mainframes and supercomputers that may do many tasks at once in different windows on the screen and with connections to several other machines.
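
   The mismatch is easy to quantify roughly: at the 300 megabit per second station speed quoted above, even fairly large packets arrive tens of thousands of times a second, while the lasers and filters retune only thousands of times a second. The packet size and retune rate below are assumptions chosen for illustration.

    # Why slow tuning is a problem for packet traffic.
    line_rate_bps     = 300e6           # Rainbow station speed, from the text
    packet_size_bits  = 8 * 1500        # assumed ~1500-byte packets
    max_retunes_per_s = 5_000           # "thousands of times a second"

    packets_per_second = line_rate_bps / packet_size_bits
    print(f"Packets per second at full rate: {packets_per_second:,.0f}")    # -> 25,000
    print("The receiver can follow only about "
          f"{max_retunes_per_s / packets_per_second:.0%} of them")          # -> ~20%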

   As Green shows, however, all these problems are well on the way to solution. A tide of new interest in all optical systems is sweeping through the world's optical laboratories. The Pentagon's Defense Advanced Research Projects Agency (DARPA) has launched a program for all optical networking. With Green installed as the new President of the IEEE Communications Society, the technical journals are full of articles on new wavelength division technology. Every few months brings new reports of a faster laser with a broader bandwidth, or a filter with faster tuning, or an ingenious new way to use bandwidth to simulate packet switching. Today lasers and receivers can switch fast enough, but they still lack the ability to cover the entire bandwidth needed.

   The key point, however, is that as demonstrated both in Geneva and
Armonk, the Green system showed the potential efficiency of all optical systems. Even in their initial forms they are more cost effective in bandwidth per dollar than any other network technology. Scheduled for introduction within the next two years, Rainbow III will comprise a thousand stations operating at a gigabit a second, with the increasingly likely hope of fast packet switching capability. At that point, the system will be a compelling commercial product at least hundreds of times more cost effective than the competition.

   Without access to dark fiber, however, these networks will be
worthless. If the telephone companies fail to supply it, they risk losing most of the fastest growing parts of their business: the data traffic which already contributes some 50 percent of their profits. But there is also a possibility that they will lose much of their potential consumer business as well: the planned profits in pay-per-view films and electronic yellow pages. This is the message of a second great prophet of dark fiber, Will Hicks of Southbridge, Massachusetts.

   A venerable inventor of scores of optical products, Hicks believes
that Green's view of the future of fiber is too limited. Using wavelength division, Hicks can see the way to deliver some 500 megahertz two-way connections to all the homes in America for some $400 per home. That is fifty times the 10 megahertz total capacity of an Ethernet (with no one else using it) for some 20 percent of the cost. That is capacity in each home for twenty digital two-way HDTV channels at once at perhaps half the cost of new telephone connections. Then, after a large consumer market emerges for fiber optics, Hicks believes, Green's sophisticated computer services will follow as a matter of course.
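
   Hicks's comparison can be multiplied out directly. The sketch below uses only the figures quoted above; the roughly $2,000 Ethernet-class cost per connection appears as an implication of the "20 percent" claim rather than as a number given in the article.

    # Hicks's home-fiber comparison, using only the figures quoted above.
    home_fiber_mhz  = 500          # proposed two-way capacity per home
    ethernet_mhz    = 10           # total capacity of an Ethernet
    home_fiber_cost = 400          # dollars per home
    cost_fraction   = 0.20         # "some 20 percent of the cost" of Ethernet

    implied_ethernet_cost = home_fiber_cost / cost_fraction
    print("Capacity ratio:", home_fiber_mhz // ethernet_mhz, "times Ethernet")       # -> 50
    print("Implied Ethernet-class cost per connection: $%.0f" % implied_ethernet_cost)  # -> $2000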

   The consumer market, Hicks maintains, is the key to lowering the cost
of the components to a level where they can be widely used in office networks as well. He cites the example of the compact disk laser diode. Once lasers were large and complex devices, chilled with liquid nitrogen, and costing thousands of dollars; now they are as small as a grain of salt, cheap as a box of cereal, and more numerous than phonograph needles. An executive at Hitachi told Hicks that Hitachi could work a similar transformation on laser diodes and amplifiers for all optical networks. "Just tell me what price you want to pay and I'll tell you how many you have to buy."

   The divergence of views between the IBM executive and the wildcat
inventor, however, is far less significant than their common vision of dark fiber as the future of communications. By the power of ever cheaper bandwidth, it will transform all industries of the coming information age just as radically as the power of cheaper transistors transformed the industries of the computer age.

   For the telephone companies, the age of ever smarter terminals
mandates the emergence of ever dumber networks. This is a major strategic challenge; it takes a smart man to build a dumb network. But the telcos have the best laboratories and have already developed nearly all the components of the fibersphere.

   Telephone companies may complain of the large costs of the transformation of their system, but they command capital budgets as large as the total revenues of the cable industry. Telcos may recoil in horror at the idea of dark fiber, but they command webs of the stuff ten times larger than any other industry. Dumb and dark networks may not fit the phone company self-image or advertising posture. But they promise far larger markets than the current phone company plan, which would choke off their future in the labyrinthine nets of an "intelligent switching fabric" always behind schedule and full of software bugs.

   The telephone companies cannot expect to impose a uniform network
governed by universal protocols. The proliferation of digital protocols and interfaces is an inevitable effect of the promethean creativity of the computer industry. Green explains, "You cannot fix the protocol zoo. You must use bandwidth to accommodate the zoo."

   As Robert Pokress, a former switch designer at Bell Labs and now head of Unifi Corporation, points out, telephone switches (now 80 percent software) are already too complex to keep pace with the efflorescence of relatively simple computer technology on their periphery. While computers become ever more lean and mean, turning to reduced instruction set processors, networks need to adopt reduced instruction set architectures. The ultimate in dumb and dark is the fibersphere now incubating in the telcos' own magnificent laboratories.

   The entrepreneurial folk in the computer industry may view this
wrenching phone company adjustment with some satisfaction. But the fact is that computer companies face a strategic reorientation as radical as the telcos do. In a world where ever smarter terminals require ever dumber communications, computer networks are as gorged and glutted with smarts as phone company networks and even less capacious. The nation's most brilliant nerds, commanding the 200 MIPS Silicon Graphics superstations or Mac Quadra multimedia power plants, humbly kneel before the 50 kilobit lines of the Internet and beseech the telcos to upgrade to 64 kilobit basic ISDN.

   Now addicted to the use of transistors to solve the problems of
limited bandwidth, the computer industry must use transistors to exploit the opportunities of nearly unlimited bandwidth. When home-based machines are optimized for manipulating high resolution digital video at high speeds, they will necessarily command what are now called supercomputer powers. This will mean that the dominant computer technology will emerge first not in the office market but in the consumer market. The major challenge for the computer industry is to change its focus from a few hundred million offices already full of computer technology to a billion living rooms now nearly devoid of it.

   Cable companies possess the advantage of already owning dumb networks based on the essentials of the all optical model of broadcast and select: customers seeking wavelengths or frequencies rather than switching circuits. Cable companies already provide all the programs to all the terminals and allow them to tune in to the desired messages. Uniquely in the world, U.S. cable firms already offer a broadband pipe to ninety percent of American homes. These coaxial cables, operating at one gigahertz for several hundred feet, provide the basis for two way broadband services today. But the cable industry cannot become a full service supplier of telecommunications until it changes its self-image from a cheap provider of one way entertainment services into a common carrier of two way information. Above all, the cable industry cannot succeed in the digital age if it continues to regard the personal computer as an alien and irrelevant machine.

   Analogous to the integrated circuit in its economic power, the all optical network is analogous to the massively parallel computer in its technical paradigm. In the late 1980s in computers, the effort to make one processor function ever faster on a serial stream of data reached a point of diminishing returns. Superpipelining and superscalar gains hit their limits. Despite experiments with Josephson junctions, high electron mobility, and cryogenics, usable transistors simply could not be made to switch much faster than a few gigahertz.

   Computer architects responded by creating machines with multiple
processors operating in parallel on multiple streams of data. While each processor worked more slowly than the fastest serial processors, thousands of slow processors in parallel could far outperform the fastest serial machines. Measured by cost effectiveness, the massively parallel machines dwarfed the performance of conventional supercomputers.

   The same pattern arose in communications and for many of the same
reasons. In the early 1990s the effort to increase the number of bits that could be time division multiplexed down a fiber on a single frequency band had reached a point of diminishing returns. Again the switching speed of transistors was the show stopper. The architects of all optical networks responded by creating systems which can use not one wavelength or frequency but potentially thousands in parallel.

   Again, on any single frequency the new systems could not outperform time division multiplexing. But all optical networks opened up a vast vista of some 75 thousand gigahertz of frequencies potentially usable for communications. That immense potential of massively parallel frequencies made all methods of putting more bits on a single set of frequencies look about as promising as launching computers into the chill of outer space in order to accelerate their switching speeds.

   Just as the law of the microcosm made all terminals smart,
distributing intelligence from the center to the edges of the network, so the law of the telecosm creates a network dumb enough to accommodate the incredible onrush of intelligence on its periphery. Indeed, with the one chip supercomputer on the way, manufacturable for under a hundred dollars toward the end of the decade, the law of the microcosm is still gaining momentum. The fibersphere complements the promise of ubiquitous computer power with equally ubiquitous communications.

   What happens, however, when not only transistors but also wires are nearly free? As Robert Lucky observes in his foreword to Paul Green's book, "Many of us have been conditioned to think that transmission is inherently expensive; that we should use switching and processing wherever possible to minimize transmission." This is the law of the microcosm. But as Lucky speculates, "The limitless bandwidth of fiber optics changes these assumptions. Perhaps we should transmit signals thousands of miles to avoid even the simplest processing function." This is the law of the telecosm: use bandwidth to simplify everything else.

   Daniel Hillis of Thinking Machines Corporation offers a similar
vision, adding to Lucky's insight the further assertion that massively parallel computer architectures are so efficient that they can overthrow the personal computer revolution. Hillis envisages a powerplant computer model, with huge Thinking Machines at the center tapped by millions of relatively dumb terminals.

   All these speculations assume that the Law of the Telecosm usurps the
Law of the Microcosm. But in fact the two concepts function in different ways in different domains.

   Electronic transistors use electrons to control, amplify, or switch
electrons. But photonics differ radically from electronics. Because moving photons do not affect one another on contact, they cannot readily be used to control, amplify, or switch each other. Compared to electrons, moreover, photons are huge: infrared photons at 1550 or 1300 nanometers are larger than a micron across. They resist the miniaturization of the microcosm. For computing, photons are far inferior to electrons. With single electron electronics now in view, electrons will keep their advantage. For the foreseeable future, computers will be made with electrons.

   What are crippling flaws for photonic computing, however, are huge assets for communicating. Because moving photons do not collide with each other or respond to electric charges, they are inherently a two way medium. They are immune to lightning strikes, electromagnetic pulses, or electrical power surges that destroy electronic equipment. Virtually noiseless and massless pulses of radiation, they move as fast and silently as light.

   Listening to the technology, as Caltech prophet Carver Mead
recommends, one sees a natural division of labor between photonics and electronics. Photonics will dominate communications and electronics will dominate computing. The two technologies do not compete; they are beautiful complements of each other.

   The law of the microcosm makes distributed computers (smart terminals)
more efficient regardless of the cost of linking them together. The law of the telecosm makes dumb and dark networks more efficient regardless of how numerous and smart are the terminals. Working together, however, these two laws of wires and switches impel ever more widely distributed information systems.

   It is the narrow bandwidth of current phone company connections that
explains the persistence of centralized computing in a world of distributed machines. Narrowband connections require smart interfaces and complex protocols and expensive data. Thus you get your online information from only a few databases set up to accommodate queries over the phone lines. You limit television broadcasting to a few local stations. Using the relatively narrowband phone network or television system, it pays to concentrate memory and processing at one point and tap into the hub from thousands of remote locations.

   Using a broadband fiber system, by contrast, it will pay to distribute
memory and services to all points on the network. Broadband links will foster specialization. If the costs of communications are low, databases, libraries, and information services can specialize and be readily reached by customers from anywhere. On line services lose the economies of scale that lead a firm such as Dialog to attempt to concentrate most of the world's information in one set of giant archives.

   By making bandwidth nearly free, the new integrated circuit of the
fibersphere will radically change the environment of all information industries and technologies. In all eras, companies tend to prevail by maximizing the use of the cheapest resources. In the age of the fibersphere, they will use the huge intrinsic bandwidth of fiber, all 25 thousand gigahertz or more, to replace nearly all the hundreds of billions of dollars worth of switches, bridges, routers, converters, codecs, compressors, error correctors, and other devices, together with the trillions of lines of software code, that pervade the intelligent switching fabric of both telephone and computer networks.

   The makers of all this equipment will resist mightily.  But there is
no chance that the old regime can prevail by fighting cheap and simple optics with costly and complex electronics and software.

   The all optical network will triumph for the same reason that the
integrated circuit triumphed: it is incomparably cheaper than the competition. Today, measured by the admittedly rough metric of MIPS per dollar, a personal computer is more than one thousand times more cost effective than a mainframe. Within 10 years, the all optical network will be millions of times more cost effective than electronic networks. Just as the electron rules in computers, the photon will rule the waves of communication.

   The all optical ideal will not immediately usurp other technologies.
Vacuum tubes reached their highest sales in the late 1970s. But just as the IC inexorably exerted its influence on all industries, the all optical technology will impart constant pressure on all other communications systems. Every competing system will have to adapt to its cost structure. In the end, almost all electronic communications will go through the wringer and emerge in glass.

   This is the real portent of the dark fiber case wending its way
through the courts. The future of the information age depends on the rise of dumb and dark networks to accommodate the onrush of ever smarter electronics. Ultimately at stake is nothing less than the future of the computer and communications infrastructure of the U.S. economy, its competitiveness in world markets, and the consummation of the age of information. Although the phone companies do not want to believe it, their future will be dark.

                                #####

— ~ 1st 1.10b #1477 ~ – Channel 1 (R) Cambridge, MA
