By Peterson, Ivars
Science News, Vol. 133, No. 25
Highways for Information
In 1900, driving across the United States from coast to coast was a great challenge. Automobiles and horses shared the same dirt roads. There were few road maps or signs -- and no guarantees that one road actually connected with another. There were no service stations. Drivers carried their own fuel and needed enough mechanical know-how to do their own repairs.
"That's very close to what computing is like today," says Robert E. Kahn, who developed ARPANET, a computer network linking researchers holding contracts with the Department of Defense. Kahn now heads the Corporation for National Research Initiatives in Washington, D.C., a group committed to the idea of building a nationwide, interstate highway system for information.
"Nowadays, you don't have to know very much to use a road," he says. In the same way, using a computer to communicate with colleagues, to share data, to send and receive pictures, and to draw upon library resources anywhere in the United States should be just as easy and convenient.
The present situation, however, is far from that ideal -- even for networks devoted strictly to research. For example, the best way for a cardiologist at the Boston University Medical Center to review cardiac images with a colleague at the Mayo Clinic in Minnesota is to send the material by express mail or to fly there personally. Because no direct link capable of handling the images exists, the researcher can't use his or her office computer to send the information electronically. In contrast, scientists at the Massachusetts Institute of Technology can communicate with many research organizations throughout the world, but they must use the right one of more than a dozen computer networks to do so.
"These scenarios point up just two absurdities of the present situation in U.S. computer networking," comments C. Gordon Bell of the Ardent Computer Corp. in Sunnyvale, Calif., in the February IEEE SPECTRUM. "Existing networks not only lag behind the growing needs of the research community -- they are too fragmented to develop unaided into a single, coherent system."
The troubled state of U.S. computer networks is one of the topics addressed in a report to Congress from the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), which oversees the scientific endeavors of several federal agencies (SN: 3/12/88, p.172). The issue was also the focus of a recent meeting in Washington, D.C., sponsored by EDUCOM, a consortium based in Princeton, N.J., of more than 500 colleges and universities interested in information technology.
Some pieces of the national-network puzzle are already in place. Communications lines now connect six national supercomputer centers funded by the National Science Foundation (NSF). This network forms the backbone of NSFNET, which also has links with ARPANET and several NASA laboratories. By 1989, more than 200 universities will have access to the network.
In July, the network managers expect to increase data transmission rates from a horse-and-buggy rate of 56 kilobits per second to a respectable but hardly supersonic 1.5 megabits per second. Even at the faster rate, sending a single picture would use up a large part of the line's capacity. That doesn't leave much room for other users interested in doing the same thing at the same time. Further increases in data transmission rates are planned.
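A rough back-of-the-envelope calculation suggests why even the faster line feels cramped. The image size below is a hypothetical figure (an uncompressed 1,000-by-1,000-pixel picture at 8 bits per pixel), not one given in the article:

```python
# Estimate how long one picture ties up the line at each rate,
# ignoring protocol overhead and competing traffic.

IMAGE_BITS = 1_000 * 1_000 * 8  # hypothetical ~8-megabit image

def transfer_seconds(bits: int, rate_bits_per_sec: float) -> float:
    """Seconds needed to push `bits` through a link of the given rate."""
    return bits / rate_bits_per_sec

old_rate = 56_000       # 56 kilobits per second
new_rate = 1_500_000    # 1.5 megabits per second

print(f"At 56 kbit/s:  {transfer_seconds(IMAGE_BITS, old_rate):.0f} seconds")
print(f"At 1.5 Mbit/s: {transfer_seconds(IMAGE_BITS, new_rate):.1f} seconds")
```

Under these assumptions, one image takes over two minutes on the old line and still monopolizes the faster line for several seconds -- long enough that a handful of simultaneous users would saturate it.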
"A lot has been accomplished," says Kenneth M. King, EDUCOM president. "But the task has just begun, and there are many problems."
The United States already has more than 100 computer networks, linking government laboratories, Defense Department operations, groups of universities, and researchers within specialized fields such as high-energy physics and computer science. Many large companies operate private networks carrying data to and from facilities all over the world. …