Space Shuttles Bound to Technologies of the Past

By Shankar Vedantam

Washington Post Staff Writer

Tuesday, February 25, 2003; Page A01

University of Maryland reliability expert Michael Pecht was recently approached by a company that wanted an evaluation of an older piece of machinery.

The company was a NASA subcontractor, and the device to be evaluated was the space shuttle's robotic arm, which astronauts use to work outside the craft. It was 20 years old, and NASA -- under pressure to extend the life of the spacecraft -- was anxious to find out how long it could last.

Pecht found that the arm was still in good working order. But while the engineer answered NASA's questions, the space agency never answered his. "Why are we using this old technology?" he asked repeatedly. "Why don't we change the ways we buy and design so we can always be updating, so we can always be putting in the latest technology? I could never get a clear answer on that."

America's space shuttles, once heralded as futuristic, today find themselves chained to the technologies of the past. The machines that fly astronauts into space represent design ideas from the 1970s. Many components that run vital parts of the shuttles are obsolete, and NASA has had to create a network of suppliers to provide it with vintage parts, occasionally even scouring Internet sites such as eBay.

Although older components are not being blamed for the loss of Columbia on Feb. 1 -- indeed, NASA hangs on to many older parts because they are reliable -- the shuttle fleet's intensifying battle against obsolescence will figure prominently in the coming debate about the space program's future.

Critics suggest there is a real safety issue: As obsolete parts become ever more difficult to find, the viability of systems may be threatened not by the challenges of the future, but by the requirements of the past. The paradox for NASA is that the longer it keeps its shuttle fleet running, the further it falls behind in terms of technology, especially in its computer systems.

The age gap is not only about machines. Engineers at NASA are graying, too, and critics say budget cuts and a lack of bold goals have eroded the agency's ability to attract young engineers focused on visionary change.

"It seemed futuristic, but 30 years later, the shuttle program is a shell of the future," said Rosalind Williams, head of the Massachusetts Institute of Technology's Program in Science, Technology and Society. "It's not only that the components are difficult to find but the expertise is difficult to find. What engineer wants to go into this business when it is basically about maintenance?"

Solutions will not be easy -- or cheap. Unlike unmanned missions, where a stream of new vehicles means technology can be regularly updated, putting astronauts in space requires lengthy design periods and extreme caution. It takes years to test components, and once a system is ready, engineers are wary of making changes. Yet without change, obsolescence is certain.

Cost, engineering and design barriers have forced the space program to run hard just to stay in place. But the biggest impediment to change goes back to the strategic decision to make the shuttle the centerpiece of the U.S. manned spaceflight program.

"The problem that NASA has faced is they put all their eggs in the shuttle basket," said Bruce Murray, a former director of NASA's Jet Propulsion Laboratory in Pasadena, Calif. "The fundamental problems are conceptual in design. It was promoted and sold as a very safe, cheap way to access space. It was neither safe nor cheap."

Early plans to fly 50 shuttle missions a year were quickly halved, and then halved again, Murray said. After the Challenger explosion, even that seemed too ambitious. Instead of quick turnarounds, NASA began to focus on extending shuttle longevity. Now the agency is considering flying the fleet until 2020.

Given that the first shuttle flew in 1981, that would mean a lifespan of nearly 40 years for the fleet. For a highly complex system that regularly faces the immense rigors of space travel, that is an extraordinary length of time.

In the case of the shuttle fleet's computer systems, such longevity runs smack into what is known as Moore's law. Named after Intel Corp. co-founder Gordon Moore, the law holds that the number of transistors that can be packed onto a microchip, and with it the chip's processing power, will double roughly every two years, meaning that a score of technological generations will be packed into 40 years.
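
A rough calculation makes the scale of that gap concrete. The short sketch below simply assumes the commonly cited two-year doubling period over a 40-year service life; the figures are illustrative, not drawn from NASA.

```python
# Back-of-the-envelope arithmetic for Moore's law over a 40-year fleet life.
# Assumes the commonly cited doubling period of roughly two years; the exact
# numbers are illustrative only.

YEARS_IN_SERVICE = 40        # first flight in 1981, retirement contemplated around 2020
DOUBLING_PERIOD_YEARS = 2    # years per technological generation under Moore's law

generations = YEARS_IN_SERVICE // DOUBLING_PERIOD_YEARS  # "a score" of generations
growth_factor = 2 ** generations                         # relative growth in transistor count

print(f"{generations} doublings over {YEARS_IN_SERVICE} years "
      f"-> roughly a {growth_factor:,}x gap")
# 20 doublings over 40 years -> roughly a 1,048,576x gap
```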

The shuttle fleet's IBM computers have been upgraded once -- in 1988-89.

"They have these ancient computers that are really pathetic," said Jonathan McDowell, an astronomer at the Smithsonian Astrophysical Observatory in Cambridge, Mass., and a space program analyst. "They are many years out of date."

Indeed, to run high-speed science experiments, McDowell said, astronauts have to carry and plug in laptops. "It's a strange mix of very robust but very old computers that will absolutely work, and a bunch of notebooks that are running the latest version of Windows," he said.

The five main computers that run each shuttle have a memory of about 1 megabyte apiece, McDowell said. Today's most basic home desktop computers come loaded with 20,000 times as much and have Pentium processors. Two years ago, Intel turned over its original Pentium processor to the government so that it could be tested and prepared for space travel, said Chuck Mulloy, a company spokesman. But that processor came out in 1994, meaning that even as it is being readied for space travel, it is already nearly a decade old.

"The computers haven't changed a lot since the advent of the vehicle," agreed Jeff Carr, a spokesman at United Space Alliance, a Houston company that runs the shuttle fleet. "It's one of those things that are very adequate for the job and have always been very adequate. They don't need to be faster. . . . There has never been any impetus or need to change them."

The testing of processors and computing equipment is extraordinarily rigorous, Carr and others said, and NASA has always placed reliability ahead of speed. A home desktop computer that crashes once a week is merely annoying, but a failed computer aboard a space shuttle could be catastrophic.

Computer chips and other components are subjected to intense bouts of radiation testing, and the software that runs the shuttles may be among the cleanest programs ever written.

Paradoxically, the very feature that makes newer computer chips superior -- they pack more components and circuits into smaller spaces -- can also make them more vulnerable in space. A single cosmic ray, a high-energy particle hurtling through space, might damage a large number of transistors in a densely packed chip, where in an older, sparser chip it would have damaged only a few, McDowell said.
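
McDowell's point can be illustrated with a deliberately simplified model: if a single particle strike disturbs a fixed patch of silicon, the number of transistors caught inside that patch grows with how densely they are packed. The sketch below is a hypothetical illustration only; the footprint size and densities are invented round numbers, not radiation-effects data.

```python
# Simplified illustration of why denser chips are more exposed to a single
# particle strike: a fixed damage footprint covers more transistors when
# they are packed more tightly. All numbers are hypothetical.

def transistors_hit(damage_area_um2: float, transistors_per_um2: float) -> float:
    """Transistors falling inside a strike footprint of the given area."""
    return damage_area_um2 * transistors_per_um2

FOOTPRINT_UM2 = 4.0  # square microns disturbed by one strike (assumed)

older_chip = transistors_hit(FOOTPRINT_UM2, transistors_per_um2=0.5)   # sparse, older process
newer_chip = transistors_hit(FOOTPRINT_UM2, transistors_per_um2=50.0)  # dense, newer process

print(f"older chip: ~{older_chip:.0f} transistors affected")   # ~2
print(f"newer chip: ~{newer_chip:.0f} transistors affected")   # ~200
```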

The space agency is taking into account the rapid pace of technological change in designing the next generation of space vehicles, said Gary Martin, NASA's space architect. One idea, he said, is to design systems around interchangeable modules. When better technology arrives, a module can simply be pulled out and replaced.

Standardizing different sections can also help, he said. Martin drew an analogy to gas tanks in cars. Although newer models come out frequently, the tank is standardized to fit nozzles in gas stations anywhere in the country. In the same way, he said, parts of launch vehicles can be standardized, making it easier to incorporate improvements.
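
In software terms, the module idea amounts to designing subsystems against a fixed, standardized interface so that any conforming replacement can be dropped in later. The sketch below is a hypothetical illustration of that pattern; the names, such as FlightModule and the navigation modules, are invented for the example and describe no actual NASA hardware or software.

```python
# Hypothetical sketch of the "standardized module" idea: subsystems are built
# against a fixed interface, so a newer implementation can be swapped in later
# without redesigning the vehicle-side code. All names are invented.
from typing import Protocol


class FlightModule(Protocol):
    name: str

    def self_test(self) -> bool:
        """Report whether the module is healthy after installation."""
        ...


class LegacyNavModule:
    name = "nav-1981"

    def self_test(self) -> bool:
        return True


class UpgradedNavModule:
    name = "nav-2003"

    def self_test(self) -> bool:
        return True


def install(module: FlightModule) -> None:
    # The vehicle-side code depends only on the shared interface, so swapping
    # LegacyNavModule for UpgradedNavModule requires no changes here.
    status = "ok" if module.self_test() else "FAILED"
    print(f"installed {module.name}: self-test {status}")


install(LegacyNavModule())
install(UpgradedNavModule())
```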

Ultimately, the scale of designing a new launch system and the complexity of making components work together mean the agency will always be behind the cutting edge, said John Rogacki, chief of the Space Transportation Technology Division in NASA's Office of Aerospace Technology.

"We don't build new space transportation systems very often," he said. "You may create an open architecture where you plug and play computers, but it's much more difficult if you are talking of a propulsion system. We come out with a new rocket engine every 20 years."

Once a decision has been made to start implementing a design, so much testing and work has been done that "you just have to make a decision and go with it and save the new technology for a new system," Rogacki said.

Obsolescence is a growing issue outside NASA, and new ideas for design may come from teams working on military, commercial aviation and other complex systems, said Pecht, the University of Maryland engineer, who directs the university's CALCE Electronic Products and Systems Center.

By their very nature, improvements in science are unpredictable. It is safe to predict there will be widespread technological improvements over the next 10 years, but it is very difficult to predict what exactly those breakthroughs will be.

"How do you make the systems flexible enough to take account of technological improvements that will surely come along in the life of the system?" asked Norine Noonan, a member of NASA's Advisory Council. "It's easier to ask the question than answer it."
