Supercomputing now indispensable
By Bruce Lieberman
STAFF WRITER, San Diego Union-Tribune
November 12, 2005
Jean-Bernard Minster wants to know how a magnitude-7.7 earthquake would affect Southern California. J. Andrew McCammon wants to find a cure for AIDS. Michael Norman wants to learn how the universe began.
All of them rely on supercomputers in their quest for answers.
CRISSY PASCUAL / Union-Tribune: UCSD astronomer Michael Norman uses the powerful resources of the San Diego Supercomputer Center to help figure out how stars were born.
Twenty years ago this Monday, the San Diego Supercomputer Center began using what was then the world's most powerful computer. Now, its data-crunching successors worldwide are indispensable to science, engineering, business, even the war on terrorism.
"Without having these computational resources, it would be a much darker science," McCammon said of his research at the center, located at the University of California San Diego. "You might be able to carry out experiments, but (you wouldn't) clearly understand what the molecules look like and what they're doing. It would be much, much more difficult to make progress."
Today's top supercomputer can do about 280 trillion calculations per second - at least 150,000 times faster than a quick consumer desktop. Jobs that would take an average laptop years to complete might take a supercomputer a few hours.
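For readers who want to check that arithmetic, a rough sketch follows; the desktop speed is an assumed figure chosen to match the comparison, not a number reported in the story:

```python
# Back-of-the-envelope check of the comparison above. The desktop figure
# is an assumed value for illustration, not a number from the article.
supercomputer = 280e12   # about 280 trillion calculations per second
desktop = 1.8e9          # assume a fast 2005 desktop sustains ~1.8 billion per second

speedup = supercomputer / desktop
print(f"Speedup: about {speedup:,.0f}x")          # roughly 156,000x

# Under that assumption, three hours on the supercomputer stands in for
# decades of nonstop work on the desktop:
print(f"About {3 * speedup / (24 * 365):,.0f} years of desktop time")
```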
Last fall, for example, Minster and other scientists used the center's largest supercomputer to simulate how a quake would shake the ground from Los Angeles to San Diego - information vital for designing buildings that are more resistant to seismic damage.
SAN DIEGO SUPERCOMPUTER CENTER
Established: Nov. 14, 1985
Employees: 400
Annual budget: $80 million
Monthly electric bill: $80,000
Top computer: IBM DataStar
Web site: www.sdsc.edu

GLOSSARY
Byte: A unit of computer information equal to one typed character.
Megabyte: A million bytes. A short novel has about this many bytes.
Terabyte: A trillion bytes, or about equal to the information printed on paper made from 50,000 trees.
SOURCES: San Diego Supercomputer Center; SearchStorage.com; Webopedia.com
The multimillion-dollar supercomputer, an IBM machine called DataStar, spit out 47 trillion bytes of information in less than five days. That's more than four times the amount found in the Library of Congress' printed collection.
The earthquake simulation was one of the largest ever conducted.
Supercomputing is at a turning point, many experts say. The United States holds the record for the fastest supercomputer - an IBM machine called Blue Gene/L that's housed at the Lawrence Livermore National Laboratory in Northern California. But China, Japan and other nations have launched ambitious programs designed to help them take the lead.
"U.S. pre-eminence in supercomputing, which is imperative for national security and indispensable for scientific discovery, is in jeopardy," three researchers wrote last summer in the journal Issues in Science and Technology.
Susan L. Graham, Marc Snir and Cynthia A. Patterson served on a National Research Council committee that produced the 2004 report, "Getting Up to Speed: The Future of Supercomputing."
They, along with other scientists and policy experts, agree on the essential role of supercomputers.
The machines are vital not only in cancer research and gene studies, but also in making sense of the flood of defense intelligence from Iraq and Afghanistan. They're needed to simulate nuclear explosions and monitor the nation's aging nuclear warheads. They contribute to the latest forecasts of global warming. And major retailers use them to keep production going so their shelves don't become empty.
Fran Berman, director of the San Diego Supercomputer Center, said one way to think about these high-end tools is to compare them to high-performance race cars.
"It's not easy for you and I to buy an Indy 500 car and to maintain that," she said. "That's where it's important to have government and large-scale investment in these kinds of computers. ... And a real concern from the scientific community right now is that (U.S.) leadership is really falling behind."
In November 2004, Congress passed legislation calling for an additional $165 million a year for research to develop new supercomputers. But President Bush's fiscal 2006 budget didn't allocate any funds. Instead, it requested budget cuts for supercomputing research at the Department of Energy.
Some supercomputer experts say Washington should not - and probably cannot - be the engine that drives innovation.
"I'm not convinced that the federal government, particularly with its current fiscal insolvency, is in the position to do much of anything," said Larry Smarr, director of the California Institute for Telecommunications and Information Technology at UCSD.
The San Diego Supercomputer Center operates on an $80 million annual budget funded by the National Science Foundation. It's one of three federally funded supercomputing centers in the United States designed to give researchers unfettered access to high-end computing.
The two others are the Pittsburgh Supercomputing Center at Carnegie Mellon University and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Additional supercomputers are run by private industry or federal labs, and their use is restricted.
The San Diego Supercomputer Center offers more than the $17.5 million DataStar. Its building also houses several other powerful computer systems.
CRISSY PASCUAL / Union-Tribune: The San Diego Supercomputer Center on the UCSD campus has enough long-term data storage to hold six petabytes of data. By comparison, all the academic libraries in the country contain about two petabytes.
While DataStar boasts 15.6 teraflops of computing power - that's 15.6 trillion calculations per second - the center's combined computing power totals nearly 28 teraflops. By comparison, a high-performing consumer computer might lumber along at a few billion calculations per second.
The center's long-term data storage system can hold six petabytes, or six quadrillion bytes. All the academic libraries in the nation contain about two petabytes.
About 400 people work at the center. They include technicians who run and maintain the biggest machines and computer scientists who team up with researchers to help write the complicated software programs needed for experiments. The center's staff also advises scientists nationwide on how to assemble "mini" supercomputers in their own labs by linking several desktop systems.
The San Diego site "does big data better than any of the other (supercomputer) centers, and that means moving it around quickly from one machine to another, storing it, retrieving it and then finally analyzing it," said Norman, an astronomer at UCSD.
Norman uses supercomputers nationwide to simulate the first several hundred million years of the universe so he can estimate how stars were born.
"Computers are powerful enough ... to allow me to develop models of the universe that are approaching the real complexity of the universe," he said. "It's not some watered-down, fuzzy, simplified version. It's the thing itself."
The supercomputing field faces daunting technical challenges in coming years.
Processing speeds have grown ever faster, but a computer's ability to move data from memory storage to its processor has lagged. Engineers are researching several ways to solve this "memory gap."
Meanwhile, software programs that run today's supercomputers have become increasingly difficult to write.
"In the old days, you were programming for a single processor," Norman said. "Now you're programming for thousands of processors (within a single supercomputer). There's an orchestration."
Despite the difficulties, Norman said he expects to continue riding the wave of ever more powerful computers.
He laughs when he thinks about the supercomputer he worked on 30 years ago as a graduate student at the Lawrence Livermore National Laboratory. That machine performed about 20 million calculations per second and had about 2 megabytes of memory.
Over his 30 years in supercomputing, Norman said, computing power has doubled about every 18 months.
"Just imagine that you knew in advance that you would have a doubling of your salary every 18 months over 30 years," he said.
"It changes the way you view the future. You plan for growth. You plan for getting more ambitious, and that's what's made this so much fun for me."