The Importance of the Federal Government's Role in High-Performance Computing
Sig Hecker
Siegfried S. Hecker is the Director of Los Alamos National Laboratory, Los Alamos, New Mexico. For more complete biographical information, please refer to Dr. Hecker's presentation in Session 1.
As I look at the importance of the federal government's role, I think I could sum it all up by saying that Uncle Sam has to be a smart businessman with a long-term outlook because his company's going to be around for a long time. Someone has to take that long-term outlook, and I think clearly that's where the federal government has to come in.
What I want to discuss today is specifically the role of Los Alamos National Laboratory (Los Alamos, or the Laboratory) and how we'd like to respond to the High Performance Computing Initiative. I think most of you know that what we are interested in first and foremost at Los Alamos is solutions. We have applications, and we want the best and the fastest computers in order to be able to do our jobs better, cheaper, and faster and, hopefully, to be able to do things that we had not been able to do before.
Our perspective comes from the fact that we want to do applications; we're users of this computing environment. So we've always considered it imperative to be at the forefront of computing capability.
Let me review briefly the important roles that I think we've played in the past and then tell you what we'd like to do in the future. It's certainly fair to say that we've played the role of the user—a sophisticated,
demanding user. In that role we have interfaced and worked very, very closely with the computing vendors over the years, starting early on with IBM and then going to Control Data Corporation, to Cray Research, Inc., to Thinking Machines Corporation, and then of course all along with Sun Microsystems, Inc., Digital Equipment Corporation, and so forth.
Out of necessity, in a number of cases we've also played the role of the inventor—in the development of the MANIAC, for instance, right after the development of the ENIAC. Our people felt that we had to actually create the capabilities to be able to solve the problems that we had.
Later on, we invented things such as the common file system. The high-performance parallel interface, better known as HIPPI, is also a Los Alamos invention. That's the sort of product that's come about because we're continually pushed by the users for this sort of capability.
We need new algorithms to solve problems better and smarter. So things like the lattice gas techniques for computational fluid dynamics were basically invented here by one of our people, along with some French collaborators. Also, we are very proud of the fact that we helped to get three of the four NSF supercomputer centers on line by working very closely with them early on to make certain that they learned from our experiences.
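The idea behind lattice gas methods is that fluid behavior emerges from very simple, fully parallel rules: particles hop between lattice sites and collide while conserving mass and momentum. As a rough illustration only, here is a minimal sketch of the simple square-lattice (HPP-style) variant in Python with NumPy; the Los Alamos work used a more sophisticated hexagonal lattice, and this toy version is not that model.

```python
import numpy as np

def hpp_step(cells):
    """One update of a toy HPP-style lattice gas: collision, then streaming.

    cells: boolean array of shape (4, H, W); channel i holds the particles
    moving in direction i (0 = east, 1 = west, 2 = north, 3 = south).
    """
    e, w, n, s = cells
    # Collision: a site holding exactly a head-on pair (E+W, or N+S)
    # rotates that pair by 90 degrees; all other sites pass through.
    ew = e & w & ~n & ~s
    ns = n & s & ~e & ~w
    e, w = (e & ~ew) | ns, (w & ~ew) | ns
    n, s = (n & ~ns) | ew, (s & ~ns) | ew
    # Streaming: every particle moves one site along its direction
    # (periodic boundaries, via np.roll).
    e = np.roll(e, 1, axis=1)
    w = np.roll(w, -1, axis=1)
    n = np.roll(n, -1, axis=0)
    s = np.roll(s, 1, axis=0)
    return np.stack([e, w, n, s])

# Random initial state; both rules conserve the particle count.
rng = np.random.default_rng(0)
cells = rng.random((4, 64, 64)) < 0.2
before = int(cells.sum())
for _ in range(100):
    cells = hpp_step(cells)
assert int(cells.sum()) == before
```

Because every site updates by the same local boolean rule, the method maps naturally onto massively parallel hardware, which is precisely what made it attractive for purpose-built architectures.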
We also introduced companies like General Motors to supercomputing before they bought their computers in 1984. We were working with them and running the KIVA code for combustion modeling. As Norm Morse points out (see Session 10), we have 8000 users. At least half of them are from outside the Laboratory.
The role that we've played has been made possible by our feeling that we have to be the best in the defense business. Particularly in our mainline business, nuclear weapons design, we felt we needed those capabilities because the problems were so computationally intense, so complex, and so difficult to test experimentally. We were fortunate for many, many years that first the Atomic Energy Commission and then the Department of Energy (DOE) had the sort of enlightened management to give us the go-ahead to stay at the forefront and, most importantly, to give us the money to keep buying the Crays and Thinking Machines and all of those good machines.
What we have proposed for the Laboratory is an expanded national charter under the auspices of this High Performance Computing Initiative. First of all, our charter has already significantly expanded beyond nuclear weapons R&D, which represents only about a third of our
activities. The remaining two-thirds comprises many other defense-related activities and many civilian activities.
Today, in terms of applications, we worry about some of the same grand challenges that you worry about, such as environmentally related problems—for instance, the question of global climate change. The Human Genome Initiative is basically an effort that started at Los Alamos and at Lawrence Livermore National Laboratory because we have the computational horsepower to look at how one might map the 3 billion base pairs in your DNA. We also have other very interesting challenges in problems like designing a free-electron laser essentially from scratch with supercomputers.
In response to congressional legislation earlier this year, I outlined a concept called Collaborative R&D Centers that I'd like to see established at Los Alamos or at least supported at places like Los Alamos. There are several aspects of this proposed center I would like to mention. For one thing, we'd like to make certain that we keep the U.S. at the leading edge of computational capabilities. For another, we intend to make the high-performance computing environment available to greater numbers of people in business, industry, and so forth.
But there are five particular things I'd like to see centers like this do. First of all, continue this very close collaboration with vendors. For instance, at Los Alamos we're doing that now, not only with Cray but also with IBM, Thinking Machines, and many others.
Second, continue to work, perhaps even closer, with universities to make sure that we're able to inject the new ideas into the future computing environment. An example of that might be the work we've done with people like Al Despain at the University of Southern California (see Session 4) as to how one takes the lattice gas concepts and constructs a computational architecture to take advantage of that particular algorithm. Despain has thought about how to take one million chips and construct them in such a fashion that you optimize the problem-solving capabilities.
As part of this collaboration with universities, we could provide a mechanism for greater support, through DOE and Los Alamos, of graduate students doing on-campus research, with provisions for work at the Laboratory itself. We do a lot with graduate students now. In fact, we have about 400 graduate students here during the course of a summer in many, many disciplines. I think in the area of computational sciences, we will really boost student participation.
The third aspect is to have a significant industrial user program to work even more closely with U.S. industry—not only to make supercomputing available to them but also to promote appreciation of what supercomputing and computational modeling can do for their business. So we'd like to have a much greater outreach to U.S. industry. I agree with the comments of other presenters that supercomputing in industry is very much underutilized. I think one can do much better.
The fourth aspect would be to help develop enabling technologies for tomorrow's innovations in computing—technologies such as photolithography (or at least the next generations of photolithography), superconducting microelectronics, optical computers, neural networks, and so forth. In the Strategic Defense Initiative (SDI) program, where we've done a lot of laser development, we'd like to provide essentially the next "light bulb" for photolithography—that "light bulb" being a free-electron laser we've developed for SDI applications.
The benefit of the free-electron laser is that you can do projection lithography, that you can tolerate the power loss because you start with extremely high power, and that you can tune the wavelength. We think that we can probably develop a free-electron laser, tune it down to the order of 10 nanometers, and get feature sizes down to 0.1 micron—perhaps 0.05 microns—with that sort of a light bulb. It would take a significant development to do that, but we think it's certainly possible. We're working right now with a number of industrial companies to see what the interest level is so that we might be able to get beyond what we think can be done with X-ray synchrotron proximity lithography. It's that type of technology development that, again, I think would be a very important feature of what laboratories like ours could do.
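The connection between source wavelength and feature size can be sketched with the standard Rayleigh criterion for projection optics, feature ≈ k₁ × wavelength / NA. The k₁ and numerical-aperture values below are illustrative assumptions for this back-of-the-envelope check, not figures from the talk:

```python
# Rayleigh criterion for projection lithography:
#   minimum feature ~ k1 * wavelength / NA
# k1 and NA here are assumed, illustrative values.
def min_feature_nm(wavelength_nm, k1=0.5, numerical_aperture=0.05):
    """Estimated minimum printable feature size, in nanometers."""
    return k1 * wavelength_nm / numerical_aperture

# With a 10 nm source and these assumed optics, features come out
# around 100 nm (0.1 micron); doubling the numerical aperture to 0.1
# pushes the estimate toward 50 nm (0.05 micron).
print(min_feature_nm(10))                            # 100.0
print(min_feature_nm(10, numerical_aperture=0.1))    # 50.0
```

The sketch makes the point in the talk concrete: a tunable ~10 nm light bulb puts 0.1 micron, and plausibly 0.05 micron, features within reach of projection optics.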
The fifth aspect would be a general-user program to make certain that we introduce, as much as is possible and feasible, some of these capabilities to the local communities, schools, businesses, and so forth. This collaborative R&D would have a central organization in a place like Los Alamos, but there would be many arms hooked together with very-high-speed networks. It would also be cost-shared with industry. The way that I see this working, the government invests its money in us to provide this capability. Industry invests its money and shows its interest by providing its own people to interact with us.
These are just a few remarks on what I think a laboratory like Los Alamos can do to make certain that this country stays at the leading edge of computing and that high-power computing is made available to a broader range of users in this country.