
A new working paper attempts to quantify the importance of more powerful computers in improving outcomes in society. In it, the researchers analyzed five areas where computation is essential, including weather forecasting, oil exploration, and protein folding (important for drug discovery). Credit: MIT

Q&A: MIT’s Neil Thompson on Computing Power and Innovation

Innovation in many industries has been fueled by rapid increases in microchip speed and power, but the future trajectory of that progress may be in jeopardy.

Gordon Moore, co-founder of Intel, predicted that the number of transistors on a chip would double every one to two years, a prediction now known as Moore’s Law. Since the 1970s that prediction has largely held: processing power has doubled approximately every two years, while better and faster microchips have become more affordable.

For decades, this exponential increase in computing power spurred innovation. At the turn of the 21st century, however, researchers began to worry that Moore’s Law was slowing down: with current silicon technology, there are physical limits on how small transistors can get and how many can be crammed into an inexpensive microprocessor.

Neil Thompson, a researcher at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, and his research group set out to measure the value of more powerful computers in improving outcomes across society. In a recent working paper, they analyzed five areas where computing is essential, including weather forecasting, oil exploration, and protein folding (important for drug discovery). Research assistants Gabriel F. Manso and Shuning Ge are co-authors of the paper.

They found that computing power’s contribution to these improvements ranged from 49 to 94 percent. In weather forecasting, for example, increasing computing power by a factor of 10 improves three-day predictions by one-third of a degree.
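Taken at face value, that weather example implies a roughly log-linear relationship: each tenfold increase in compute buys a fixed improvement in forecast accuracy. Here is a minimal sketch of that arithmetic; the function name and the degrees-per-tenfold constant are illustrative assumptions drawn from the example above, not the paper’s model.

```python
import math

def forecast_improvement(compute_multiplier, degrees_per_tenfold=1/3):
    """Improvement in three-day forecast accuracy (degrees) under a
    log-linear model: each 10x in computing power buys a fixed gain."""
    return degrees_per_tenfold * math.log10(compute_multiplier)

print(forecast_improvement(10))    # 1/3 of a degree from a 10x increase
print(forecast_improvement(1e12))  # ~4 degrees from a trillionfold increase
```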

However, computing progress is slowing, which could have significant effects on the economy and society. Thompson discussed this research and the implications of Moore’s Law’s demise in an interview with MIT News.

Q: How did you approach this analysis and quantify the impact of IT on different areas?

A: Quantifying the impact of IT on actual outcomes is tricky. The most common approach is to look at how much companies spend on computing and see how that correlates with bottom-line results. But spending is a difficult metric because it only partially reflects the value of the computing power being purchased. Today’s computer chip may cost the same as last year’s, yet be much more powerful; economists try to adjust for that quality change, but it is hard to pin down exactly what the adjustment should be. For our project, we measured computing power more directly, for example by examining the capabilities of the systems used when protein folding was first done with deep learning. By looking directly at capabilities, we get more accurate measurements and thus better estimates of the influence of computing power on performance.
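As an illustration of what measuring computing power directly can look like in practice, one could regress a performance metric on measured compute rather than on spending. Below is a minimal sketch on synthetic data; the log-log specification and all numbers are assumptions for illustration, not the paper’s actual estimation.

```python
import numpy as np

# Synthetic example: performance grows with computing power, plus noise.
rng = np.random.default_rng(0)
log_compute = np.linspace(0, 12, 50)          # e.g., log10 of FLOPs per run
log_performance = 0.4 * log_compute + rng.normal(0.0, 0.3, 50)

# Ordinary least squares of log performance on log compute:
# the slope estimates how strongly performance scales with compute.
slope, intercept = np.polyfit(log_compute, log_performance, 1)
print(f"estimated scaling exponent: {slope:.2f}")
```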

Q: How do more powerful computers improve weather forecasting, oil exploration and protein folding?

A: The short answer is that the increase in computing power has had a huge effect on these areas. In weather forecasting, for example, we found that the computing power used for these models has increased by a factor of a trillion. That puts into perspective both how much computing power has grown and how we have exploited it. This is not a case of someone simply taking an old program and running it on a faster computer; instead, users must constantly redesign their algorithms to take advantage of 10 or 100 times more computing power. There is still plenty of human ingenuity devoted to improving performance, but our results show that much of that ingenuity is focused on harnessing ever more powerful computing engines.
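For a sense of scale, a trillionfold increase is about 40 doublings, which at a Moore’s-Law pace of one doubling every two years would correspond to roughly 80 years of chip progress alone; the rest of the growth came from buying bigger machines and more of them. A quick back-of-the-envelope check:

```python
import math

increase = 1e12                   # trillionfold increase in computing power
doublings = math.log2(increase)   # ~39.9 doublings
years = doublings * 2             # at one doubling every two years
print(f"{doublings:.1f} doublings, {years:.0f} years at Moore's-Law pace")
```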

Oil exploration is an interesting case because it gets more difficult over time: the easy wells have already been drilled, so what remains is harder to find and reach. Oil companies fight this trend with some of the world’s largest supercomputers, using them to interpret seismic data and map subsurface geology. This helps them drill in exactly the right place.

Using computing to improve protein folding is a long-standing goal because it is crucial to understanding the three-dimensional shapes of these molecules, which in turn determine how they interact with other molecules. In recent years, AlphaFold systems have made remarkable breakthroughs in this area. What our analysis shows is that these improvements are well predicted by the massive increases in computing power they use.

Q: What were the biggest challenges in carrying out this analysis?

A: When you look at two trends that develop over time, in this case performance and computing power, one of the biggest challenges is disentangling what is causation from what is merely correlation. We can answer this question, in part, because in the areas we studied, companies invest huge amounts of money, so they do a lot of testing. In weather modeling, for example, they don’t just spend tens of millions of dollars on a new machine and hope it works. They run an evaluation and find that running a model for twice as long improves performance, and then they buy a system powerful enough to do that calculation in a shorter time so they can use it operationally. That gives us a lot of confidence. But there are also other ways of looking at causality. For example, we see that there were a number of big jumps in the computing power NOAA (the National Oceanic and Atmospheric Administration) used for weather forecasting, and when they bought a substantially bigger computer and installed it all at once, performance really jumped.
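One simple way to look for that kind of causal signal is to test whether performance jumps at the dates of known compute upgrades rather than drifting smoothly. A minimal before/after sketch on synthetic data (the upgrade date and skill values are invented for illustration; NOAA’s actual records are not used here):

```python
import numpy as np

# Synthetic forecast-skill series with a compute upgrade at index 60.
rng = np.random.default_rng(1)
skill = np.concatenate([
    70 + rng.normal(0.0, 1.0, 60),  # before the upgrade
    74 + rng.normal(0.0, 1.0, 40),  # after: a step change in skill
])

upgrade = 60
jump = skill[upgrade:].mean() - skill[:upgrade].mean()
print(f"skill jump at the upgrade: {jump:.1f} points")
```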

Q: Would this progress have been possible without an exponential increase in computing power?

A: This is a tricky question because there are many different inputs: human capital, traditional capital and also computing power, and all three change over time. You could say that a trillionfold increase in computing power surely has the greatest effect, and that’s a good hunch, but you also have to account for diminishing marginal returns. Going from no computer to one computer is a huge change, but going from 100 computers to 101 brings much less additional gain. So there are two competing forces: big increases in computing on one hand, and diminishing marginal benefits on the other. Our research shows that while we already have tons of computing power, it is growing so quickly that it explains a large share of the improved performance in these areas.
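Those two competing forces can be made concrete with a toy model in which output grows with the logarithm of the number of computers, so each additional machine helps less than the last. The log form is an illustrative assumption, not the paper’s production function:

```python
import math

def output(n_computers):
    """Toy diminishing-returns model: output grows with log(1 + n)."""
    return math.log1p(n_computers)

# Going from 0 to 1 computer is a big gain; 100 to 101 is tiny.
print(output(1) - output(0))       # ~0.693
print(output(101) - output(100))   # ~0.010
# But a 10x jump in compute still moves the needle:
print(output(1000) - output(100))  # ~2.29
```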

Q: What are the implications of slowing down Moore’s Law?

A: The implications are quite worrisome. As computing improves, it drives better weather forecasting and progress in the other areas we studied, but it also improves countless other areas we did not measure that are nonetheless essential parts of our economy and society. If that engine of improvement slows down, all of those downstream effects slow down too.

Some might disagree, arguing that there are many paths to innovation, and that if one slows down, others will compensate. On some level that’s true. For example, we are already seeing increased interest in designing specialized computer chips to offset the end of Moore’s Law. But the problem is the magnitude of these effects: the gains from Moore’s Law were so large that, in many application areas, other sources of innovation will not be able to compensate.

Reference: “The Importance of (Exponentially More) Computing Power” by Neil C. Thompson, Shuning Ge and Gabriel F. Manso, June 28, 2022, arXiv.
DOI: 10.48550/arXiv.2206.14007