Everyone is talking about chips again, thanks to A.I. and a rosy forecast from Nvidia. The news sent investors flocking to A.I.-related stocks, adding some $300 billion in value last month.

But all this optimism shouldn't distract us from one of the chip industry's key problems: Chips have stopped providing real jumps in computing power, right as we see an explosion of power-hungry applications like generative A.I.

Historically, the computing power of chips has doubled every two years, in what came to be known as Moore's Law. But we haven't seen that jump in performance for a while. Now, a microprocessor's performance increases by only about 10% to 15% each year, and the actual increase in speed for a given software application is often much smaller. And the process of rearchitecting software for these chips can be expensive and buggy.
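To see how much that slowdown compounds, here is a rough back-of-the-envelope sketch in Python. The growth rates come from the figures above; the ten-year horizon is purely an illustrative assumption:

```python
# Back-of-the-envelope comparison of chip performance growth regimes.
# Assumptions: Moore's Law pace of doubling every two years vs. the
# 10%-15% annual gains cited above; the 10-year horizon is illustrative.

years = 10

moores_law = 2 ** (years / 2)   # doubling every two years -> ~32x
slow_low = 1.10 ** years        # 10% per year -> ~2.6x
slow_high = 1.15 ** years       # 15% per year -> ~4.0x

print(f"Moore's Law pace: {moores_law:.0f}x over {years} years")
print(f"10%/year pace:    {slow_low:.1f}x")
print(f"15%/year pace:    {slow_high:.1f}x")
```

Over a decade, the old regime delivers roughly 32 times the performance, while today's pace delivers only about 3 to 4 times.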

This slowdown could not have come at a worse time. Chips are simply unable to keep up with some of the most computation-intensive applications yet seen. The size of models used for tasks like computer vision, natural language processing, and speech processing has increased by 15 times in just two years, an order of magnitude more than the increase in the computing power of chips over the same period. The most advanced machine learning models, like those that power GPT-4 and ChatGPT, have grown by 75 times, again far outpacing the graphics processing units (GPUs) that underlie them.
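A similarly rough sketch, again using only the figures cited above, shows why that gap amounts to an order of magnitude:

```python
# Gap between model-size growth and chip performance growth over two
# years, using the figures cited above: 15x model growth vs. 10%-15%
# annual gains in chip performance.

model_growth = 15.0    # models grew ~15x in two years
chip_low = 1.10 ** 2   # two years at 10%/year -> ~1.21x
chip_high = 1.15 ** 2  # two years at 15%/year -> ~1.32x

print(f"Models: {model_growth:.0f}x; chips: {chip_low:.2f}x-{chip_high:.2f}x")
print(f"Gap: roughly {model_growth / chip_high:.0f}x-{model_growth / chip_low:.0f}x")
```

Two years of 10% to 15% annual gains yields only about a 1.2- to 1.3-fold improvement, leaving models roughly 11 to 12 times ahead of the hardware.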

The gap between what's needed and what's provided can only be filled by more chips, and that's making computing expensive for everyone. It's now so expensive to build advanced machine learning models that they are the exclusive domain of rich, powerful corporations.

Why are chips lagging so far behind?

There are technical challenges. It's hard to make chips smaller than they already are: transistors, at their thinnest dimension, are only a few atoms thick.

But that's a partial explanation at best. Chips haven't kept up with the needs of contemporary applications for quite some time, and even on the best of days, improvements in chip speed have lagged improvements in software algorithms.

A better reason is that the chip industry has not been all that innovative, especially recently. Microprocessors have worked in more or less the same way for 80 years, even as the devices have gotten smaller. We haven't changed how we use computer memory in decades. And the GPUs that power advanced machine learning haven't changed much in the past 10 years either.

Slowing miniaturization is exposing the industry's lack of disruptive ideas. No chip company appears on recent lists of the most innovative companies. And the unchanging ranks at the top of the industry suggest an oligopoly.

Innovation needs an ecosystem where companies, typically startups, want to experiment in the hopes of a breakout success.

The chip industry doesn't have many of those experiments.

First, the cost of experimentation is extremely high. It often takes $10 million to $30 million just to get to a first product, and another $70 million to $100 million to scale up. Such extraordinarily large sums discourage risk-taking, entrepreneurship, and funding. As a result, few chip startups are formed, and the few that do get funding come from teams of seasoned chip veterans. That recipe leads to incrementalism, not disruption.

Second, the gestation period for new ideas is too long. It typically takes a few years before the first product sample is created, and it may take just as long again to see revenue. This long horizon, again, discourages both innovators and investors, who typically prefer to fail fast.

Third, the chip industry is too consolidated, having shrunk from 160 companies in 2010 to 97 in 2020. A lack of buyers constrains the size of exits, further discouraging investors.

Chips attract less than 1% of total U.S. venture capital investment, despite the emergence of A.I., the Internet of Things, electric cars, and 5G.

Finally, the chip industry may not be attracting talent. Today's STEM graduates see better prospects in industries with much faster growth (including, perhaps ironically, A.I.). The chip industry also has a branding problem: even chip industry executives agree that the sector has a weak brand. Young employees and future innovators would rather tinker with software than struggle with new hardware.

The U.S. government must use its CHIPS Act funding as a lever to make the chip industry more welcoming to innovation.

To bring down the cost and time a new idea requires, the government should require recipients of government money to invest in more agile hardware-design methodologies, open-source tools, and open standards. It should also require recipients to make commonly used hardware components widely accessible at low cost, so that other companies can combine them with innovative components to create new hardware platforms cheaply and quickly.

Academic CHIPS Act beneficiaries should be required to modernize chip design curriculums to emphasize accessibility and impact. 

The government should allocate some National Science and Technology Council funding to develop a shared, subsidized infrastructure for design and fabrication using mature, trailing-edge technologies, to reduce the cost of producing proof-of-concept hardware.

And, finally, it can encourage the passage of right-to-repair legislation to help stimulate a culture of tinkering with hardware.

The chip industry has managed to mask its struggle with disruptive innovation for some time. And as miniaturization comes to a close, or at least loses its effectiveness, it's time to address the innovation problem head-on. Technological progress, quite literally, depends on it.

Rakesh Kumar is a Professor in the Electrical and Computer Engineering department at the University of Illinois and author of Reluctant Technophiles: India's Complicated Relationship with Technology.

The opinions expressed in Fortune.com Commentary pieces are solely the views of their authors, and do not reflect the opinions and beliefs of Fortune.

