Ondrej Burkacky: Chip drought to last years, with recession a factor

Ben Wodecki, Jr. Editor

October 14, 2022

5 Min Read


The global chip shortage is approaching its third year, fueled by increased demand for semiconductors in a world that is fast transforming digitally.

And as the pandemic wanes, several governments are stepping into the gap to invest in chip plants as they anticipate even greater demand ahead, especially for advanced chips needed for AI.

The U.S., EU, China and South Korea are all vying for market dominance – dangling carrots and threatening sticks to get the chipmakers onside.

The EU, for example, has offered incentives to get businesses building fabrication plants, or fabs, in the bloc. So too has the U.S., although the Biden government has blocked manufacturers from exporting certain chips and chipmaking equipment to China.

To make sense of the situation, AI Business recently sat down with Ondrej Burkacky, global co-lead of McKinsey's semiconductors practice. He sketched the landscape of the ongoing chip shortage and explained the extent to which it is impacting AI performance.

AI Business asked him: Is the crisis over? When will consumers be able to purchase a PlayStation 5 again?

The following is an edited transcript of that conversation. You can listen to the full chat in the latest episode of the AI Business Podcast, or wherever you get your podcasts.

AI Business: How are you viewing the wider interest from global governments in semiconductors?

Ondrej Burkacky: The shortage has created a lot of discussion around semiconductors that was not there before. This is also true for several OEMs (original equipment manufacturers) that are not suffering from a shortage. They know that they need semiconductors, but semiconductors were something that was always there, and people relied on performance improving year over year and costs decreasing year over year.

Now companies and governments realize that disruptions in that supply chain have a huge ripple effect. We analyzed that one wafer actually (relates) to six jobs in the U.S. automotive industry and four jobs in the European automotive industry. There is also a huge dependency for the wider economy. And there is a lot more pace on these programs, and also more scale in supporting semiconductors, both from a manufacturing perspective and from an IP research perspective.

AI Business: How impactful are micro trends on the situation at large? From things like crypto bros buying all the graphics cards to industrial action in places like Korea?

Burkacky: For any big technology or trend in any industry, there is a very high likelihood that you need more semiconductors because of it. When we look at autonomous driving, more semiconductors are needed for more driver assistance. With the electrification of cars, more semiconductors are needed in an electric powertrain than in a combustion powertrain.

So this means (the demand for) car semiconductors per vehicle (will grow) to 2x from now to 2030. Industry 4.0 (also needs) more semiconductors. 5G in smartphones? 20% more silicon space is needed in a 5G phone compared to a 4G phone. And crypto is also one area where you need specific ASICs for crypto mining.

… In the bigger scheme of things, yes, there is a certain increase in demand. But it's not that significant compared to the overall trends. We saw a jump in growth in 2020 year over year. … People still thought the semiconductor industry was going to grow by 5%, and it ultimately grew by 9%. That is almost double the growth. You cannot attribute it only to some cryptocurrency chipsets being built. The technology transition we were on just got put on steroids and went much faster than anticipated.

AI Business: Has the chip shortage impacted AI performance increases?

Burkacky: The chips that are in the shortest supply are in the so-called mature nanometer range, like 40 to 65 nanometers. … These more general-purpose controllers, a compute unit, are not used for AI because their compute power is not optimized for high-performance compute; they strike a well-balanced power-versus-compute ratio. We also have shortages of some power semiconductor voltage converters.

Where we have the least shortage, comparably, is on the leading-edge side: for all the data center compute chips, high-end AI accelerators and graphics cards, we have less of a shortage now. … There the concern is less about the shortage and more about performance. In the end, for any high-load AI application, you need the best performance you can get, because that significantly shrinks the processing time, and the value of everything improves.

The trend there is more around going from generic GPUs for AI to designing custom chipsets that are very specifically designed for one task: bitcoin mining, autonomous driving, speech recognition. And with those specific designs, I would expect that it's less the capacity situation that is going to limit the way people can come out with these chipsets. Rather, we are most likely running into a shortage of skilled designers. Before, 1,000 companies would use the same chip and do the AI programming on top. If all of those 1,000 companies go for an OEM chipset design, you need many more chip designers, and we might be short of those. So I would be less afraid of a supply shortage in terms of manufacturing on the leading-edge side than of a designer shortage to design all these ASICs specific to AI applications.

AI Business: When do you foresee an end to the chip shortage?

Burkacky: The answer for me is more in the details than in generally saying this is going to be over in 12 months or two years. It depends on what you look at. The leading-edge shortage is actually not really a shortage. Even if you cannot get your latest and greatest equipment, often enough you cannot get it because the power supply is not available, not because the high-end graphics processor is not available.

And looking at that, I actually don't see the shortage going away to the extent that we would declare victory in the next two to three years. It's more after that, because it takes time to build additional capacity and additional fabs to provide the chipsets, and I don't see the demand (going away) yet. If a real economic downturn comes, for sure, then we might have a problem.

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
