Meta and Google announce new in-house AI chips, creating a trillion-dollar question for Nvidia – Fortune

Posted: April 16, 2024 at 10:46 am

Hardware is emerging as a key AI growth area. For Big Tech companies with the money and talent to do so, developing in-house chips helps reduce dependence on outside designers such as Nvidia and Intel while also allowing firms to tailor their hardware specifically to their own AI models, boosting performance and saving on energy costs.

The in-house AI chips that Google and Meta just announced pose one of the first real challenges to Nvidia's dominant position in the AI hardware market. Nvidia controls more than 90% of the AI chip market, and demand for its industry-leading semiconductors is only increasing. But if Nvidia's biggest customers start making their own chips instead, its soaring share price, up 87% since the start of the year, could suffer.

"From Meta's point of view it gives them a bargaining tool with Nvidia," Edward Wilford, an analyst at tech consultancy Omdia, told Fortune. "It lets Nvidia know that they're not exclusive, [and] that they have other options. It's hardware optimized for the AI that they are developing."

Why does AI need new chips?

AI models require massive amounts of computing power because of the huge amount of data required to train the large language models behind them. Conventional computer chips simply aren't capable of processing the trillions of data points AI models are built upon, which has spawned a market for AI-specific computer chips, often called cutting-edge chips because they're the most powerful devices on the market.

Semiconductor giant Nvidia has dominated this nascent market: The wait list for Nvidia's $30,000 flagship AI chip is months long, and demand has pushed the firm's share price up almost 90% in the past six months.

And rival chipmaker Intel is fighting to stay competitive. It just released its Gaudi 3 AI chip to compete directly with Nvidia. AI developers, from Google and Microsoft down to small startups, are all competing for scarce AI chips, limited by manufacturing capacity.

Why are tech companies starting to make their own chips?

Both Nvidia and Intel can produce only a limited number of chips because they and the rest of the industry rely on Taiwanese manufacturer TSMC to actually assemble their chip designs. With only one manufacturer solidly in the game, the manufacturing lead time for these cutting-edge chips is multiple months. That's a key factor that led major players in the AI space, such as Google and Meta, to resort to designing their own chips. Alvin Nguyen, a senior analyst at consulting firm Forrester, told Fortune that chips designed by the likes of Google, Meta, and Amazon won't be as powerful as Nvidia's top-of-the-line offerings, but that could benefit the companies in terms of speed: they'll be able to produce them on less specialized assembly lines with shorter wait times, he said.

"If you have something that's 10% less powerful but you can get it now, I'm buying that every day," Nguyen said.

Even if the native AI chips Meta and Google are developing are less powerful than Nvidia's cutting-edge AI chips, they could be better tailored to the companies' specific AI platforms. Nguyen said that in-house chips designed for a company's own AI platform could be more efficient and save on costs by eliminating unnecessary functions.

"It's like buying a car. Okay, you need an automatic transmission. But do you need the leather seats, or the heated massage seats?" Nguyen said.

"The benefit for us is that we can build a chip that can handle our specific workloads more efficiently," Melanie Roe, a Meta spokesperson, wrote in an email to Fortune.

Nvidia's top-of-the-line chips sell for about $25,000 apiece. They're extremely powerful tools, and they're designed to be good at a wide range of applications, from training AI chatbots to generating images to developing recommendation algorithms such as the ones on TikTok and Instagram. That means a slightly less powerful but more tailored chip could be a better fit for a company such as Meta, for example, which has invested in AI primarily for its recommendation algorithms, not consumer-facing chatbots.

"The Nvidia GPUs are excellent in AI data centers, but they are general purpose," Brian Colello, equity research lead at Morningstar, told Fortune. "There are likely certain workloads and certain models where a custom chip might be even better."

The trillion-dollar question

Nguyen said that more specialized in-house chips could have added benefits by virtue of their ability to integrate into existing data centers. Nvidia chips consume a lot of power, and they give off a lot of heat and noise, so much so that tech companies may be forced to redesign or move their data centers to integrate soundproofing and liquid cooling. Less powerful native chips, which consume less energy and release less heat, could solve that problem.

AI chips developed by Meta and Google are long-term bets. Nguyen estimated that these chips took roughly a year and a half to develop, and it'll likely be months before they're implemented at a large scale. For the foreseeable future, the entire AI world will continue to depend heavily on Nvidia (and, to a lesser extent, Intel) for its computing hardware needs. Indeed, Mark Zuckerberg recently announced that Meta was on track to own 350,000 Nvidia chips by the end of this year (the company is set to spend around $18 billion on chips by then). But movement away from outsourcing computing power and toward native chip design could loosen Nvidia's chokehold on the market.

"The trillion-dollar question for Nvidia's valuation is the threat of these in-house chips," Colello said. "If these in-house chips significantly reduce the reliance on Nvidia, there's probably downside to Nvidia's stock from here. This development is not surprising, but the execution of it over the next few years is the key valuation question in our mind."
