Meta Launches In-House AI Chips After Nvidia, AMD Deals

On Wednesday, Meta Platforms said it has developed four specially designed in-house AI chips to handle artificial intelligence workloads. The move is part of the company's strategy to scale up its data centers, and it underscores Meta's growing investment in AI architecture and high-performance computing.

The new silicon belongs to the Meta Training and Inference Accelerator (MTIA) family. The company unveiled the MTIA platform in 2023 and released a second-generation version in 2024. The chips are designed to improve the performance of AI systems across Meta's services.

According to Yee Jiun Song, Meta's vice president of engineering, designing AI chips in-house lets the company optimize for cost and performance. Taiwan Semiconductor Manufacturing Company, also known as TSMC, manufactures the chips.

By making its own silicon, Meta can reduce its reliance on outside chip suppliers. The strategy also diversifies its supply chain and cushions the company against price fluctuations in the semiconductor market.

Meta Expands AI Infrastructure With MTIA Chip Family

The first new chip, the MTIA 300, was deployed a few weeks ago. Song said it trains the smaller AI models that power Meta's ranking and recommendation systems.

Those systems determine what people see in apps such as Facebook and Instagram. The models also support Meta's advertising systems by personalizing content for billions of users.

Meta is developing three more AI chips: the MTIA 400, MTIA 450, and MTIA 500. The new processors will handle generative-AI inference workloads, such as producing images or videos from written text.

The company confirmed that the MTIA 400 completed testing and is moving toward deployment in its data centers. Meta expects the MTIA 450 and MTIA 500 to enter operation by 2027.

Big Tech Race to Build Custom AI Chips

Song described Meta's chip-development cadence as unusual for a company so new to chip design, noting that it is releasing new silicon at a rapid pace.

That rapid cadence reflects the expansion of Meta's AI infrastructure. The company continues to spend heavily on capital expenditures to grow its data-center capacity. Other large technology companies are also designing their own chips; Google, for example, is building in-house silicon to reduce its dependence on Nvidia and AMD GPUs.

These companies typically build application-specific integrated circuits (ASICs). Unlike general-purpose GPUs, ASICs are tailored to particular workloads, which makes them more efficient and cheaper to run.

Supply Chain Challenges and Future AI Plans

The rapid growth of AI is straining the global semiconductor supply chain. One problem is a shortage of high-bandwidth memory (HBM), which powers high-performance AI workloads.

Meta is keeping an eye on the supply situation, according to Song. He also said the company believes it has enough memory to support its current expansion plans. The main memory-chip suppliers are Samsung Electronics, SK Hynix, and Micron Technology, which tend to provide components through short-term contracts.

Meta has recently signed agreements to purchase millions of GPUs from Nvidia. It has also signed multiyear agreements with AMD for up to six gigawatts of GPUs. Taiwan Semiconductor, which has factories in Taiwan and Arizona, manufactures Meta's new chips. Most of the project's engineers are based in the United States.

Of Meta's 30 operational and planned data centers, 26 are in the U.S., underscoring the company's heavy investment in AI infrastructure at home.
