Meta laid out a roadmap of four MTIA accelerators to scale the data centers behind Facebook and Instagram, while still spending big on Nvidia and AMD chips.
What’s going on here?
Meta laid out a fast release plan for four homegrown AI accelerator chips through 2027, while budgeting up to $135 billion in capital spending this year to expand the data centers behind Facebook and Instagram.
What does this mean?
Meta’s Training and Inference Accelerator (MTIA) effort is a hedge against a basic AI headache: buying only off-the-shelf chips can be pricey, and supply can tighten when demand surges. Its first chip is already handling ranking and recommendations, and the next versions are timed roughly every six months. The roadmap focuses on inference – the “answer users fast” phase – with later chips designed to run AI features at huge scale. Training, the heavier lift used to build frontier models, still looks like Meta’s weak spot, so it’s continuing to lean on Nvidia and AMD for the most demanding work. Meta also says it’s now designing more of the surrounding system, including multi-rack setups and liquid cooling, which shows the bottleneck is shifting from just chips to power, heat, and deployment speed.
Why should I care?
For markets: Custom silicon is turning into a second supply chain.
This is the same playbook other hyperscalers use: build some in-house chips to lower long-run costs and reduce vendor reliance, without cutting purchases overnight. Meta’s $115 billion to $135 billion capex guidance suggests plenty of near-term spend on third-party accelerators and networking, supporting demand for incumbent suppliers even as competition slowly builds.
Zooming out: Data centers are becoming integrated products, not just buildings.
Once power and cooling become hard constraints, the “product” is the whole stack – chip, server, rack, and thermal design – not just the building. Meta’s reliance on partners like Broadcom and foundries like TSMC also underlines how advanced manufacturing stays concentrated. If inference keeps growing faster than training, more platforms may favor purpose-built accelerators to keep AI features responsive and costs predictable.
—
Originally Posted March 11, 2026 – Meta’s Homegrown AI Chips Are Coming In A Rapid Cadence
Disclosure: Interactive Brokers Third Party
Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.
This material is from Finimize and is being posted with its permission. The views expressed in this material are solely those of the author and/or Finimize and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.