On March 14, Tesla CEO Elon Musk confirmed that the company’s long-teased Terafab project, a massive artificial intelligence (AI) chip fabrication initiative, will launch in seven days, placing the expected start date around March 21, 2026.
The announcement highlights Tesla’s strategy to gain tighter control over the AI hardware powering its vehicles and robotics ambitions.
TL;DR
- Tesla’s Terafab AI chip factory is expected to launch around March 21, 2026.
- The project aims to produce 100 to 200 billion AI and memory chips annually.
- Tesla’s AI5 chip will power Full Self-Driving, Optimus robots, and Dojo supercomputers.
- The factory could cost roughly $20 billion and target cutting-edge 2nm chip production.
Tesla Prepares To Launch Terafab To Secure AI Chip Supply
The Terafab initiative represents a vertically integrated semiconductor manufacturing facility designed to handle logic chip production, memory, and advanced packaging in one location.
Tesla first formally confirmed the project during its January 28, 2026 earnings call, where Musk explained that the company expects chip supply shortages within three to four years, making it necessary to build its own fabrication infrastructure.
However, the idea dates back even further. During Tesla’s annual general meeting the previous year, Musk warned that relying solely on existing suppliers would not provide enough chips to meet Tesla’s long-term AI needs.
“Even when we extrapolate the best-case scenario for chip production from our suppliers, it's still not enough,” Musk said at the time.
Inside Tesla’s AI5 Chip Strategy For Autonomous Systems
At the center of Terafab is Tesla’s next-generation AI5 chip, a processor designed to dramatically increase the computing capabilities behind Tesla’s autonomous technologies.
The chip is expected to deliver 40 to 50 times more compute performance and nine times more memory compared to Tesla’s current AI4 hardware. These gains are critical for improving Full Self-Driving software, enabling more advanced autonomy in Tesla vehicles, and supporting the company’s expanding robotics and AI ecosystem.
Tesla’s Optimus humanoid robots and the company’s Dojo supercomputer are also expected to benefit from the new chips.
The facility itself could reach an enormous scale. Estimates suggest Terafab may produce between 100 billion and 200 billion AI and memory chips annually while targeting around 100,000 wafer starts per month.
Tesla is also reportedly targeting 2-nanometer process technology, placing the project among the most advanced semiconductor manufacturing efforts currently in commercial development.
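The reported figures can be sanity-checked with some rough arithmetic. The following sketch (not from the article, and using an assumed usable area of roughly 70,000 mm² for a standard 300 mm wafer) works out how many dies per wafer the claimed annual output would require:

```python
# Back-of-envelope check of the reported Terafab figures.
# From the article: ~100,000 wafer starts per month,
# 100-200 billion AI and memory chips per year.
# Assumption (not from the article): ~70,000 mm^2 usable
# area on a 300 mm wafer.

WAFER_STARTS_PER_MONTH = 100_000
wafers_per_year = WAFER_STARTS_PER_MONTH * 12  # 1.2 million wafers

for chips_per_year in (100e9, 200e9):
    dies_per_wafer = chips_per_year / wafers_per_year
    die_area_mm2 = 70_000 / dies_per_wafer
    print(f"{chips_per_year:.0e} chips/yr -> "
          f"{dies_per_wafer:,.0f} dies per wafer "
          f"(~{die_area_mm2:.2f} mm^2 each)")
```

At 100 billion chips per year, the figures imply on the order of 80,000 dies per wafer, i.e. very small dies of well under 1 mm² each, which suggests the headline count is dominated by small memory dies rather than large AI processors.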
Terafab May Turn Tesla Into A Major AI Chip Producer
If Tesla successfully builds and operates Terafab at scale, the implications could extend far beyond the company’s own products.
Only a handful of organizations globally have the capability to manufacture advanced AI silicon at leading-edge nodes. By developing an in-house fabrication facility, Tesla could secure its supply chain while potentially positioning itself as a future supplier or licensor of AI chips to other industries.
Tesla has already been working with semiconductor giants such as TSMC and Samsung on AI chip production. Musk has also previously suggested the company could explore discussions with Intel regarding potential manufacturing collaboration.
“You know, maybe we'll, we'll do something with Intel,” Musk said previously, while noting that no formal agreement had been signed.
For Tesla, the strategy is about scale and independence, a point Musk summarized when discussing why the project is necessary.
“I think we may have to do a Tesla terafab. It's like giga but way bigger. I can't see any other way to get to the volume of chips that we're looking for. So I think we're probably going to have to build a gigantic chip fab. It's got to be done,” he said.
With Terafab potentially costing around $20 billion and producing hundreds of billions of chips annually, Tesla’s next big factory may not build cars at all. Instead, it could manufacture the silicon powering the next generation of artificial intelligence.