Author: Jesse
This is the latest take from @farzyness, an independent analyst with 360,000 followers who has been invested in Tesla since 2012 and led a team at Tesla from 2017 to 2021.
One person owns a battery company, an AI company, and a rocket company, and they all support each other.
I've been thinking about this for months, and honestly, I don't see how Musk loses. This isn't a die-hard fan's take; it's a structural one. The Tesla-xAI-SpaceX triangle is evolving into something unprecedented: an industrial-grade, synergistic, cash-generating flywheel. That sounds convoluted, but it's an accurate description.
Let me break down what's going on here, because most people look at these companies in isolation, when the real story is the connections between them.
1. The starting point of the flywheel: energy
Tesla manufactures batteries, and in massive quantities. They deployed 46.7 gigawatt-hours (GWh) of energy storage systems in 2025, a 48.7% year-over-year increase. Their 50 GWh factory in Houston will begin production this year. Total planned capacity is 133 GWh per year. This business has a gross margin of 31.4%, compared to only 16.1% for the automotive business. This seemingly "boring" energy storage business generates almost twice the profit per dollar of revenue compared to the automotive business.
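As a sanity check on that "almost twice" claim, here is a minimal arithmetic sketch using only the two margins quoted above:

```python
# Gross profit earned per dollar of revenue, using the margins cited above.
storage_margin = 0.314  # energy storage gross margin (31.4%)
auto_margin = 0.161     # automotive gross margin (16.1%)

ratio = storage_margin / auto_margin
print(f"Storage earns {ratio:.2f}x the gross profit per revenue dollar")
# prints: Storage earns 1.95x the gross profit per revenue dollar
```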
Why does this matter? Because xAI just purchased $375 million worth of Tesla Megapacks to power Colossus, the world's largest AI training facility; 336 Megapacks have already been deployed. These batteries provide backup power and demand-response capability for a system that houses 555,000 GPUs and draws over 1 gigawatt of electricity (enough to power roughly 750,000 homes).
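To put the "1 gigawatt for 750,000 homes" figure in context, the implied average draw per home works out as follows (illustrative arithmetic only, using the two figures from the claim above):

```python
# Implied average electrical draw per home, from the figures cited above.
facility_draw_w = 1e9    # Colossus draw: 1 gigawatt
homes_powered = 750_000  # homes cited as equivalent

kw_per_home = facility_draw_w / homes_powered / 1_000
print(f"~{kw_per_home:.2f} kW average per home")
# prints: ~1.33 kW average per home
```

That is in line with the typical average US household draw of a bit over 1 kW, so the comparison is roughly right.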
2. Breaking free from Nvidia: Chip self-sufficiency
Tesla doesn't just sell batteries; it is also developing its own AI chips.
Currently, Nvidia monopolizes AI training hardware, controlling approximately 80% of the market. All major AI labs (OpenAI, Google, Anthropic, Meta) are vying for Nvidia's quota. The H100 and now the Blackwell chips are the bottleneck for the entire industry. Jensen Huang's pricing power is something most monopolists dream of.
If you were Elon Musk and wanted to build the world's largest AI system, what would you do? You can't rely on Nvidia forever. That's your Achilles' heel: a lever someone else holds, especially when you plan to power hundreds of millions of robots over the next 10 to 20 years.
Incidentally, Musk's stated plan for Tesla is to eventually build as many robots as there are humans.
Tesla's AI5 chip is slated for release sometime between the end of this year and 2027. Musk claims it will be the world's most powerful inference chip, particularly in cost per unit of compute; in other words, extremely efficient.
Tesla has signed a $16.5 billion foundry contract with Samsung for its AI6 chip. The key point is that Musk stated the AI6 is designed for "Optimus robots and data centers." This means that Tesla products and xAI products will share the same chip.
Nvidia currently wins in training, but inference is the long-term profit driver. Training happens once; inference happens every time someone uses the model. If you're running millions of Tesla cars, millions of Optimus bots, and billions of Grok queries, inference is where the real compute demand lies.
By building their own inference chips, Tesla and xAI decouple from Nvidia's training-centric stronghold. It's like bypassing a fortified front and flanking the enemy.
3. Space-based AI computing
Musk mentioned "space-based AI computing" in Tesla's Dojo 3 roadmap, and the restart of the Dojo 3 project serves precisely this vision. When you do the math, the seemingly crazy idea makes sense.
If you wanted to deploy 1 terawatt of AI computing power in space annually (on the scale of global AI infrastructure), according to Musk, at current chip costs, you would need more money than the total amount of currency in existence. The Nvidia H100, priced between $25,000 and $40,000, is simply not economically feasible.
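A rough back-of-the-envelope illustrates the point. Assuming each H100 draws about 700 W (the published board power of the SXM version) and costs $30,000 (the midpoint of the range above), the silicon alone for 1 TW of deployed compute runs into the tens of trillions of dollars, before launch, cooling, or networking:

```python
# Chip cost alone for 1 TW of compute at today's Nvidia prices.
# Assumptions: ~700 W per H100 (published board power), $30k midpoint price.
H100_TDP_W = 700
H100_PRICE_USD = 30_000
TARGET_POWER_W = 1e12  # 1 terawatt of deployed compute

chips_needed = TARGET_POWER_W / H100_TDP_W
chip_cost_usd = chips_needed * H100_PRICE_USD

print(f"{chips_needed / 1e9:.1f} billion chips, ~${chip_cost_usd / 1e12:.0f} trillion")
# prints: 1.4 billion chips, ~$43 trillion
```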
But if you have chips that are extremely cheap, purpose-built for inference, mass-producible, and highly energy-efficient, the math changes. Tesla's stated goal is to build AI chips with "the lowest-cost silicon." That is the key to enabling large-scale space-based computing.
Without affordable chips, space AI remains a fantasy; with affordable chips, it becomes an inevitable reality.
Nvidia-backed competitor StarCloud trained its first AI model in space last December, proving the concept is feasible. The question now is not whether it works, but who can deploy it at scale.
Imagine this: SpaceX sends orbital data centers into low Earth orbit via Starship, each rocket carrying 100 to 150 tons. These data centers run models developed by xAI, use Tesla-designed chips, and are powered by solar energy and Tesla batteries. Free solar energy, zero-cost cooling. Inference results are transmitted directly to Tesla cars and Optimus robots on Earth via Starlink.
4. The closed loop of data and connectivity
SpaceX already has nearly 10,000 Starlink satellites in orbit and has been authorized to launch another 7,500. They have 6 million direct-connect mobile phone customers. The V3 satellite launched this year has a downlink capacity of 1 terabit per second (1Tbps), 10 times that of current models.
This is where the flywheel really starts to spin:
xAI builds the models (Grok 3 has 3 trillion parameters, Grok 4 topped global benchmarks, and Grok 5, with 6 trillion parameters, is due in Q1 2026).
These models are integrated into Tesla vehicles. Grok has been available in-car since July 2025 for conversation and navigation, running on the same Tesla chips that power the vehicle's Autopilot.
Grok will become the "brain" of the Optimus robot. Tesla plans to produce 50,000 to 100,000 Optimus units this year and reach 1 million units by 2027.
Put it all together: xAI builds the models, Tesla builds the chips, Tesla builds the robots that do the work, Tesla builds the batteries that power them, SpaceX provides global connectivity and access to space, xAI trains on all the data Tesla and SpaceX generate, and commands flow down from solar-powered satellites.
5. An insurmountable moat
A moat like this builds itself.
Tesla has 7.1 billion miles of FSD driving data, more than 50 times that of Waymo. Real-world data trains better models, better models improve vehicle performance, and better vehicles collect even more data.
X (formerly Twitter): xAI has exclusive access to the real-time human data generated by roughly 600 million monthly active users. This differs from YouTube or search data; it is raw, unstructured, real-time human thought. When Grok hallucinates, it can be corrected against real-time consensus faster than anyone else could manage.
What could competitors use to catch up?
Google has vertical integration (TPU chips, Gemini, YouTube), but Waymo is too small and lacks a launch vehicle and real-time social data stream.
Microsoft has Copilot and Azure, but relies on OpenAI and has no physical hardware, no space infrastructure, and no autonomous driving data.
Amazon has AWS, custom chips, and logistics robots, but lacks consumer AI products with large-scale adoption, a fleet of cars, and launch capabilities.
Nvidia monopolizes the training layer but lacks the physical layer: no cars or factory robots collecting data, no global satellite network. They sell the chips but don't control the endpoints.
To compete with Musk, you would need to simultaneously found or acquire five different top-tier companies, and his advantage compounds every day.
In conclusion
Most analysts treat Tesla, xAI, and SpaceX as separate investments, but that's a complete misconception. The value lies not in any single part, but in how they complement each other.
xAI is valued at $250 billion, SpaceX at approximately $800 billion and seeking a $1.5 trillion IPO, and Tesla at $1.2 trillion. The total enterprise value exceeds $2 trillion, and this doesn't even include the premium for synergies.
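For reference, the stacked standalone valuations cited above sum as follows (no synergy premium included):

```python
# Combined standalone valuations, in billions of USD (figures cited above).
valuations = {"xAI": 250, "SpaceX": 800, "Tesla": 1_200}
total_billions = sum(valuations.values())
print(f"Combined: ${total_billions / 1_000:.2f} trillion")
# prints: Combined: $2.25 trillion
```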
Each link strengthens the others:
If Tesla succeeds, xAI gets more training data.
If xAI succeeds, Tesla's cars and robots get smarter.
If SpaceX succeeds, the whole system gets global coverage.
If the energy business succeeds, electricity costs fall across every facility.
If the chip strategy succeeds, they are free of their dependence on Nvidia.
If Optimus succeeds, the total addressable market (TAM) for labor exceeds $40 trillion annually.
Am I missing something? If you can spot any flaws I haven't seen, I'd love to hear them. Because after observing for so many years, I really can't find a single one.
