Elon and Tesla's Wild New AI Plan
Effortless Tutorial Video Creation with Guidde
Transform your team’s static training materials into dynamic, engaging video guides with Guidde.
Here’s what you’ll love about Guidde:
1️⃣ Easy to Create: Turn PDFs or manuals into stunning video tutorials with a single click.
2️⃣ Easy to Update: Update video content in seconds to keep your training materials relevant.
3️⃣ Easy to Localize: Generate multilingual guides to ensure accessibility for global teams.
Empower your teammates with interactive learning.
And the best part? The browser extension is 100% free.
Hey, Josh here. Check this wild story out.
Elon's New Hustle: Your Tesla as a Mobile Server Farm
Listen, Elon Musk just dropped another wild idea during Tesla's Q3 earnings call, and this one's actually kind of brilliant—if you ignore, you know, physics and economics and reality.
The pitch: Tesla's got millions of cars sitting around doing nothing 95% of the time. Each one packs serious AI horsepower. So why not turn that idle fleet into a distributed computing network? Get 100 million Teslas out there, each running about a kilowatt of inference capability, and boom—you've got 100 gigawatts of computing power. "With power and cooling taken care of," as Musk helpfully added.
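Quick sanity check: that headline number is just fleet size times per-car draw. Here's the back-of-envelope in Python, taking Musk's own figures at face value (the 100 million cars and the roughly one kilowatt per car are his claims, not verified specs).

```python
# Back-of-envelope for the headline claim: fleet size times per-car inference draw.
cars = 100_000_000        # Musk's hypothetical future fleet (his figure, not a forecast)
kw_per_car = 1.0          # "about a kilowatt" of inference per vehicle (his figure)

total_kw = cars * kw_per_car
total_gw = total_kw / 1_000_000   # 1 GW = 1,000,000 kW
print(f"Fleet inference capacity: {total_gw:.0f} GW")   # -> 100 GW
```

So the arithmetic checks out. Everything else is where it gets messy.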
It's AWS on wheels. A data center you can drive. Your Model 3, moonlighting as a server.
Here's Why This Matters (And Why It Probably Won't Happen)
The backdrop here is insane infrastructure costs. Building AI data centers isn't cheap: large facilities run $500+ million, and the industry is projected to need somewhere between $5 trillion and $8 trillion in new capacity by 2030. Musk's own xAI is burning through cash expanding its Memphis supercomputer to 1.2 gigawatts of capacity, a project requiring $150-300 million and the power output of a small natural gas plant.
So yeah, finding alternative computing infrastructure? That's a real problem worth solving.
The thing is, distributed computing through parked cars introduces problems that make traditional data centers look simple:
Your Tesla doesn't have enterprise-grade connectivity. It's got 4G, maybe 5G, maybe Wi-Fi if you're parked at home. Data centers run high-speed interconnects like InfiniBand with microsecond-scale latency. Your car in a parking garage? Good luck maintaining a stable connection for real-time inference tasks.
Then there's the battery situation. Running continuous inference workloads—potentially drawing a kilowatt for hours—means either draining your battery or staying plugged in constantly. Tesla batteries lose 15% capacity after 200,000 miles under normal use. What happens when you're running server tasks 23 hours a day? And who pays for the electricity? The degradation? The cooling?
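To put that drain in perspective, here's a rough sketch of what one day of continuous inference would mean for a single parked car. The 75 kWh pack size and $0.15/kWh electricity rate are my illustrative assumptions, not Tesla figures.

```python
# What continuous inference would mean for one parked car, per day.
# The 75 kWh pack and $0.15/kWh rate are illustrative assumptions, not Tesla figures.
inference_kw = 1.0          # the "about a kilowatt" figure from the pitch
idle_hours_per_day = 23     # roughly the "parked 95% of the time" claim
pack_kwh = 75.0             # assumed usable battery capacity
price_per_kwh = 0.15        # assumed residential electricity price, USD

energy_kwh = inference_kw * idle_hours_per_day    # ~23 kWh per day
pack_fraction = energy_kwh / pack_kwh             # ~31% of the pack, every day
daily_cost = energy_kwh * price_per_kwh           # ~$3.45 per car per day

print(f"{energy_kwh:.0f} kWh/day, about {pack_fraction:.0%} of the pack, ~${daily_cost:.2f} in electricity")
```

That's roughly a third of a battery cycled every single day, before anyone answers who's footing the power bill.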
The math gets brutal fast. Even at Tesla's optimistic production target of 3 million vehicles annually, hitting 100 million cars would take 20+ years from now. The upcoming AI5 chip that makes this theoretically possible won't hit mass production until 2026 or 2027. We're talking a 2030s timeline at the earliest.
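The fleet-growth arithmetic looks something like this. Today's fleet size is a rough assumption on my part; the 3 million per year is the optimistic target cited above.

```python
# How long to reach a 100-million-car fleet at the optimistic production rate.
# Today's fleet size (~7 million) is a rough assumption for illustration.
target_fleet = 100_000_000
current_fleet = 7_000_000        # assumed
annual_production = 3_000_000    # the optimistic target cited above

years = (target_fleet - current_fleet) / annual_production
print(f"Roughly {years:.0f} more years of production")   # ~31 years
```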
What's Really Going On?
This feels less like a genuine product roadmap and more like Musk doing what Musk does: painting a vision that makes Tesla's asset base—that massive, growing vehicle fleet—look more valuable to investors. It's not dishonest exactly, but it's strategic storytelling.
The technical challenges aren't just engineering problems to solve. They're fundamental architectural mismatches. Successful distributed computing projects like SETI@home worked because they tolerated latency and node failures. Commercial AI inference needs reliability guarantees that cars parked in random locations simply can't provide.
Could Tesla use its fleet for some distributed computing tasks? Sure. Latency-tolerant workloads, supplementary capacity, edge processing—there are applications that might work by the mid-2030s. But the grand vision of 100 gigawatts powering xAI's next-generation models?
That's Elon building in public again. And honestly? The hype might be worth more than the hardware ever will be.

