In partnership with

Find your customers on Roku this Black Friday

As with any digital ad campaign, the important thing is to reach streaming audiences who will convert. To that end, Roku’s self-service Ads Manager stands ready with powerful segmentation and targeting options. After all, you know your customers, and we know our streaming audience.

Worried it’s too late to spin up new Black Friday creative? With Roku Ads Manager, you can easily import and augment existing creative assets from your social channels. We also have AI-assisted upscaling, so every ad is primed for CTV.

From there, you can easily set up A/B tests to flight different creative variants and Black Friday offers. If you’re a Shopify brand, you can even run shoppable ads directly on-screen so viewers can purchase with just a click of their Roku remote.

Bonus: we’re gifting you $5K in ad credits when you spend your first $5K on Roku Ads Manager. Just sign up and use code GET5K. Terms apply.

Hey, Josh here. Check out these two trending stories from today.

When Robots Meet Satellites: The Wild Physics of Scaling AI

While most of us were doom-scrolling, two companies dropped announcements that reveal something crucial about where AI is actually heading. And it's not where you think.

XPeng unveiled their IRON humanoid robot—173cm tall, 70kg, with 22 degrees of freedom in each hand. Google announced Project Suncatcher—a plan to launch AI data centers into orbit by 2027. On the surface, these seem completely unrelated. But they're solving the exact same problem from opposite ends: AI has gotten so hungry for compute and power that we're hitting physical limits on Earth.

Let's break it down.

The Robot Reality Check

XPeng's CEO did something refreshingly honest. He admitted they tried replacing human assembly workers with IRON for a year—just for the "easiest" task of tightening screws. The result? Too expensive, too fragile, constant repairs. His conclusion: humanoid robots won't meaningfully replace factory workers for 3-5 years, and won't be safe in your home for 5-10 years.

This matters because while Tesla and others are hyping 2026 mass production, XPeng, which actually tried this, is pumping the brakes. They're starting with retail greeters and receptionists in their own stores. Not sexy, but smart.

The tech is genuinely impressive: IRON runs three different AI models simultaneously (vision-language-task, vision-language-action, and vision-language-model) on three proprietary Turing chips delivering 2,250 TOPS of processing. It's powered by an all-solid-state battery—the first in humanoid robotics. The hands can each lift 3kg with precise finger control. During the demo, rehearsal staff literally couldn't tell if a human or robot was walking on stage.

But here's the kicker: this level of AI sophistication requires massive computing infrastructure. Training models like IRON's multi-brain system needs data centers consuming megawatts of power. Which brings us to space.

The Orbital Power Move

Google's Project Suncatcher isn't science fiction—it's engineering pragmatism disguised as moonshot thinking. The pitch is elegantly simple: solar panels in space generate 8x more power than on Earth because there's no atmosphere, no clouds, no night. Position satellites in sun-synchronous orbit and they get sunlight nearly 24/7.
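That 8x figure can be roughly reproduced with a back-of-envelope comparison. The solar constant below is a known physical value; the ~170 W/m² ground-average insolation is my own assumption for a typical global average once night, clouds, and atmospheric losses are factored in:

```python
# Rough sanity check of the "8x more power in space" claim.
SOLAR_CONSTANT = 1361.0  # W/m^2 arriving above the atmosphere (known physical value)
GROUND_AVERAGE = 170.0   # W/m^2 assumed global-average surface insolation
                         # (night, clouds, and atmosphere already averaged in)

advantage = SOLAR_CONSTANT / GROUND_AVERAGE
print(f"space panels collect roughly {advantage:.0f}x more energy per square meter")
```

With those assumed inputs the ratio lands right around 8, consistent with the pitch. A sunnier desert site would narrow the gap; a cloudy high-latitude one would widen it.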

The company already tested their Trillium TPU chips in a particle accelerator simulating five years of space radiation. They survived—even the sensitive memory components only failed after three times the expected dose. They've built a lab prototype achieving 1.6 terabits per second between satellites using laser links. They're partnering with Planet Labs to launch two test satellites by early 2027.

Google's analysis suggests that by 2035, when launch costs drop below $200/kg, space-based data centers become economically competitive with ground facilities. The constraint? You need satellites flying in extremely tight formation—kilometers or less apart—to achieve data-center-level bandwidth without melting your transmitters.

Why This Actually Matters

By 2035, AI data centers in the US alone are projected to need 123 gigawatts of power, up from 4 gigawatts in 2024. That's a roughly 30-fold increase in a decade. Microsoft is literally sitting on GPU stockpiles because they don't have electricity to run them. This isn't theoretical scarcity—it's operational reality.
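To put that growth in perspective: the 4 GW and 123 GW figures come from the projection above, while the implied annual growth rate is my own back-of-envelope arithmetic:

```python
# Back-of-envelope on the projected US AI data center power demand.
# Figures from the text: ~4 GW in 2024, ~123 GW projected by 2035.
start_gw, end_gw = 4, 123
years = 2035 - 2024  # 11 years

multiple = end_gw / start_gw                    # overall growth multiple
cagr = (end_gw / start_gw) ** (1 / years) - 1   # implied compound annual growth

print(f"{multiple:.1f}x overall")      # the "30-fold" increase
print(f"~{cagr:.0%} per year implied")
```

Sustaining compound growth anywhere near that rate is exactly why the industry is looking past terrestrial grids.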

Meanwhile, the humanoid robot market is projected to hit $81 billion by 2035, with China leading deployment (610 robotics investment deals totaling $7 billion in the first nine months of 2025 alone). XPeng is committing $13.8 billion over two decades to robot development.

What's fascinating is the convergence: XPeng is adapting autonomous driving AI for robots. Tesla's doing the same with Optimus. The bet is that a single powerful AI can master physical navigation whether on wheels or legs. But training that AI? That requires the compute infrastructure Google's trying to build.

Here's the synthesis: We're not just building smarter robots—we're rebuilding the entire tech stack from chips to satellites to make embodied AI possible at scale. XPeng's realistic timeline (3-10 years) reflects hard lessons from autonomous vehicles, where full autonomy took longer than anyone predicted. Google's 2027 prototype and 2035 viability targets come from actual radiation testing and orbital mechanics modeling.

The companies racing toward AGI discovered something uncomfortable: you can't just make AI smarter—you need to fundamentally rethink where and how you generate the power to run it.

Turns out the biggest constraint on artificial intelligence isn't artificial at all. It's physics.
