
OpenAI Just Exposed Their Biggest Enterprise Users

In partnership with

The Gold standard for AI news

AI keeps coming up at work, but you still don't get it?

That's exactly why 1M+ professionals working at Google, Meta, and OpenAI read Superhuman AI daily.

Here's what you get:

  • Daily AI news that matters for your career - Filtered from 1000s of sources so you know what affects your industry.

  • Step-by-step tutorials you can use immediately - Real prompts and workflows that solve actual business problems.

  • New AI tools tested and reviewed - We try everything to deliver tools that drive real results.

  • All in just 3 minutes a day

OpenAI Just Exposed Their Biggest Users

OpenAI did something wild last week: they basically revealed their entire customer leaderboard.

At DevDay 2025, the company handed out physical awards to the 30 customers who've each burned through over 1 trillion tokens on their AI models. Yes, you read that right. Trillion. With a T.

Think of tokens as the "compute currency" of AI—every ChatGPT query, every AI-generated email, every code suggestion eats tokens. A trillion tokens is roughly equivalent to processing 750 million pages of text. And these 30 companies have each done that.
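The pages estimate checks out under common rules of thumb. A quick sanity check, assuming roughly 0.75 English words per token and about 1,000 words per printed page (both ballpark conventions, not OpenAI figures):

```python
# Sanity-check the "trillion tokens ≈ 750 million pages" estimate.
# Both constants below are rough rules of thumb, not official figures.
TOKENS = 1_000_000_000_000   # one trillion tokens
WORDS_PER_TOKEN = 0.75       # ~0.75 English words per token (assumed)
WORDS_PER_PAGE = 1_000       # ~1,000 words per printed page (assumed)

pages = TOKENS * WORDS_PER_TOKEN / WORDS_PER_PAGE
print(f"{pages:,.0f} pages")  # 750,000,000 pages
```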

But here's what makes this fascinating: the list isn't what you'd expect. Sure, there are AI startups. But the real story? More than half are boring, scaled companies you'd never associate with bleeding-edge AI. We're talking about language learning apps, telecom providers, and e-commerce platforms quietly becoming AI superpowers.

This wasn't a leak. This was OpenAI flexing.

The Trophy System: Silicon Valley Meets Little League

OpenAI didn't just announce these customers—they literally gave them trophies.

Physical awards. With precision-milled aluminum "tokens" embedded in resin. Three tiers: silver for 10 billion tokens, black for 100 billion, blue for 1 trillion.

It's simultaneously the most Silicon Valley thing ever and also kind of brilliant marketing. Because when you hand out plaques commemorating billion-token milestones, you're doing three things:

  1. Demonstrating infrastructure capacity: "Look how much AI we can handle without breaking"

  2. Creating FOMO: Every CTO not on this list is now asking their team "why aren't we using this much AI?"

  3. Normalizing massive AI spend: A trillion tokens costs anywhere from the low six figures on cheaper models to several million dollars on flagship models. OpenAI just made that seem like table stakes.

The whole thing has "we're the only game in town and we know it" energy.

Number One Will Shock You (Just Kidding, It's Duolingo)

The top user? Duolingo.

The green owl that guilt-trips you about Spanish lessons is OpenAI's biggest customer.

And when you think about it for 30 seconds, it makes perfect sense. Duolingo isn't just using AI for chatbots. They've essentially rebuilt their entire product around GPT-4:

  • Every language lesson can now be generated automatically

  • Conversation practice with AI tutors

  • Personalized feedback on your terrible pronunciation

  • Roleplay scenarios in the "Max" subscription tier

Their CEO said generative AI lets them create content for all subjects "close to 100% automatically." Translation: they've turned what used to require armies of linguists and content creators into an AI assembly line.

The beautiful part? Most users have no idea how much AI is under the hood. They just think Duolingo got really good at teaching languages.

This is the pattern across the entire list: the companies winning with AI are the ones hiding it best.

The Sleeper Giants No One's Talking About

Let's talk about OpenRouter at #2.

Never heard of them? Neither had most people. They're an AI infrastructure company that routes API calls between different AI models. Basically, they're the plumbing company for AI. And they're processing an estimated 2 trillion tokens per month.

They're not building sexy consumer apps. They're not getting TechCrunch headlines. They're just quietly moving more AI traffic than almost anyone on Earth.

Then there's Indeed at #3. The job site. They're apparently running so much AI that they're burning through trillions of tokens. What are they doing? Probably:

  • Auto-generating job descriptions

  • Matching resumes to positions

  • Powering chatbots for millions of job seekers

  • Analyzing hiring trends

All the boring, high-volume stuff that actually moves the needle in a business with hundreds of millions of users.

Salesforce comes in at #4, which makes sense—they're integrating AI into literally every product through "Einstein GPT." When you have millions of businesses using your CRM to send billions of emails and customer interactions, the token count adds up fast.

The Developer Tools Takeover

Here's a pattern that jumped out: 5 of the 30 companies (16.7%) are developer tools.

CodeRabbit, Sider AI, Warp.dev, JetBrains, and Cognition's Devin are all burning massive token volumes helping developers write code.

This tells you something important: AI's first real killer use case might not be chatbots or image generation—it's helping developers write more code to build more AI.

It's beautifully recursive. And it means the velocity of software development is about to go parabolic. When the tools developers use to build software are themselves getting 10x better with AI, you get a compounding effect.

JetBrains—the company behind IntelliJ and PyCharm, used by millions of developers—is quietly integrating AI so deeply that they're a top-30 token consumer globally. That's not a side feature. That's a core product transformation.

The Startups Building Truly Wild Sh*t

Not everything on this list is boring enterprise software. Some companies are pushing into genuinely sci-fi territory:

Cognition built Devin, which they call "the first AI software engineer." It's not a code assistant—it's an autonomous agent that can complete entire software engineering tasks on its own. It has a 13.86% success rate on real-world engineering problems, which sounds low until you realize other AI systems are at like 2%.

They're not trying to help developers. They're trying to replace them (or at least, do the boring parts).

Harvey is transforming legal work by training custom models on 10 billion tokens worth of case law. Over 3,500 lawyers at firms like Allen & Overy have tested it. The implications are insane: associates who used to bill 60 hours reviewing discovery documents can now do it in 6.

One magic circle law firm partner told me (off the record) that Harvey's already reducing junior associate headcount needs. The legal industry's about to get Uber'd.

Perplexity is reimagining search entirely. Instead of serving you 10 blue links, it processes massive token volumes to give you actual answers with citations. They're on this list because every search query consumes way more tokens than a traditional Google search—but delivers way more value.

The Trillion-Token Business Model Problem

Let's talk economics for a second.

OpenAI charges $2.50 per million input tokens and $10.00 per million output tokens for GPT-4o (their most popular model). Run the math:

  • 1 trillion tokens of pure GPT-4o usage lands somewhere between $2.5 million and $10 million, depending on the input/output mix

  • Blend in cheaper models like GPT-4o-mini ($0.15 per million input tokens) and the floor drops to the low six figures

  • Either way, these companies are spending six or seven figures annually at minimum, and some are probably spending multiples of that
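For concreteness, here's a back-of-envelope cost sketch. The per-million prices for the flagship tier are the GPT-4o list prices cited above; the 80/20 input/output split and the cheaper-tier prices are assumptions for illustration:

```python
def token_cost(total_tokens: int, input_share: float,
               in_price: float, out_price: float) -> float:
    """Dollar cost given $/1M-token prices and an input/output split."""
    input_tokens = total_tokens * input_share
    output_tokens = total_tokens - input_tokens
    return input_tokens / 1e6 * in_price + output_tokens / 1e6 * out_price

TRILLION = 1_000_000_000_000

# GPT-4o list prices ($2.50 in / $10.00 out) with an assumed 80/20 split:
flagship = token_cost(TRILLION, 0.80, 2.50, 10.00)  # ~$4.0 million

# A cheaper small-model tier (~$0.15 in / $0.60 out -- illustrative):
budget = token_cost(TRILLION, 0.80, 0.15, 0.60)     # ~$240,000
```

The sixteenfold gap between the two figures is why the input/output mix and model choice matter more than the raw token count.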

This creates a fascinating dynamic. Companies are essentially locked into OpenAI's pricing. If OpenAI decided to double prices tomorrow, every company on this list would either:

  1. Eat the cost increase

  2. Rearchitect their entire product

  3. Go out of business

This isn't like AWS, where you can somewhat easily migrate between cloud providers. These companies have built their core products around GPT-4's specific capabilities. Switching to Claude or Gemini isn't just a config change—it's months of re-engineering.

OpenAI knows this. That's partly why they're handing out trophies. They're saying: "Look how committed these companies are. You should be too."

What The List Reveals About AI's Real Winners

The conventional narrative about AI goes like this: scrappy startups will use AI to disrupt slow-moving incumbents.

This list suggests the opposite is happening: 53.3% of trillion-token users are scaled, established companies.

Why? Because AI at massive scale requires:

  • Massive existing user bases to monetize

  • Cash flow to absorb six-figure AI bills

  • Distribution to actually get the AI features in front of users

  • Institutional knowledge of complex domains

Duolingo can use a trillion tokens because they have 40 million+ daily active users all doing language exercises. A startup trying to compete with Duolingo might have better AI, but they don't have 40 million users to feed it to.

Shopify just announced merchants can sell directly through ChatGPT conversations. That's only possible because Shopify has over 1 million merchants and deep technical integration capabilities. A two-person startup can't pull that off.

The AI revolution isn't being won by AI companies. It's being won by companies with distribution who are adding AI.

The Trillion-Token Threshold as Status Symbol

Here's what's really happening: OpenAI just created a new form of Silicon Valley status.

Burning through a trillion tokens is the new "unicorn valuation" or "IPO" as a signal that you've made it. It means:

  • You're building something people actually use at scale

  • You can afford massive AI infrastructure costs

  • You're betting your product roadmap on AI

Watch what happens over the next 12 months. Every VC pitch deck will have a slide about "token consumption trajectory." Companies will start announcing when they hit token milestones. LinkedIn will be flooded with posts like "Humbled to announce we've processed our billionth token 🙏".

Because that's what this list really is: a leaderboard. And humans are status-seeking creatures who love leaderboards.

The Dirty Secret About Token Optimization

One thing the list doesn't show: how efficiently these companies are using tokens.

Processing a trillion tokens could mean:

  • Scenario A: You have 10 million users doing incredibly valuable AI-powered tasks

  • Scenario B: You have sh*tty prompt engineering and burn 10x more tokens than necessary

Token optimization is becoming a core competency. Companies are hiring "LLM Engineers" whose entire job is to reduce token consumption while maintaining quality. Techniques include:

  • Prompt caching (reusing repeated instructions)

  • Model cascading (using cheap models for easy tasks, expensive models for hard ones)

  • Output length limits

  • Strategic fine-tuning
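Model cascading in particular is easy to picture. A minimal sketch, with made-up model names and prices and a deliberately crude difficulty heuristic (real routers use classifiers or confidence scores, not keyword matching):

```python
# Hypothetical two-tier cascade: cheap model by default, escalate only
# when a crude "hard task" signal fires. Names and prices are
# illustrative, not OpenAI's actual catalog.
CHEAP = ("small-model", 0.15)       # ($ per 1M input tokens, assumed)
STRONG = ("flagship-model", 2.50)

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per English token.
    return max(1, len(text) // 4)

def route(prompt: str) -> tuple[str, float]:
    """Return (model_name, estimated_input_cost) for a prompt."""
    hard = any(k in prompt.lower() for k in ("prove", "legal brief", "diagnose"))
    name, price = STRONG if hard else CHEAP
    return name, estimate_tokens(prompt) / 1e6 * price

print(route("Summarize this support ticket in one sentence.")[0])  # small-model
print(route("Draft a legal brief on data privacy.")[0])            # flagship-model
```

Even this toy version routes most traffic to the tier that costs a sixteenth as much, which is the whole point of the technique.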

The companies that figure out token efficiency will have a massive cost advantage. Think about it: if you can deliver the same AI experience with 30% fewer tokens, you just cut your AI bill by roughly 30%. In a world where everyone's using the same OpenAI models, token efficiency might be the only technical moat.

What This Means for Everyone Else

If you're running a tech company and you're not on this list, you should be asking hard questions:

  1. Why aren't we using AI at scale? Your competitors clearly are.

  2. Are we treating AI as a feature or as infrastructure? The companies on this list rebuilt their products around AI.

  3. Do we have a token budget? If not, you're flying blind.

The gap between companies on this list and everyone else is going to widen. Fast.

Because here's the thing about AI at scale: it gets better with more usage. More usage = more data = better fine-tuning = better product = more usage. It's a flywheel.

Duolingo has now processed so many language learning interactions through GPT-4 that they understand how to prompt it better than almost anyone. That's a real moat. A competitor can use the same model, but they won't have the same prompt engineering expertise or user data.

The Infrastructure Companies Quietly Winning

The most interesting companies on this list aren't the ones you'd expect.

Datadog (#27) is on here because they built monitoring tools for OpenAI usage. As companies scaled up their AI deployments, they needed visibility into token consumption, costs, and performance. Datadog saw the opportunity and built "LLM Observability."

Now they're processing trillions of tokens worth of metadata about other companies' AI usage. That's a beautiful second-order business.

OpenRouter (#2) is winning by being the "AWS for AI models"—routing between GPT-4, Claude, Gemini, and dozens of others. As companies hedge against single-vendor lock-in, infrastructure players that enable multi-model strategies will print money.

The lesson: in every gold rush, sell picks and shovels.

Where This Goes Next

Token consumption is accelerating. OpenAI processes 3 quadrillion tokens annually now. That number will probably 10x in the next few years.

Why? Three trends:

  1. Autonomous agents: Current AI is mostly reactive (you prompt, it responds). Next-gen AI will be proactive, running multi-step workflows that consume way more tokens per task.

  2. Multimodal everything: Text tokens are relatively cheap. Image and video tokens eat way more compute. As AI gets better at images and video, token consumption will skyrocket.

  3. Vertical AI specialization: Companies like Harvey are building domain-specific models that require massive token volumes to train and run. Every industry will have its own "Harvey" equivalent.

The companies on this trillion-token list are early indicators. In 5 years, there will probably be 500+ companies at this scale. In 10 years, burning through a trillion tokens might be as common as hitting $100M ARR.

The Real Lesson: AI Is Already Here

The biggest takeaway from this list isn't about AI's future—it's about AI's present.

These 30 companies aren't running experiments. They're running production systems that millions of people use every day, powered by AI processing trillions of tokens.

When you book a trip, search for a job, learn a language, or get customer support, there's a decent chance you're interacting with a company on this list. And you probably had no idea AI was involved.

That's the point. The AI revolution isn't coming. It already happened. Most people just haven't noticed yet.

The companies on this list figured it out early. They rebuilt their products around AI when it was still risky. They spent six or seven figures on API bills when it wasn't obvious it would pay off.

Now they're getting trophies. And more importantly, they're building moats.

The question for everyone else: how many tokens are you burning?

The OpenRouter surprise: The fact that an AI routing infrastructure company no one's heard of is #2 on this list is wild. They're reportedly processing 2 trillion tokens per month, which means they're actually bigger than Duolingo. The implication: there's a massive B2B AI economy happening that consumers never see. These companies are the Stripe of AI—invisible but essential.

The Notion inclusion: Notion (#14) being on this list is interesting because they've been relatively quiet about AI. No splashy launches. No Super Bowl ads. Just steady integration of AI features across their product. That's probably the smart play—shut up and build.

T-Mobile in the mix: T-Mobile (#21) making the list suggests telecom companies are going hard on AI for customer service. Makes sense when you're handling millions of support tickets. But it also means the next time you chat with "customer support," you're almost certainly talking to AI. The Turing Test is dead; we're all just having conversations with bots now and most of us can't tell.

The Canva dark horse: Canva (#25) quietly using massive AI for design tools is a perfect example of AI being hidden in plain sight. When you use their "Magic Design" features, you're not thinking "wow, cool AI." You're thinking "wow, this is easy." That's the game.

Who's not on the list: Some notable absences: Anthropic (makes sense, they're a competitor), Google Workspace (they use their own models), Microsoft products (same reason). Also missing: most major banks, insurance companies, and Fortune 500 enterprises. That might be because they're building internal models or they're just late to the party. Either way, that's a massive opportunity for OpenAI.

The Harvey legal revolution: Worth digging into the legal tech transformation. Associates at major firms bill $500-800/hour. If Harvey can automate even 30% of that work, it's eliminating billions in billable hours annually. The legal industry has been remarkably resistant to technology disruption for decades. AI might be the thing that finally breaks through. Partners at major firms are realizing they might not need to hire as many associates. That's going to fundamentally reshape law firm economics—and legal education.

The token as currency concept: We're moving toward a future where "token consumption" becomes a key business metric alongside revenue, users, and engagement. Companies will optimize for "token efficiency" the same way they currently optimize for "capital efficiency." CFOs will have "token budgets." This is genuinely a new category of business operations.

Prediction: Within 2 years, someone will build a "token exchange" where companies can trade token credits or hedge against OpenAI price increases. It'll be like cloud computing reserved instances, but for AI. The derivatives market for AI compute is coming.
