- Global startup funding hit a record $189B in February 2026, roughly 30% of 2021’s full-year VC total in a single month.
- About $150B—nearly 80% of the month’s funding—went to just three companies: OpenAI ($110B at an $840B post-money valuation), Anthropic ($30B), and Waymo.
- Unlike the broad-based 2021 boom, this cycle is defined by extreme concentration, with the remaining 20% split across essentially every other startup worldwide.
- Mega-rounds and even billion-dollar seeds (e.g., AMI) signal that capital is clustering around AI model and compute “platform” bets rather than typical software startups.
- The concentration is reshaping the ecosystem by skewing access to talent and compute toward the best-capitalized players and turning AI into an infrastructure-scale capex race.
$189 billion in one month, and most of it went to three companies

February 2026 just set a record nobody expected this soon. Global startup funding hit $189 billion in a single month. That number alone would have been the story in any other year. But the real story is where the money went.
OpenAI raised $110 billion at an $840 billion post-money valuation. Anthropic closed $30 billion. Waymo pulled in a massive round of its own. Between those three, you're looking at roughly $150 billion of the $189 billion total. That's nearly 80% of all global startup funding in February, absorbed by three companies.
The remaining 20% was split across every other startup on the planet.
The numbers that matter
Put $189 billion in context. In all of 2021, during the absolute peak of the ZIRP-era funding frenzy, global VC funding for the entire year was around $621 billion. February 2026 alone hit 30% of that annual total. In one month. And 2021 was considered unsustainable at the time. Investors spent the next two years correcting for what they called irrational exuberance. Apparently, the exuberance is back, just pointed in one direction.
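The concentration math above is simple enough to sanity-check. A quick sketch using the article's figures — note that Waymo's round size isn't disclosed, so a ~$10 billion figure is assumed here purely to reach the "roughly $150 billion" big-three total:

```python
# Back-of-envelope check of the funding figures cited in this piece.
# All amounts in billions of dollars; Waymo's ~$10B is an assumption.
feb_2026_total = 189            # global startup funding, February 2026
big_three = 110 + 30 + 10       # OpenAI + Anthropic + Waymo (assumed)
vc_2021_full_year = 621         # peak ZIRP-era annual global VC total

share_big_three = big_three / feb_2026_total
share_of_2021 = feb_2026_total / vc_2021_full_year

print(f"Big-three share of February: {share_big_three:.0%}")  # ~79%
print(f"February vs. all of 2021:    {share_of_2021:.0%}")    # ~30%
```

Both headline ratios hold: three companies took just under 80% of the month, and the month alone was about 30% of 2021's full year.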
But 2021's money was spread across thousands of companies. Crypto startups, fintech, health tech, climate, SaaS. The distribution was wide, even if some rounds were large. February 2026 is a different animal. The capital is flowing faster than ever, but it's flowing through a much narrower pipe.
OpenAI's $110 billion round deserves its own paragraph. An $840 billion post-money valuation would rank it among the most valuable companies on earth, public or private, in the same tier as names like Apple, Microsoft, Nvidia, Amazon, Alphabet, and Saudi Aramco. This is a private company. It's not generating $840 billion worth of revenue. It's generating $840 billion worth of belief that it will dominate the AI infrastructure layer.
Anthropic's $30 billion is enormous by any historical standard. For perspective, Uber's largest private round was $3.5 billion. Anthropic raised nearly nine times that in a single close. In another month, this would have been the headline. Instead, it's a footnote next to OpenAI's number. That tells you something about the distortion we're living in.
Beyond the big three

The concentration doesn't stop at the top three. Yann LeCun left Meta's AI research lab and launched AMI, which raised a $1 billion seed round at a $3.5 billion valuation in under three months. A billion-dollar seed. That phrase would have been absurd two years ago. Now it's a bullet point in a longer list.
Nscale raised $2 billion specifically for AI data center infrastructure. The physical layer, the actual buildings full of GPUs, is attracting sovereign-wealth-fund-sized checks. This isn't software anymore. This is industrial capital expenditure on a scale that resembles energy or telecom buildouts.
These are not small companies being funded by optimistic angels. These are infrastructure plays backed by the largest pools of capital on earth, and they're all betting on the same thesis: whoever controls the compute and the models controls the next platform.
What concentration actually means for the ecosystem
When 80% of capital flows to three companies, the mechanics of the startup ecosystem change. Not in some abstract, theoretical way. In concrete, practical ways that affect every founder trying to raise a Series A right now.
First, talent. OpenAI, Anthropic, and Waymo (whose parent, Alphabet, has deep pockets of its own) can offer compensation packages that no Series A startup can match. Stock in a company valued at $840 billion, even at the employee option level, is a different proposition than stock in a company valued at $50 million. The talent war was already brutal. Now it's asymmetric.
Second, compute. The companies with the most capital are buying the most GPUs. Nscale raising $2 billion for data centers means those data centers will serve the highest bidders. Smaller AI startups are already paying premium rates for compute, and that premium is rising as demand outstrips supply. If your AI startup needs significant training runs, you're competing for the same hardware that OpenAI is pre-purchasing in bulk.
Third, distribution. OpenAI has ChatGPT with hundreds of millions of users. Anthropic has deep enterprise partnerships with Amazon and Google. A new AI startup building a competing model faces a distribution problem that money alone can't solve. You don't just need a better model. You need a way to get it in front of people, and the incumbents already own the channels.
The oligopoly question
Is this an oligopoly forming? The honest answer is: it already formed. We just didn't call it that because the companies were still "startups."
When three private companies absorb 80% of global funding in their sector, when they control the majority of frontier AI talent, when they're pre-purchasing the compute supply chain, and when they own the primary distribution channels, that's a concentrated market by any reasonable definition.
The counterargument is that open source keeps the market competitive. Meta's Llama models, Mistral, various Chinese labs, and the open-weight ecosystem create real alternatives to closed-model providers. That's true, and it matters. But open-source models still need compute to run, and the compute layer is consolidating just as fast as the model layer.
There's also the argument that AI is still early, and new entrants will emerge. Maybe. But the capital required to compete at the frontier keeps increasing. Training a state-of-the-art model in 2024 cost tens of millions. In 2025, hundreds of millions. By 2027, if scaling laws hold, we're talking billions in training costs alone. Each generation of models raises the floor for new entrants.
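The scaling argument in that paragraph amounts to a rough geometric progression. An illustrative sketch — both the $30M starting point and the 10x-per-generation multiplier are assumptions chosen to match the tens-of-millions-to-billions trajectory described above, not measured constants:

```python
# Illustrative projection of frontier training costs, assuming the
# roughly order-of-magnitude-per-generation growth described above.
# Starting cost ($30M in 2024) and the 10x multiplier are assumptions.
cost = 30e6     # assumed 2024 frontier training cost, in dollars
growth = 10     # assumed cost multiplier per model generation

for year in range(2024, 2028):
    label = f"~${cost / 1e9:.0f}B" if cost >= 1e9 else f"~${cost / 1e6:.0f}M"
    print(f"{year}: {label}")
    cost *= growth
```

Under these assumptions the floor for a new entrant crosses the billion-dollar line within two generations, which is the heart of the barrier-to-entry argument.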
Yann LeCun's AMI is an interesting test case. He has the name recognition, the research credentials, and the investor interest to raise a billion dollars before shipping a product. Most founders don't have that. If you're a talented AI researcher with a novel architecture idea, your realistic options are: join one of the big three, or build in the application layer on top of their models. Building a competing foundation model company from scratch is no longer a viable path for most teams.
Where the opportunity actually lives

None of this means the AI startup ecosystem is dead. It means the game has changed, and founders who recognize the new rules can still build valuable companies.
The application layer is wide open. Companies like Harvey (legal AI), Glean (enterprise search), and Sierra (customer service) are building on top of foundation models and growing fast. They don't need to train their own models. They need domain expertise, distribution in specific verticals, and the ability to ship product that solves concrete problems.
Vertical AI, meaning AI applied to specific industries with proprietary data advantages, is where smaller teams can still win. Healthcare, manufacturing, financial services, logistics. These sectors have messy data, complex workflows, and incumbents who move slowly. A team of five people who understand hospital billing better than anyone at OpenAI can build a real business.
The infrastructure layer below the hyperscalers also has room. Tooling for fine-tuning, evaluation, monitoring, security, compliance. Every company deploying AI models needs this stuff, and the big model providers aren't building it. They're focused on the model. That leaves a genuine gap for companies like Weights & Biases, Braintrust, and dozens of others building the operational layer.
There's also a geographic angle. AI regulation varies by jurisdiction, and companies that understand European compliance, or healthcare data rules, or financial services requirements in specific markets have advantages that a San Francisco model provider simply won't prioritize. Local context is a moat that scales poorly for large companies, which makes it a good place for smaller ones.
What February told us
February 2026 wasn't an anomaly. It was a signal. The AI industry is consolidating around a small number of companies with access to capital, talent, and compute at a scale that creates structural advantages. The venture capital market isn't broken. It's doing exactly what it's designed to do: concentrating bets on perceived winners.
For founders, the practical takeaway is straightforward. If you're building a foundation model company, you need to be Yann LeCun or have a genuinely differentiated technical approach and access to billions in capital. If you're building an AI application company, the opportunity is real, but your moat comes from domain expertise and distribution, not from the model itself.
The $189 billion isn't coming back down. The concentration isn't reversing. The question isn't whether AI venture funding will stay high. It's whether the remaining 20% is enough to sustain a healthy ecosystem around the giants. History suggests it can, but only for companies that stop trying to compete with the oligopoly and start building on top of it.
Marco Kotrotsos writes about practical AI implementation at gloss.run and acdigest.substack.com.
If you’re building or investing outside the AI giants, plan for a market where fundraising, hiring, and GPU access are all harder and more expensive than the headline funding numbers suggest. Focus your story on clear differentiation (data, distribution, regulated niches, or workflow ownership) rather than “we’ll build a better model,” and lock in compute and cloud commitments early where possible. Watch pricing and availability of GPUs, data-center buildouts, and compensation benchmarks at the big labs—those will increasingly set the operating conditions for everyone else.