Key Takeaways
  1. Block is reportedly cutting about 40% of its workforce while framing the move as an AI-driven transformation, a narrative that markets rewarded.
  2. A Darden Business School analysis argues AI is often used as a convenient scapegoat, masking preexisting business pressures like margin compression, regulation, and weaker consumer demand.
  3. Across the industry, layoffs are increasingly justified with AI language, even when the AI systems purportedly replacing workers are unclear, internal-only, or not yet in production.
  4. The recurring pattern is layoffs plus “efficiency/AI” messaging that boosts stock sentiment while sidestepping hard questions about whether the technology actually exists and works at scale.
  5. If AI were truly replacing labor at these levels, productivity metrics (revenue per employee, cycle times, error rates) should visibly improve, but the article suggests those signals are often missing or unconvincing.

Block Layoffs

Block, the fintech company formerly known as Square, is reportedly cutting about 40% of its workforce. The company frames this as an AI-driven transformation, a strategic repositioning for a future where AI handles work that humans used to do. Wall Street rewarded the announcement. The stock moved up.

Meanwhile, Darden Business School published an analysis asking the question nobody on the earnings call wanted to hear: is AI the strategy, or the scapegoat?

The answer matters beyond Block. In 2026, 55% of hiring managers surveyed expect layoffs at their companies, and 44% say AI will be the primary driver. Oracle is planning to cut 20,000 to 30,000 employees to fund AI infrastructure. Amazon reduced its corporate workforce by 30,000 across two rounds. Tech layoffs in March alone reached 45,000, with over 9,200 explicitly attributed to AI and automation.

AI has become the most socially acceptable justification for mass layoffs since "restructuring for shareholder value."

The pattern

Every wave of layoffs follows the same script. The company announces cuts. The press release mentions "efficiency," "automation," and "AI." Analysts nod. The stock ticks up. Nobody asks whether the AI systems replacing those workers actually exist yet, or whether they work.

Company                    | Layoffs (2026)           | Official Reason        | AI Systems in Production
Block                      | ~40% of workforce        | AI transformation      | Unclear; no public demos
Oracle                     | 20,000-30,000            | Fund AI infrastructure | Infrastructure, not products
Amazon (corporate)         | 30,000 across two rounds | Operational efficiency | Some, mainly internal tools
Various (March 2026 total) | 45,000+                  | Mixed; 9,200+ cite AI  | Varies widely

The gap between "we're cutting jobs because of AI" and "we have AI systems that do those jobs" is where the real story lives. In some cases, the AI replacement is genuine: automated customer service, AI-generated code review, algorithmic content moderation. In many cases, the AI is aspirational. The company plans to have AI do the work. Someday. After they build it. With the money they saved from the layoffs.

The scapegoat thesis

Darden's analysis of Block's cuts raises uncomfortable questions. Block's core business, payment processing, is facing margin pressure from competition, regulatory scrutiny, and a cooling consumer spending environment. The 40% cut addresses a cost problem that existed before AI entered the conversation.

Framing a cost-cutting exercise as an AI strategy accomplishes several things simultaneously:

It transforms a defensive move into an offensive narrative. "We're cutting costs because revenue is under pressure" is a bad story. "We're restructuring around AI to capture the next wave of growth" is a good story. Same outcome, different framing, different stock reaction.

It shifts the conversation from management accountability to technological inevitability. If AI made the jobs obsolete, nobody is at fault. It's progress. If management overhired during a boom and now needs to correct, someone is accountable for the misjudgment. The AI narrative removes human decision-making from the frame.

It provides cover for cuts that would otherwise raise governance questions. Cutting 40% of a workforce is extraordinary. Under normal circumstances, it would prompt questions about whether the company's leadership failed at planning, execution, or both. When AI is the reason, those questions get muted.

The numbers tell a different story

If AI were genuinely replacing worker output at the scale these layoffs suggest, you'd expect to see it in productivity metrics. More output per remaining employee. Faster cycle times. Lower error rates. In most cases, those metrics either don't exist or don't support the narrative.

What You'd Expect                                  | What's Actually Happening
Revenue per employee increasing sharply            | Flat or modest increase at most companies
AI tools handling specific workflows end-to-end    | AI assists humans on fragments of workflows
Fewer employees, same or better output             | Fewer employees, lower output, remaining staff stretched thin
Clear documentation of AI replacing specific roles | Vague references to "AI-driven efficiency"
Hiring AI specialists to replace generalists       | Hiring freezes across all categories

The pattern across most companies announcing "AI-driven" layoffs is not that AI has replaced the work. It's that the work has been redistributed to remaining employees, who are now expected to use AI tools to handle the increased load. That's a different thing entirely from AI automation, and it's a strategy that tends to burn out the people who survived the cut.
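The productivity test above can be sketched as a quick back-of-envelope calculation. Everything here is hypothetical (the company figures, the 5% tolerance, the function names); it only illustrates why revenue retention, not revenue per employee, is the signal to watch when a company claims AI absorbed the eliminated work:

```python
# A back-of-envelope check of the productivity argument above.
# All numbers are hypothetical; this is a sketch, not a model.

def revenue_per_employee(revenue: float, headcount: int) -> float:
    """The simplest productivity proxy. Caution: it rises mechanically
    after any layoff as long as revenue merely holds, so on its own it
    cannot prove that AI absorbed the eliminated work."""
    return revenue / headcount

def output_retained(rev_before: float, rev_after: float,
                    tolerance: float = 0.05) -> bool:
    """A cleaner signal: did total output survive the cut?
    If AI genuinely replaced the eliminated roles, revenue should stay
    roughly flat (within `tolerance`) despite far fewer employees.
    If revenue falls alongside headcount, the work was cut, not automated."""
    return rev_after >= rev_before * (1 - tolerance)

# Hypothetical firm: 40% headcount cut (10,000 -> 6,000 employees).
print(revenue_per_employee(5.0e9, 10_000))  # 500000.0 before the cut
print(output_retained(5.0e9, 4.9e9))  # True: consistent with replacement
print(output_retained(5.0e9, 3.2e9))  # False: output left with the people
```

The asymmetry is the point: a rising revenue-per-employee figure after a layoff proves little, but revenue that collapses with headcount actively contradicts the automation narrative.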

Why this matters

The conflation of AI capability with layoff justification has three corrosive effects.

First, it poisons the well for legitimate AI adoption. When employees hear "we're implementing AI," they hear "we're planning layoffs." That creates resistance to AI tools that could genuinely help people do their jobs better without eliminating those jobs. The companies using AI as a layoff excuse are making it harder for every other company to implement AI constructively.

Second, it distorts the AI investment landscape. When Wall Street rewards AI-framed layoffs with stock bumps, it creates an incentive for more companies to frame cost cuts as AI strategy. The market signal stops being "invest in AI" and becomes "mention AI when you cut headcount." Capital flows toward the narrative, not the technology.

Third, it obscures the real impact of AI on work. The actual effect of AI on most knowledge work is task-level transformation, not job elimination. Individual tasks within roles get automated or accelerated. The role changes shape. That's a nuanced story that requires careful management. The "AI replaced them" narrative replaces nuance with a convenient fiction.

The uncomfortable question

When a company cuts 40% of its workforce and attributes it to AI, the question to ask is: show me the AI. Not the roadmap. Not the strategy deck. Not the pilot program. Show me the deployed, operational AI system that does the work those people used to do.

In most cases, the answer is silence, or a pivot to talking about future capabilities.

AI is transforming work. That's real. But "AI is transforming work" and "AI justifies cutting half your workforce this quarter" are different claims, and the second one requires evidence that most companies making it cannot provide.

The technology will eventually catch up to the narrative. When it does, the companies that invested in genuine AI transformation will be differentiated from the ones that used AI as a press-release-friendly synonym for downsizing. For now, the gap between those two groups is wider than most investors, and most displaced workers, realize.

What This Means For You

When you hear a company cite AI as the reason for layoffs, look past the headline and ask what specific systems are live, what work they actually replace, and whether there are measurable productivity gains to prove it. Pay attention to whether the “AI strategy” is really a funding shift toward infrastructure or a cover for older problems like margin pressure and overhiring. If you’re an employee or job seeker, treat AI-driven restructuring as a risk signal and prioritize roles tied to clearly deployed AI workflows, not vague future promises.