There's a pattern emerging in enterprise software right now that product marketers should be paying very close attention to — not because of what it says about AI, but because of what it says about data.
The biggest application cloud vendors are on acquisition sprees. Not for AI models — those are increasingly commoditized. Not for flashy consumer features. They're buying plumbing. Data integration. Metadata management. Workflow connectors. Document intelligence. The unsexy stuff that actually makes agentic AI work.
And if you read between the lines of these deals, every single one of them is a confession.
The Shopping Spree
Let's start with the receipts.
Salesforce has made ten acquisitions in roughly six months, among them Informatica ($8B), Momentum, Cimulate, Qualified, Spindle AI, Doti AI, Convergence, and Regrello. The pattern is impossible to miss.
Agentforce launched in late 2024 with considerable fanfare, promising AI agents that could actually do work rather than just assist with it. The reality was messier. Early adopters hit data quality problems, inconsistent agent behavior, and a pricing model that confused more customers than it converted.
So Salesforce went shopping — and the shopping list reads like a checklist of Agentforce's known weaknesses: patchy enterprise data, limited process visibility, narrow search, gaps at the top of the funnel, and agents with no access to what customers actually said on calls.
Now look at Workday, a company that made just 22 acquisitions in its entire 20-year history and then completed five in 19 months: Sana ($1.1B, its largest deal ever), Paradox, Flowise, Pipedream, and HiredScore. The thesis? Workday wants to become the "agent system of record," but first it needs to connect its HR and finance data to the 3,000+ business applications where work actually happens.
Then, in February 2026, with the stock down 40% over 12 months, the company terminated its CEO without cause. The co-founder returned, signaling a pivot from acquisition-fueled growth back to organic R&D.
Both stories share the same subtext: the AI model layer is not the bottleneck. The data layer is.
The Confession
Here's what these acquisition sprees are really saying:
"We built the agent. We trained the model. We designed the UX. And then we discovered that none of it matters if the data underneath is fragmented, ungoverned, and stitched together from six different acquisitions."
This is the part the keynote demos skip. An AI agent is only as intelligent as the data it reasons over. And when that data is assembled from bolt-on acquisitions — Informatica for governance here, Momentum for call transcription there, Pipedream for API connectors over there — the agent isn't reasoning. It's guessing. Probabilistically inferring context across systems that were never designed to talk to each other.
Marc Benioff said it himself when the Informatica deal closed: "Without clean, connected, trusted data there is no intelligence — only hallucination." He's right. But it raises a question: if you have to spend $8 billion to get the data layer right after you've already launched your agent platform, what does that tell you about the architecture you started with?
The Deterministic Advantage
This is where the story gets interesting for anyone watching the enterprise AI landscape.
There's a category of vendor that doesn't need to go on an acquisition spree to build a data foundation — because they've been sitting on one for decades. Systems of record that hold the actual transactional truth of the enterprise: the real inventory levels, the real margin data, the real workforce capacity, the real financial close.
SAP is the clearest example. The SAP data fabric — spanning S/4HANA, BW/4HANA, SAP Datasphere, and the Business Data Cloud — isn't an assembled Frankenstein of acquired parts. It's a native data layer that connects the financial ledger, supply chain, HR, CX, and planning data across the enterprise.
When an AI agent built on this foundation asks "what's the status of this account?" — it's not inferring from sentiment signals and metadata stitched across six acquisitions. It's reading deterministic, system-of-record data. The actual state of the business.
That's the difference between an agent that says "this account looks like it might churn based on engagement signals" and one that says "this account's consumption dropped 40% last quarter, their contract renewal is in 90 days, their primary champion just left the company, and here's the play."
Probabilistic vs. deterministic. One guesses. The other knows.
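The contrast can be made concrete with a toy sketch. Everything below is invented for illustration (the field names, thresholds, and signals are hypothetical, not any vendor's actual schema): one function infers churn risk from an indirect engagement signal, the other reads the answer straight out of system-of-record fields.

```python
from dataclasses import dataclass

# Hypothetical system-of-record snapshot for one account.
# Real systems would expose these via governed, transactional tables.
@dataclass
class AccountRecord:
    consumption_change_pct: float  # quarter-over-quarter usage change
    days_to_renewal: int           # from the actual contract record
    champion_active: bool          # from the actual CRM contact record

def probabilistic_assessment(engagement_score: float) -> str:
    # Inference over an indirect signal: the agent is guessing.
    if engagement_score < 0.4:
        return "this account looks like it might churn"
    return "this account looks healthy"

def deterministic_assessment(rec: AccountRecord) -> str:
    # Direct read of system-of-record truth: the agent knows.
    risks = []
    if rec.consumption_change_pct <= -40:
        risks.append(
            f"consumption dropped {abs(rec.consumption_change_pct):.0f}% last quarter"
        )
    if rec.days_to_renewal <= 90:
        risks.append(f"renewal is in {rec.days_to_renewal} days")
    if not rec.champion_active:
        risks.append("primary champion has left the company")
    return "; ".join(risks) if risks else "no renewal risk on record"

acct = AccountRecord(consumption_change_pct=-40.0, days_to_renewal=90,
                     champion_active=False)
print(probabilistic_assessment(0.35))
print(deterministic_assessment(acct))
```

The first function can only hedge; the second returns the three concrete facts from the example above, because each one already exists as a field in the record rather than as a signal to be scored.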
The Hype Cycle Implication
This brings us to the hype cycle — and specifically, to the question of where AI in enterprise marketing actually sits right now.
A February 2026 NBER study found that 90% of firms reported no impact from AI on workplace productivity. Meanwhile, Gartner estimates global AI spend will hit $2.5 trillion this year. That's a staggering gap between investment and impact — and it's the textbook definition of the Trough of Disillusionment.
But here's the nuance that matters for product marketers: the trough isn't about the technology failing. It's about the data foundation being absent.
Companies that deployed AI copilots on top of fragmented, multi-vendor data stacks are the ones reporting no impact. They bought the agent, skipped the plumbing, and got hallucinations instead of intelligence.
The application cloud vendors scrambling to acquire data management companies right now are essentially admitting this in real time.
The Slope of Enlightenment — when it comes — won't be driven by better models. It'll be driven by better data architecture. Specifically:
- Unified data fabrics that connect operational, financial, and experiential data natively — not through bolt-on acquisitions.
- Deterministic grounding that gives AI agents access to system-of-record truth, not probabilistic inference across metadata layers.
- Composable planning and analytics (the xP&A thesis) where the same trusted data powers both the AI agent and the human decision-maker.
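The third point can be sketched in a few lines. This is a hypothetical illustration, not any product's architecture (the ledger shape and function names are invented): one governed query feeds both the agent's tool call and the human's planning view, so neither side works from a private copy of the numbers.

```python
# Hypothetical ledger rows; in practice these would come from the
# governed system of record, not a per-team extract.
LEDGER = [
    {"quarter": "Q1", "revenue": 120.0, "cost": 90.0},
    {"quarter": "Q2", "revenue": 150.0, "cost": 100.0},
]

def margin_by_quarter(ledger):
    # The single shared query both consumers call.
    return {row["quarter"]: row["revenue"] - row["cost"] for row in ledger}

def agent_tool_answer(quarter: str) -> str:
    # The AI agent grounds its answer in the shared query...
    return f"{quarter} margin was {margin_by_quarter(LEDGER)[quarter]:.1f}"

def planning_view() -> list[str]:
    # ...and the human decision-maker reads the same numbers.
    return [f"{q}: {m:.1f}" for q, m in margin_by_quarter(LEDGER).items()]
```

The design choice being illustrated is composability: the trusted computation lives once, and agent and dashboard are just two callers of it.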
What This Means for Enterprise Leaders
If you're evaluating AI platforms — or building a GTM strategy around one — the takeaway is both strategic and tactical:
Strategically, stop evaluating vendors on the AI model. Everyone has access to frontier models now. The differentiation is in the data. The winning enterprise AI story for the next 18 months isn't "our AI is smarter" — it's "our AI is grounded in the actual truth of your business." That's the moat. That's the story.
Tactically, watch the acquisition patterns. Every tuck-in deal is a confession about a gap in the platform. When a vendor buys a data governance company, they're telling you their agents were hallucinating. When they buy a workflow connector company, they're telling you their data was siloed. These are competitive intelligence signals hiding in plain sight — and they should inform every enterprise software evaluation happening right now.
The AI hype cycle is compressing faster than any prior technology wave: social, mobile, and cloud each took years to move through the trough. This one is happening in months.
The vendors who started with the data layer intact — who didn't need a $10 billion acquisition spree to teach their agents where the truth lives — are the ones who'll come out the other side first.
Everyone else is still shopping.