The AI Feature Race Is Creating the Next Generation of Bloatware
Modern CMS platforms are competing to add the most AI. That race is heading toward the same bloatware problem you're trying to leave behind.

Anders Granberg
Co-Founder
The CMS platforms competing for your attention right now are building toward the same problem you're trying to leave behind. They're just doing it with AI instead of plugins.
If you're evaluating alternatives, or already on your way out of WordPress, Optimizely, Sitecore, or another legacy platform, that probably isn't what you expected to hear. The migration away from legacy CMS is accelerating across the board. WordPress has lost nearly five percentage points of market share since 2022. Older platforms like Joomla and Drupal are in freefall. And enterprise teams on Optimizely and Sitecore are facing aging runtimes and mounting technical debt. The reasons vary, but the direction is the same.
But here's the thing nobody's telling you: the most important decision isn't whether to switch. It's making sure you don't switch to the next generation's version of the same mess.
The new CMS landscape looks impressive. Look closer.
Open any modern CMS website and the pitch writes itself. AI-powered content generation. Intelligent workflows. Automated tagging. Predictive personalization. Smart search. The feature lists are long and the demos are polished.
Every platform is racing to out-AI the competition. And on a comparison spreadsheet, more features look like more value.
But the market is already showing cracks. According to recent industry data, 42% of companies abandoned most of their AI initiatives in 2025, up from 17% the year before. On average, organizations scrapped nearly half of their AI proofs of concept before they ever reached production¹. The features that looked great in a pitch deck didn't survive contact with real workflows.
And if you've spent any time inside a CMS that promised to make your life easier and ended up doing the opposite, that pattern should feel familiar. WordPress didn't become bloated overnight. It happened one plugin, one feature, one "just add this" at a time. Each addition made sense in isolation. Together, they created the thing you're now trying to escape.
The question is whether the platforms you're evaluating are repeating the same pattern, just with better marketing and a different technology stack.
How bloatware happens (again)
There's a well-documented dynamic in software development. When products are built around technological capability rather than user needs, complexity grows faster than usefulness. Every new feature adds cognitive load: another menu, another option, another decision for the person doing the actual work.
In the current CMS market, the pressure to ship AI features is enormous. Investors want to see it. Marketing needs the talking points. Sales needs the checkboxes. So AI capabilities get built, exposed in the interface, and pushed to market, often before anyone has tested whether they actually make a content editor's day easier or harder.
The result is a new generation of platforms that can do more things than ever, but where the core experience of creating, managing, and publishing content has become cluttered rather than streamlined.
This isn't a hypothetical. Industry voices are already raising the alarm. The headless CMS category, home to many of the "modern" alternatives you're probably evaluating, is showing visible signs of feature bloat. Platforms backed by significant investment end up in a feature arms race where every competitor's capability becomes a must-have, regardless of whether users asked for it or will ever use it.
The irony is hard to miss. People leave legacy platforms because they've become unwieldy. And the alternatives competing for their attention are building toward the same destination. Just faster, and with AI as the accelerant instead of plugins.
And it's not just a design problem. It's measurable. A 2025 randomized controlled trial studied experienced developers using AI tools in their actual codebases. The developers estimated AI saved them about 20% of their time. The measured result? AI made them 19% slower². They didn't even notice. That study focused on software development, but the principle applies wherever AI is inserted into skilled work: if the tool isn't designed around the actual workflow, it adds friction that the user may never consciously register.
What AI in a CMS should actually do
None of this means AI in content management is a bad idea. It's not. AI can genuinely transform how content teams work, but only when it's pointed at the right problems.
The difference is subtle but critical:
AI that adds buttons gives you more options. AI that removes steps gives you more time.
A content editor doesn't wake up wanting AI-powered content generation, automated taxonomy suggestions, and predictive performance analytics as three separate features with three separate interfaces. They wake up wanting to publish a campaign page without waiting for a developer, or to translate content without a three-day turnaround, or to find that asset someone uploaded six months ago without searching through four folders.
Remember: a big part of why you're switching CMS in the first place is the feeling of not owning your own site. Of being dependent on someone technical for every change. Good AI should dissolve that dependency further. It should make your team capable of doing more on their own. Bad AI recreates the same problem in a new form: instead of waiting for a developer, your editor is now wrestling with an AI tool they don't fully understand, trying to figure out which of six "generate" options to pick, or spending time correcting output that wasn't quite right.
Here's what good AI actually looks like in practice: an editor writes a page and the CMS quietly analyzes the content in real time, suggesting metadata, flagging keyword gaps, and surfacing SEO insights as the text takes shape. No separate tool. No extra step. The editor stays in their flow and the optimization happens alongside the writing, not after it. That's AI that removed steps. The editor may not even think of it as AI, and that's the point.
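To make "analysis alongside the writing" concrete, here is a minimal sketch of what such an in-flow check could compute on every draft save. Everything in it is illustrative: `analyze_draft` and its simple heuristics are stand-ins for whatever a real platform would run, not the API of any actual CMS, and a production system would use far richer models than string matching.

```python
# Hypothetical sketch of "invisible" in-flow draft analysis: runs on save,
# no separate tool, no extra step for the editor. Names and heuristics are
# illustrative only, not any real CMS API.

def analyze_draft(text: str, target_keywords: list[str]) -> dict:
    """Return metadata suggestions and keyword gaps for a draft."""
    lowered = text.lower()
    first_sentence = text.split(".")[0].strip()
    return {
        # Suggest a meta description from the opening sentence,
        # truncated to a typical 155-character SEO limit.
        "meta_description": first_sentence[:155],
        # Flag target keywords the draft never mentions.
        "keyword_gaps": [k for k in target_keywords if k.lower() not in lowered],
        # Surface a simple length signal the editor can act on.
        "word_count": len(text.split()),
    }

report = analyze_draft(
    "Migrating off a legacy CMS takes planning. Start with an audit.",
    ["migration", "audit", "personalization"],
)
print(report["keyword_gaps"])  # → ['migration', 'personalization']
```

The design point is that the function takes the draft the editor is already writing and returns suggestions into the interface they are already using; the editor never opens a second tool or picks from a menu of "generate" options.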
When AI solves real problems, invisibly, inside the workflow that already exists, it's genuinely valuable. When AI surfaces as a row of new buttons and a submenu of options the editor has to learn, evaluate, and decide whether to trust, it's just another layer between your team and their work.
Microsoft's 2025 New Future of Work report puts it plainly: people won't adopt AI tools that force them to change how they work. They want tools flexible enough to fit their existing patterns³. And the research goes further: the tools that succeed are the ones where users were involved in the design process, not just handed a finished product. That principle applies directly to CMS: a platform where AI was shaped by editorial workflows will feel different from one where AI was shaped by an engineering roadmap.
The distinction isn't about having AI or not having AI. It's about whether the AI was designed starting from the editor's daily frustration, or starting from a feature comparison spreadsheet.
What to look for, and what to ask
If you're evaluating CMS platforms right now, the most useful thing you can do is change the questions you ask. Feature lists won't protect you from choosing the next generation's bloatware. But the right questions might.
Instead of "What AI features do you have?" Ask: Which specific editorial workflows does your AI simplify, and how many steps does it remove?
A platform that can answer this concretely, like "editors used to do X in five steps, now it's two," has thought about the problem before building the solution. A platform that answers with a feature list has done it the other way around.
Instead of "Can your AI generate content?" Ask: How does your AI keep the editor in control?
Content generation is easy to demo, hard to use well. What matters is whether the editor feels empowered or replaced, and whether the AI fits into how they already work, rather than demanding a new workflow.
Instead of "How many integrations do you support?" Ask: What have you deliberately chosen not to build?
This question tends to separate the thoughtful from the bloated. A platform with a clear point of view about what it won't do has made deliberate design decisions. A platform that's trying to be everything is on the same trajectory WordPress was. It just hasn't arrived yet.
Instead of "What's on your roadmap?" Ask: How do you decide what makes it into the product?
The answer reveals whether the development process is driven by user research and measured outcomes, or by competitor feature lists and investor expectations. Both types of companies ship new things. Only one type consistently ships things that make your team's life easier.
These questions help you filter before you even get to the demo. And when you do sit down for the demo, that's where most CMS evaluations fall apart entirely. Our Head of Product Marcus Lindblom wrote about the five questions that cut through any CMS demo and reveal what the editing experience will actually feel like on a Tuesday afternoon.
Don't switch to the next generation's problem
You've earned the right to be picky. If your current CMS has taught you anything, it's that a long feature list and a great sales demo don't guarantee a good daily experience.
The platforms that will actually serve you well in five years aren't the ones adding the most AI today. They're the ones asking the most honest questions about where AI belongs and where it doesn't. They're the ones that started with the editor's frustration and worked backward to a solution, not the ones that started with a technology and went looking for a place to put it.
You're about to make a decision that your content team will live with every day. Choose the platform that removed steps, not the one that added buttons. Choose the one that gives your team ownership of their own site, not one that's quietly building the next thing they'll want to escape.
References
¹ Fullview, "200+ AI Statistics & Trends for 2025" fullview.io/blog/ai-statistics
² Wijk et al., "Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity" (2025) arxiv.org/abs/2507.09089
³ Microsoft, "New Future of Work Report 2025" microsoft.com/en-us/research
Header image: AI Accountability by Champ Panupon for Google DeepMind. Free to use via Pexels.