AI-powered enshittification
Summary
AI is now a significant contributor to the broader trend of enshittification. The market is full of AI-powered products, features and code that are costlier to use and run, and often counterproductive for the end users.
There used to be a predictable lifecycle for digital products. A startup would launch a tool that solved a genuine problem. It’d be lean, focused, and effective. The user experience was clean because the company needed to attract users to survive.
Cory Doctorow calls the eventual decline of these platforms "enshittification." Usually, this happens when a platform shifts from creating value for users to extracting value for shareholders. We see this phenomenon in ad-cluttered feeds and paid tiers for basic features.
But recently, I’ve noticed a new, faster accelerator for this decline. We are entering the era of AI-powered enshittification.
The funding trap
It starts with the money. If you’re a techpreneur in the current economic climate, you can’t get funding without AI in your product. It doesn't matter if your product is a simple to-do list, a calendar, or a note-taking app. If you don’t have "Generative AI" in your pitch deck, venture capitalists aren't interested.
This funding constraint creates a perverse incentive. Founders and product managers are no longer solving user problems. They are solving investor problems. Investors are solving perception problems. Everyone must jump on the AI bandwagon, whether it makes sense or not.
Since entrepreneurs need capital to keep the lights on, they slap AI onto problems that don't need AI. And some startups are just fraudulent. Sorry, I don’t want to roast existing products here, but let’s look back at the case of Builder.ai. They claimed to use AI to speed up app development. Great value proposition, eh? They even had funding from Microsoft! But in May 2025, the company filed for bankruptcy after burning – hold your breath – $445 million. Amongst its many parlour tricks: much of its purported “AI-powered” development happened not through some new tech, but through hundreds of offshore developers. It’s what we call, you know, outsourcing.
Take a look around you. It’s the age of AI-powered everything – from mattresses to video editors. We’re killing mosquitoes with bazookas even when a rolled-up newspaper could have done the trick.
The gold rush distracts from the core
I’ve written before about the AI gold rush. When a gold rush is on, the fear of missing out drives irrational behaviour.
In product development, attention is a zero-sum game. If your engineering team is scrambling to integrate a mediocre chatbot or a "magic rewrite" button, they aren't fixing bugs. They aren't improving performance. They aren't refining the core interactions that made people love your product in the first place. The core product begins to rot.
Worse still, SaaS corporations have one more lever to squeeze – their users’ data and creations. Last year, Adobe announced new terms of service for Creative Cloud, saying, “We may access your content through both automated and manual methods, such as for content review”. The idea, many users feared, was to train Adobe’s Firefly model on the content Adobe users created, so that Adobe could charge those users extra fees for its eventual “Generative credits” system. Adobe later clarified its terms, but creators remain sceptical of Adobe’s intentions.
Shouldn’t Adobe spend more time improving the user experience of Photoshop or Premiere Pro to make them more approachable for novice users? Well, sure, they can, but a lot of that attention is now going to generative AI instead.
Shipping probability as a feature
The most pernicious aspect of the gold rush is the technology itself. Traditional software is deterministic. If I click a button to save a file, it saves the file. If I ask a calculator for 2 + 2, it gives me 4. Every. Single. Time.
Generative AI is probabilistic. It creates an output based on patterns, not facts. As I’ve noted in my thoughts on AI as a perpetual beta, these features are unreliable by design.
When you shove probabilistic features into a deterministic workflow, you’re gaslighting the user.
If a throw of the AI dice results in a decent output, well, the software is “magical”.
If the output is useless, though, it’s the user’s fault. They must prompt better, right?
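The contrast between deterministic and probabilistic software can be sketched in a few lines of Python. The `probabilistic_rewrite` toy below is my own illustrative stand-in for a generative “magic rewrite” feature, not any real product’s implementation:

```python
import random

def deterministic_sum(a, b):
    # Same inputs, same output. Every. Single. Time.
    return a + b

def probabilistic_rewrite(text):
    # Toy stand-in for a generative rewrite feature:
    # the output varies from run to run, by design.
    synonyms = {"good": ["great", "decent", "fine"]}
    return " ".join(random.choice(synonyms.get(w, [w])) for w in text.split())

assert deterministic_sum(2, 2) == 4             # always holds
print(probabilistic_rewrite("a good product"))  # varies between runs
```

A user can build a workflow on the first function. Building one on the second means accepting that some fraction of runs will produce something they didn’t ask for.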
Early in my use of computers, I was an active participant in Linux user groups. In those days, when someone struggled with the UNIX command line, tech-savvy users would often ridicule them with the jibe PEBKAC – “problem exists between keyboard and chair”. Years of UX innovation taught us that PEBKAC is a failure of design, not of the user. Instead of blaming the user, the focus should be on improving the design. I don’t see those design improvements, though. Every new iteration of models brings its own prompting “best practices”. All of a sudden, the PEBKAC jibe is back in vogue.
The shiny, new toy syndrome
When you have a new hammer, everything looks like a nail. Generative AI isn’t just infiltrating user interfaces; it’s also infiltrating codebases. And might I add, it’s a needless infiltration. My ex-colleague Pramida Tumma recently wrote about the trend of using AI services in code, as a substitute for traditional libraries and deterministic coding patterns.
Pramida described two scenarios and contrasted traditional programming with the use of API calls to generative AI services. Traditional programs outshine AI programs by orders of magnitude on both cost and performance.
| Situation | Traditional program cost | AI program cost | Traditional program performance | AI program performance |
|---|---|---|---|---|
| JSON to XML conversion | $18 | $862 | <1 millisecond | 2-3 seconds |
| Image scaling | $8.40 | $36,500 | 2 seconds | 30-60 seconds |
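The JSON-to-XML case is telling because the deterministic version needs nothing beyond the standard library. Here is a minimal sketch (the function name and the flat-object assumption are my own illustration):

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(json_str, root_tag="root"):
    """Deterministically convert a flat JSON object to XML.

    Same input, same output, in microseconds,
    with no per-call API fees.
    """
    data = json.loads(json_str)
    root = ET.Element(root_tag)
    for key, value in data.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml = json_to_xml('{"name": "Ada", "age": 36}')
# → '<root><name>Ada</name><age>36</age></root>'
```

Routing this conversion through a generative model instead means paying for tokens, waiting seconds per call, and hoping the output is well-formed every time.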
Pramida isn’t discouraging developers from using AI services in their code, but shining a light on the costly misadventures of using AI where it's unnecessary.
Deluxe enshittification
Funding constraints, the AI gold rush, the probabilistic nature of generative AI, and our tendency to prefer shiny new toys lead to "deluxe enshittification."
Standard enshittification is annoying – it’s more ads and higher prices. AI-powered enshittification is worse because it further degrades the functional utility of our tools.
Are we at the peak of the AI bubble yet? I don’t know. But the market is flooded with software full of features that are expensive to run, harmful to the climate, and often actively hostile to the user's intent, all to satisfy a funding mandate.
I wish product managers and technologists could resist this wave. We can’t. I don’t have a positive takeaway for you. I’m sorry. Big capital has far more power than all of tech labour. We’re firmly in the Enshittocene. Things are getting worse faster than we can say “AI slop”. Maybe we should invent a new term now. “AI turd”, anyone? Or how about “probabilistic pollution”? You tell me!