AI and the scarcity loop

Banner image of a robot watching a man gambling

Summary

The unpredictable nature of generative AI can make it feel as addictive as gambling. Addictive services often get enshittified, and they can also become candidates for a classic bait-and-switch.

I recently read Michael Easter’s excellent book, Scarcity Brain. The scarcity loop, as Easter describes it, is the perfect way to develop an addiction. It has three parts:

  • You must see an opportunity to gain something you value. For example, when you gamble with a slot machine, you stand to win money.

  • The gamble should lead to unpredictable rewards. You may win nothing, win big, or land somewhere in the middle. It isn’t fun if it’s predictable, though. Unpredictability is key.

  • And finally, the behaviour should favour quick repeatability. If you don’t win the first time, play again. If you win, play again to see if you can win bigger. It takes little effort to try your luck.

 
A diagram showing Michael Easter's scarcity loop

The typical scarcity loop that drives addiction

 

Almost every addiction, be it gambling, alcoholism, drug abuse or social media, fits this narrative. Tech companies have long exploited this loop to build ever more addictive applications, which lets them offer their products for free while monetising users’ attention.

Many technologies start out useful; Cal Newport calls them “additive” technologies. For example, Facebook promised to connect us to our friends. Over time, though, these technologies change their character and become “enshittified”; i.e., they get worse, because their goal shifts from being useful to being profitable. In the case of Facebook, the goal is to sell ads, so we see less from our friends and more outrage, fake news and sensationalism on the platform. The originally additive technology becomes an addictive and “extractive” technology. In most cases, these technologies exploit Easter’s scarcity loop.

I’ve been wondering whether generative AI also follows the scarcity loop. Most of these tools promise valuable outputs. The reward is unpredictable because the models are probabilistic. And hitting the “generate” button is easy. We can do it again, and again, and again.

Until, of course, the product gets enshittified and software companies employ their classic bait-and-switch manoeuvres. For example, Adobe recently introduced the concept of “premium” features that use extra generative credits. It’s a sneaky way to get customers to pay more. Let me tell you about my short experience.

I experimented with Adobe’s partner models using one of my photographs. You’ll notice the lion below has his right eye closed. Could Nano Banana or Flux Kontext Pro open it? There’s a valuable opportunity, the results are likely to be unpredictable, and I can keep trying. Right?

Photograph of a lion

Ideally, this lion would have opened his right eye for me

Errm, no! I tried two generations, and Flux gave me human eyes instead of a lion’s eye. When I tried again, Adobe wanted to sell me an upgrade.

Diagram showing how Adobe upsells generative AI features

Adobe’s generative AI upsell

It’s a pretty awful bait-and-switch, though thankfully I’d rather go back and photograph a lion properly than muck around with its eyes in Photoshop. Adobe promised that if you buy a Creative Cloud subscription, you’ll get all new features automatically for the duration of the subscription. It was the same deal with Office 365, Salesforce, and Notion. Well, that contract changed several months back.

  • Traditional software improvements have almost stopped. 

  • Almost all improvements are AI improvements. 

  • These AI improvements were free for a while.

  • Some of us developed an AI addiction.

  • Are we in a bait-and-switch wave?

I don’t have the answers, but I sure do remain curious. What about you?
