Sean Taylor

Satisficer's Regret

Eric Seufert:

“Black box” ad optimization and satisficer’s regret

I propose the term satisficer’s regret to describe the regret felt by an advertiser when an AI-enabled optimization platform delivers profitable results that meet its ROAS goals but theoretically could have been even more efficient.

An advertiser can feel this regret even while acknowledging and accepting that they are better off having deployed their budget on that channel — and that they would be worse off had they not. It’s not necessarily irrational, either: it’s resentment over knowing that, by design, the platform delivered results that, while adequate, were worse than it could have achieved.

I think this term is applicable to small game developers who begrudge paying platform taxes to Apple and Google. The assumption always seems to be that if they could just swap in a direct-to-consumer (DTC) solution, they'd achieve the same ROI on their own. Beyond a few big guns, I've yet to see that thesis proven in the wild.

Or, as Ben Thompson said in his recent interview with Seufert:

This is the key thing. The problem with this mindset, which again, is totally rational, I understand why people feel this way, is they’re comparing the ROAS they asked for and was delivered to them to the theoretical ROAS that the platform could have given them, when they need to be comparing it to the ROAS they could achieve on their own, which is actually lower. And so yes, maybe Facebook could deliver you 140%, they only gave you 120% because that’s what you asked for. Realistically, you on your own are only achieving 80% and so they’re making the wrong comparison.

It’s funny, it’s like the publisher thing with Google, the open web, be careful what you wish for. There’s this comparison that happens to a theoretical ideal that is not aligned to the reality of the situation, which is you cannot and will not fill your inventory on your own, and be careful what your comparison set is because if you’re comparing it to a theoretical ideal, then you’re pining after something that doesn’t exist and you’re going to get reality, which is going to be much worse.
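To make the arithmetic in Thompson's example concrete, here is a minimal sketch in Python. The variable names and the framing of the two comparisons are mine; the ROAS figures come from the quote above.

```python
# ROAS figures from Thompson's example (illustrative, not real data):
theoretical_roas = 1.40  # the platform's unobservable best case
delivered_roas   = 1.20  # the target the advertiser asked for, and got
diy_roas         = 0.80  # what the advertiser could achieve on their own

# The comparison that produces satisficer's regret: delivered results
# measured against the platform's theoretical optimum.
felt_regret = theoretical_roas - delivered_roas   # +0.20

# The comparison Thompson argues is the right one: delivered results
# measured against the advertiser's own counterfactual.
true_surplus = delivered_roas - diy_roas          # +0.40

print(f"Regret vs. theoretical best: {felt_regret:+.0%}")   # +20%
print(f"Gain vs. going it alone:     {true_surplus:+.0%}")  # +40%
```

The campaign that feels 20 points short of the ideal is, against the realistic baseline, 40 points ahead.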