Many companies argue that AI copy testing is the future: it's cheaper, faster, and therefore more actionable. I've long argued that copy testing with real humans should be affordable and fast, but the real problem with AI copy testing is what we data scientists call “reversion to the mean.”
What this comes down to is that statistical models (which is what all AIs are) tend to pick the things that have worked before and stick with them. That's why you keep seeing the same ads on YouTube and why so much AI-generated content follows the same structure. There is some evidence that AI is more extreme than the average human, but that has more to do with the training data than anything fundamental in the model, and of course, reverting to something more extreme is no better than reverting to the mean.
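To make the statistical point concrete, here's a minimal sketch in Python. Everything in it is invented for illustration (the creative “features”, the scores, the amount of shrinkage); it just shows how a model fitted to what has worked historically drags its prediction for an unusual piece of creative back toward the historical average.

```python
import numpy as np

# Illustrative only: fake copy-test scores for 200 conventional ads.
# The five creative "features" and the scores are invented for this sketch.
rng = np.random.default_rng(0)
X = rng.normal(0, 1, size=(200, 5))               # creative features, roughly centered
beta_true = np.array([2.0, -1.0, 0.5, 0.0, 1.5])  # how those features "really" pay off
y = 60 + X @ beta_true + rng.normal(0, 5, 200)    # scores cluster around a mean of 60

Xc = X - X.mean(axis=0)
y_mean = y.mean()

# Ordinary least squares vs. a heavily regularized (ridge) fit.
# The ridge penalty shrinks the coefficients toward zero, which pulls
# predictions for unusual creative back toward the historical mean.
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ (y - y_mean))
lam = 500.0
beta_ridge = np.linalg.solve(Xc.T @ Xc + lam * np.eye(5), Xc.T @ (y - y_mean))

# A deliberately rule-breaking ad, far outside the training distribution.
x_new = np.array([4.0, -4.0, 3.0, 3.0, -4.0])

print(f"historical mean score : {y_mean:.1f}")
print(f"OLS prediction        : {y_mean + x_new @ beta_ols:.1f}")
print(f"ridge prediction      : {y_mean + x_new @ beta_ridge:.1f}  <- pulled back toward the mean")
```

The exact numbers don't matter; the point is that anything far from the training data gets scored as if it were closer to the average than it really is.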
Think about what this means for copy testing: the AI shepherds every piece of content down well-worn storytelling paths and penalizes anything genuinely innovative.
If you haven’t already seen it, Fallout on Amazon is the latest rule-breaking blockbuster that wouldn’t get off the starting line if an AI were judging it. There’s one scene in particular where they not only let the dog die but have one of the main characters brutally murder it. It goes against everything in the rulebook, yet it works in context. If you don’t believe me, watch the show and see how they pull it off. It’s compelling television.
Put this through an AI copy test, and it'll tell you to cut the dog’s death. All the historical evidence says it won't work, but it takes human brilliance to realize that killing the dog is right in this context. Just because it's never worked before doesn’t mean it won’t work here, and no AI will understand that.
We can never swap human opinions for AI ones. A crucial part of the human condition is our desire to try something new and different. AI is trained on what's happened in the past and doesn't have the dogged determination to iterate on the same lousy idea a hundred times until it finally gets it right.
We should focus on using AI to augment our thinking, not replace it. Rather than swapping out the humans in a copy test, we should use AI to accelerate the process and make it cheaper and more efficient.
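As a purely hypothetical sketch of what “augment, not replace” could look like (this isn't anyone's real product or pipeline; the verbatims, the theme list, and the group_by_theme helper are all invented), the machine does the grunt work of grouping open-ended responses from real human viewers so a researcher can get to the interesting quotes faster:

```python
from collections import defaultdict

# Real human respondents still watch the content and write the verbatims;
# the machine just groups them. group_by_theme is a deliberately naive
# stand-in for whatever summarisation model a team actually uses.
verbatims = [
    "The dog scene was shocking but I couldn't look away",
    "Too violent for me, I switched off",
    "Loved the pacing, it never let up",
    "Shocking ending, still thinking about it",
    "The pacing dragged in the middle",
]

THEMES = {
    "shock": ["shocking", "couldn't look away"],
    "violence": ["violent", "switched off"],
    "pacing": ["pacing", "dragged", "never let up"],
}

def group_by_theme(responses):
    """Naive keyword grouping; swap in your model of choice here."""
    grouped = defaultdict(list)
    for text in responses:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(k in lowered for k in keywords):
                grouped[theme].append(text)
    return grouped

for theme, quotes in group_by_theme(verbatims).items():
    print(f"{theme} ({len(quotes)} mentions)")
    for quote in quotes:
        print(f"  - {quote}")
```

In practice you'd replace the keyword matching with whatever model you trust, but the humans still supply the opinions and still make the call on what to do with them.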
We need more humans looking at our content and figuring out how to innovate, not fewer.
If you haven’t already, check out what Megan Daniels is doing at MX8 (full disclosure: I’m an investor). To me, it’s a great example of how AI should be applied in our industry: making our tools work harder for us, not cutting human intuition out of the loop.