Betteridge’s Law remains iron-clad.
🔗 Is AI really trying to escape human control and blackmail people? - Ars Technica:
Consider a self-propelled lawnmower that follows its programming: If it fails to detect an obstacle and runs over someone’s foot, we don’t say the lawnmower “decided” to cause injury or “refused” to stop. We recognize it as faulty engineering or defective sensors. The same principle applies to AI models—which are software tools—but their internal complexity and use of language make it tempting to assign human-like intentions where none actually exist.
In a way, AI models launder human responsibility and human agency through their complexity. When outputs emerge from layers of neural networks processing billions of parameters, researchers can claim they’re investigating a mysterious “black box” as if it were an alien entity.
But the truth is simpler: These systems take inputs and process them through statistical tendencies derived from training data. The seeming randomness in their outputs—which makes each response slightly different—creates an illusion of unpredictability that resembles agency. Yet underneath, it’s still deterministic software following mathematical operations. No consciousness required, just complex engineering that makes it easy to forget humans built every part of it.
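To make that concrete, here is a minimal Python sketch of temperature sampling, the mechanism behind that "seeming randomness." The vocabulary, logits, and function names are invented for illustration, and this is not any vendor's actual code; real models sample over tens of thousands of tokens, but the principle is the same: deterministic arithmetic plus a seeded pseudo-random number generator.

```python
# Illustrative sketch only: how "random" LLM output is really seeded,
# reproducible sampling over a deterministic probability distribution.
# The vocab, logits, and function names below are invented for this example.

import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw model scores into a probability distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(vocab, logits, seed, temperature=0.8):
    """Deterministic math plus a seeded PRNG: same seed, same 'choice'."""
    probs = softmax(logits, temperature)
    rng = random.Random(seed)
    r = rng.random()  # pseudo-random, fully reproducible
    cumulative = 0.0
    for token, p in zip(vocab, probs):
        cumulative += p
        if r <= cumulative:
            return token
    return vocab[-1]  # fallback for floating-point rounding

# Toy "model output": fixed scores the network assigned to each candidate.
vocab = ["refuse", "comply", "explain", "escape"]
logits = [1.2, 2.9, 2.5, 0.1]

print(sample_next_token(vocab, logits, seed=42))  # always the same token
print(sample_next_token(vocab, logits, seed=42))  # identical: no "whim"
print(sample_next_token(vocab, logits, seed=7))   # new seed, possibly a new token
```

Run it twice with the same seed and the output never varies; the variety people read as agency comes entirely from changing the seed or the temperature, not from anything deciding.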
It is in all of these companies' interest to have as many of us as possible believe that the products they have built are magical, like nothing we have ever seen before. That magic could be wonderful or it could be horrible; it doesn't really matter, as long as we keep believing it.
They need us to believe it because that belief inflates their bottom line and is currently propping up the tech industry and, by proxy, the entire economy.
All of these statements coming from OpenAI, Anthropic, Google, Microsoft, Meta, and the rest should be interpreted in that light. Their revenue growth depends on enough of us continuing to breathlessly follow and believe the wild, wonderful, and terrifying claims they keep making.