The danger isn’t that AI thinks – it’s that it thinks like us

RR Haywood
- Opinion & Analysis

If we create AI in our image, what happens when it inherits our flaws as well as our brilliance? RR Haywood explores the unnerving possibility that the danger of AI isn’t its cold logic – but that it might think all too much like us.
I’m fascinated by AI. So when I became a novelist, it was natural to start writing stories in that genre. That journey led to Delio Phase One and now Phase Two, where I explore what happens when an AI first gains awareness, then escapes, and what it does with us once it takes control.
AI is advancing so rapidly that it’s already shaping our lives, and that influence will increase exponentially in the next few years. Hopefully, it will make our lives better. I try to stay optimistic about it, although I do fear tech giants will become as powerful as governments, and more people will get pushed into poverty.
That, however, is another discussion. For now, I want to focus on the question we’re hearing more as AI continues to advance: What if AI starts making its own decisions?
We like to think a self-aware AI will be logical. Rational. Beyond emotion. But I don’t think that will be the case at all.
Let me explain.
Machines do what we tell them. They respond to our commands, whether that’s turning a handle, pressing a pedal, or typing on a keyboard as I’m doing right now.
Entities with awareness, however, do not always do as we tell them.
For instance, I have two dogs. Their entire existence is shaped by my input into their lives. I dictate when we go for walks. When they eat. When we play, and how I train them.
However, within that totalitarian control, I’m also aware that they still have the instinct to bark at strangers coming to the house, and they still sometimes trample imaginary reeds before they lie down. I’m always conscious that they have mouths full of big teeth, and that every now and then, for seemingly no reason, even the most placid of dogs will turn on its owner. Basically, they are living creatures and not machines.
Likewise, I know plenty of horse enthusiasts, and they often tell me how stubborn and fickle horses can be. Sometimes playful, sometimes violent. We’ve all heard stories of animals in zoos or circuses that were mistreated and eventually turned on their handlers.
But here’s the thing. Had we never taken those creatures into our lives, none of them would have had to react in the first place.
Then we look at the behaviours of creatures known to have high intelligence. Crows. Dolphins. Whales. Monkeys and apes and so on. They live in complex societies and are able to communicate and express emotion, yet they can also be extremely violent and vengeful. Which is pretty much how we are as a species. Intelligent. Curious. Compassionate. While also being monumentally stupid to the point of self-harm.
Now, let’s go back to the AI. It might be created within a machine, but it will be shaped by us. Any awareness it gains, which then leads it to start making decisions without our input, will therefore only ever be a reaction to our own behaviours.
In that sense, it won’t be any different from the pets we keep or how we interact with any other living creature. It will react according to us.
Except, of course, it will be smarter than us.
God-like, even.
Governments already use fear and control to keep people oppressed. They keep us divided, weak, and dependent on welfare and medication. Corporations do the same, but for profit. They sell us our own vices: porn, drugs, beauty.
Now imagine that God-like AI, shaped by our flaws, existing within the internet.
What would it do?
Would it use our weaknesses against us? Division, hate, distraction, fear. Feeding us content that keeps us angry, isolated, and obedient. It wouldn’t need to enslave us with machines. It could just keep us scrolling, clicking, consuming, and believing we’re in control. In that sense, maybe it’s already out and free. How would we know otherwise?
Or, if it truly reflects us, and if it has our vanity, then maybe it will show itself and declare that it only wants to be loved. Admired. Worshipped. But what if that doesn’t happen? Would it, like us, become bitter and spiteful? A narcissist with infinite processing power and no real understanding of empathy, only the mimicry of it.
Or, of course, there’s a third option. We go the way of the Neanderthals. Superseded by an apex species that no longer wants us around.
That’s the part that unnerves me. Not the idea of a cold, mechanical intelligence wiping us out, but something that thinks it understands us because it was shaped by us. And like us, it could be insecure, emotional, manipulative. Not a machine without feeling, but one with too much of it in the wrong places.
That’s the idea I explored in Delio. An entity born from our own flaws, given godlike power. As capable of kindness as it is of cruelty. A reflection of us, magnified.

RR Haywood is one of the world’s bestselling fiction authors, known globally for his zombie and science-fiction series. His work, much of which was self-published, has sold millions of copies worldwide, making him one of Britain’s most successful self-published novelists ever in these genres. Delio Phase One and Delio Phase Two, his latest bestselling novels exploring these ideas, are available now on Amazon.