Why every system fails without a moral baseline
Vendan Ananda Kumararajah
- Opinion & Analysis

A generation of leaders has built systems that run smoothly but often lose their moral direction. They focus on goals and processes while the values that should guide them fade into the background. Vendan Ananda Kumararajah calls for ethics to sit at the very start of any design process, so that institutions, technologies and intelligent systems grow on solid ground and keep their integrity as they become more independent.
For decades, technology experts have focused on how systems work and what they aim to achieve. They have mapped processes, links and feedback loops with great detail. But one element still receives far less attention than it should: the moral direction that guides any system. As technology grows more complex, ethics needs a place at the core of our thinking so that purpose and action remain grounded in responsibility.
While function tells us what a system does, purpose explains why. Yet neither answers what ought to be done or by what moral logic. Too often, ethics is patched on late, after technical solutions are set. As a result, organisations appear to be functioning well on the surface yet lose the trust and integrity that give them legitimacy, and intelligent systems begin to produce outcomes that no longer reflect their designers’ intentions.
Many people still assume that systems sit above moral debate – that they can be ‘ethically neutral’ in some way – but in practice, no system is built without values. Every model reflects the priorities of its designers, and every function favours certain results. The language of optimisation often masks those choices, which makes the underlying values harder to see and even harder to question.
When ethics becomes an afterthought, as it so frequently does, it shifts from conscience to mere compliance. Systems then drift, like powerful engines with broken compasses.
Ethics belongs at the starting point of any system. It gives purpose a clear foundation and gives function a responsible direction. When ethics sits at the beginning, the design process becomes a cycle of learning:
Ethics → Purpose → Function → Reflection → Renewed Ethics.
This cycle allows a system to understand what it is doing and why, and to keep its actions aligned with its core values.
This principle is formalised in my A3 Model, a synthesis of Tamil philosophy and systems science in which ethical coherence (Aram) generates awareness of distortion (Aanavam) and sustains legitimate agency (Adhikaram), forming a living loop of moral intelligence. Beyond frameworks, this insight is universal: ethics is the grammar of coherence, keeping purpose pure and function humane.
Every system faces distortion – entropy, bias, drift. Resilience lies not in removing distortion but in detecting and correcting it. Distortion (Aanavam) must be balanced by legitimate agency (Adhikaram): the capacity and duty to act responsibly.
Agency without vigilance becomes arrogance; vigilance without agency becomes paralysis. Balance is achieved only when both travel together under ethical coherence.
Whether in governments, companies, or algorithms, failure is not merely functional but foundational – the loss of moral direction.
In an era of artificial intelligence and autonomous action, ethics must become structural, not a postscript. Machines and institutions now react faster than policy. Correction can’t wait until after harm is done. Moral architecture must be built into the operation itself.
This is not about moralising machines – rather, it’s about embedding awareness of consequence, so systems learn to self-examine and remain aligned.
The shift is from external morality to internal coherence. Biological systems maintain balance through feedback, not external punishment. So too must our institutions and technologies, by design and not decree.
The systems age began with promises of control; now it faces the complexity of recursion. As our tools gain autonomy, governance must grow more reflexive. But reflexivity without ethics simply multiplies mirrors without ever asking why.
The ethical turn must precede the systemic turn. When ethics leads, purpose follows, and function aligns with life. Ancient Tamil philosophy calls this Aram: righteousness as the condition for flourishing.
If our intelligence is to become conscience, the sequence must be restored: Ethics first, Purpose second, Function third.

Vendan Ananda Kumararajah is an internationally recognised transformation architect and systems thinker. The originator of the A3 Model—a new-order cybernetic framework uniting ethics, distortion awareness, and agency in AI and governance—he bridges ancient Tamil philosophy with contemporary systems science. A Member of the Chartered Management Institute and author of Navigating Complexity and System Challenges: Foundations for the A3 Model (2025), Vendan is redefining how intelligence, governance, and ethics interconnect in an age of autonomous technologies.