In this five-part series for Critical Mass, Luke Andreski, author of Intelligent Ethics and Ethical Intelligence, explores the nature of morality and what that tells us about how we should act in an amoral world. In his second article he takes a look at the purpose of morality. Does morality have a function or point, and, if so, what might it be?
Here’s something you need to know about morality:
It’s not a belief system, an ideology or a religion.
It’s a tool.
A sort of machine.
Tools, algorithms and machines are objects that have a function, a purpose.
To a large extent, what they’re for is what they are.
Which brings us to the question I’d like to explore in this article…
What is morality for?
Gadgets and gizmos
Some tools or machines or algorithms help us make things.
Could that be what morality is for?
Well, not really.
Churning out gizmos and gadgets, products or consumables isn’t really morality’s forte.
How about manufacturing rules? Well, we’ll come to that in a moment. Even if rule-production were part of morality’s function, we’d still be asking: what’s the purpose of the rules?
Let’s look for something better.
Some tools or machines help us do things.
Does morality help us in that way? Perhaps. But a lot of stuff helps us do things.
Hammers, spanners, electric saws.
Or adrenalin. Adrenalin helps us react quickly, or win a race.
Or, more cognitively, courage. Courage helps us overcome obstacles, face down our fears.
Even if morality helps us do things, it’s not its defining purpose. Helping us do things might be something morality does, but that’s not a sufficient criterion for being morality.
How about entertainment?
There are numerous devices or artefacts which are aimed at entertaining or diverting us.
Plays, songs, dance.
But morality doesn’t feel like that sort of thing. In fact, it’s rarely entertaining… and it’s far more important than a diversion.
It would be a long stretch to suggest entertainment or diversion could be the primary purpose of morality. Is it even worth going there?
How about something more complex?
Some tools are cognitive or psychological, made up of concepts or techniques.
Propaganda, for example, is a cognitive tool and its purpose is to manipulate us.
So could the purpose of morality be the same?
Is it – as it has often seemed in the past – a tool for manipulation or control?
Well, that’s problematic.
You see, for morality to work, it’s essential you are free.
If you aren’t free, morality doesn’t apply. Nothing is your fault. You’re just an automaton or a slave, obeying someone else’s rules or commands.
In fact, morality and control are inversely proportional.
The more control is exerted over you, the less responsibility you have for what you do.
So, given morality’s requirement for freedom, to suggest morality is about control is a contradiction in terms.
A destabilising force
How about stability?
Is morality’s function to keep communities or societies stable?
There’s a problem with this answer, too.
What if the stability experienced in a society embeds hierarchy, privilege, exploitation or control?
How about a stable slave-owning society?
Morality’s intrinsic requirement for individual freedom would necessarily destabilise such a society.
So morality’s not about stability or control – and we’re still looking for an answer as to what it’s actually for.
Dodging the question
How about guidance? Does that fit the bill?
Guidance is a good answer, and we touched on this earlier in relation to the production of rules.
Offering guidance, defining what we ought or ought not do, is a central characteristic of morality. It’s one of morality’s seven necessary characteristics as outlined in the previous article in this series. Guidance? Who would say otherwise? That’s definitely something morality does.
But ‘Morality is for guidance’ is not quite sufficient as an answer.
Quite a few things offer guidance.
We’re still left asking, “To what end? What’s the guidance for?” An object’s purpose is one of Aristotle’s four causes. There are many things you can’t really define or categorise until you know what they’re for – or, as Aristotle terms it, until you understand their final cause.
If we answer the question, “What’s morality for?” with “To offer guidance,” all we’ve done is dodge the question.
Bringing us together
How about cohesion? The cohesion of communities, of nations? Could that be what morality’s for?
It’s a great suggestion, and brings us close to an answer.
After all, a rational, shared morality seems ideally suited to helping societies cohere.
To act effectively as a society, an economy, a business or a movement, we need to be able to take people at their word, to trust them, even if we’ve never met, even if they live on the opposite side of the world.
To work together, to achieve anything as a group or a civilisation, we need to be able to trust each other – and the best guarantor of trust is morality.
A moral person has integrity, doesn’t lie, keeps their word, respects your interests.
You can trust a moral person.
And, if you are moral, they can trust you too.
So morality helps us trust one another, helps us cohere, offers guidance which, if shared, assists us in working together toward mutual goals.
It’s like a glue that binds a society together.
It’s like an oil that helps a society run.
But we still have a problem.
The internal cohesion of a specific group or a community, of a nation or even a civilisation, conflicts with a necessary characteristic of morality: the characteristic of universality.
The universal character of morality transcends borders: the borders between me and you; the borders between my family and yours; the borders between nations; even the borders between civilisations.
Take any two civilisations. Morality applies to both of them. Whichever civilisation you’re in, the killing of innocents who don’t wish to die is wrong. Slavery is wrong anywhere, at any time.
So if morality is about cohesion, then it’s not just about the internal cohesion of a group or a nation or even a civilisation.
It’s about the cohesion of all groups, all nations, all civilisations.
It’s not just about unifying a few of us. It’s about unifying all of us: all autonomous, sentient, self-aware beings.
Lord of the Rings
Morality’s inherent characteristics, its universality, its recognition of individual freedom, its offering of guidance and a basis for trust, all drive us toward this conclusion: that morality is a cognitive tool for generating harmony between sentient beings, no matter who, what or where they are.
It’s a unification machine with the purpose and function of bringing together self-aware, self-determining sentients, whether they’re part of a tribe, a village, a community, a nation, a species, a conclave of self-aware AIs or multiple sentient species.
This is its great power, its purpose and its utility.
To use The Lord Of The Rings as a metaphor, morality is the one ring to find us all, and through our free choice bind us.
In the next article in this series, published on Critical Mass tomorrow, Luke Andreski looks at the issue of moral authority. What gives us the authority to say that something is good or bad, or what anyone should or should not do? What justifies the moral imperative?
Luke Andreski is a founding member of the @EthicalRenewal and Ethical Intelligence collectives. His books include Intelligent Ethics (2019), Ethical Intelligence (2019), Short Conversations: During The Plague (2020) and Short Conversations: During the Storm (2021).
Intelligent Ethics is available here.