Going on a Moral Diet

Published by timdean

It’s a well-established fact that there is no such thing as a free lunch. This is particularly the case when that lunch consists of deep-fried chicken followed by a couple of glazed doughnuts and a Coke.

We love sweet and fatty foods (although, as Dan Dennett points out, we don’t desire them because they taste good; they taste good because we desire them). And even though they’re contributing to an epidemic of obesity today, it’s a damn good thing that we do love the sweets and the fats. Had we not vigorously pursued such energy-rich sources of nutrition throughout our evolutionary past, we might not have made it to the point where today’s obesity epidemic was even an option.

Simply put, an evolved taste for sweet and fatty foods – in the form of a strongly reinforcing sensation of pleasure in response to sweet and/or fatty foods – was adaptive because our highly active endothermic bodies with their calorie-burning brains required vast amounts of fuel to keep them hunting and gathering and surviving and reproducing, etc.

And up until the last few moments of our evolutionary history, we were far more likely to be undernourished than overnourished. That being the case, the cost of getting it wrong by consuming too much energy was lower than the cost of getting it wrong by not consuming enough. Hence a selective pressure in favour of our sweet/fatty tooth.

Yet today we can see all this. We can acknowledge that allowing our evolved sweet-and-fat-seeking psychological impulses to take over can lead us to unhealthy ends. The heuristic now points in the wrong direction. We understand that we need to inhibit our evolved impulses and steer our behaviour towards more appropriate ends in today’s environment. We understand that sometimes we need to consciously manage our diet.

Maybe we need to do the same when it comes to some of our evolved moral impulses.

Just as we evolved our sweet/fatty tooth, we evolved a number of psychological mechanisms that have helped us live more successfully as part of social groups. These include our moral emotions – empathy, guilt, righteous anger. We also have a tendency to rapidly identify people as members of our in-group or out-group, and to treat them differently.

We have an innate inclination towards fairness, backed up by an inclination to punish those who cheat and act unfairly. We even have an evolved tendency to produce, spread and conform to behavioural rules, punishing those who deviate from them.

These tendencies don’t produce all moral behaviour – culture plays a leading role in informing our moral beliefs and directing our moral judgements – but these evolved heuristics do contribute a great deal to how we perceive and respond to the world on a moral level.

Like our sweet/fatty tooth, these mechanisms and heuristics may well have been adaptive at various times in our evolutionary past, but today they can misfire and produce behaviour that is disruptive to harmonious social living.

For example, our tendency towards tribalism is deep-rooted. We are highly sensitive to subtle markers that identify an individual as “one of us” or “one of them.” Even something as slight as a difference in inflection or dialect can flag someone as an outsider, let alone more conspicuous markers like clothing or skin colour. This in-group/out-group detector was likely a useful tool when tribes and bands competed over territory and resources, and trusting the wrong person could have fatal consequences.

However, today our tribal instincts contribute to a great deal of out-group discrimination, vilification, racism, xenophobia and even outright violence – even when those outsiders pose little or no threat. Studies have shown that we all have an innate preference for those who appear more like us, and an innate bias against those who appear different. This doesn’t mean we always act on this bias – indeed, we have other cognitive mechanisms that kick in and sometimes correct for it before we act. But these other mechanisms don’t always work.

We also have a tendency to punish those who we feel have wronged us or someone we consider to be morally worthwhile. This, too, may have been adaptive in our evolutionary past. Many cooperative endeavours involve an element of trust. You help me today and I help you tomorrow. That way we both benefit. However, there’s a temptation for me to accept your help today and to conveniently disappear when it’s my turn to reciprocate. Similar dynamics occur across many interactions, as modelled by game theory.

There is evidence suggesting that we are particularly sensitive to such dynamics, and we are quick to experience outrage and seek to punish those who cheat or free-ride. This punishment tendency has been shown to help regulate and promote cooperative interactions across a population by weeding out free-riders, or converting them into cooperators. But the heuristics behind the punishment can also go horribly wrong. If someone acts unfairly (deliberately or accidentally), they can be punished, and might then retaliate themselves, leading to a tit-for-tat cycle of punishment – i.e. a feud.
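
To make that dynamic concrete, here is a minimal sketch (mine, not from the post, with purely illustrative agents and a hand-picked round for the “accident”) of how two agents, each following a fair tit-for-tat punishment rule, can lock themselves into a feud after a single unintended defection.

```python
# A toy sketch only: two agents play repeated rounds, each following a simple
# tit-for-tat rule (cooperate first, then copy whatever the opponent did last
# round). One forced "accidental" defection is enough to start a feud.
# The number of rounds and the round at which the accident occurs are
# illustrative assumptions, not anything from the post.

COOPERATE, DEFECT = "C", "D"


def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's previous move."""
    return COOPERATE if not opponent_history else opponent_history[-1]


history_a, history_b = [], []
for round_number in range(12):
    move_a = tit_for_tat(history_b)
    move_b = tit_for_tat(history_a)

    if round_number == 3:
        # A intends to cooperate but "accidentally" defects: the unintended
        # slight that triggers the cycle of retaliation.
        move_a = DEFECT

    history_a.append(move_a)
    history_b.append(move_b)

print("A:", " ".join(history_a))
print("B:", " ".join(history_b))
# Prints:
# A: C C C D C D C D C D C D
# B: C C C C D C D C D C D C
# After the single accidental defection, each agent keeps punishing the other's
# previous punishment, even though both are following a "fair" rule: a feud.
```

The point isn’t the particular rule but the structure: once each side treats the other’s last act of punishment as a fresh offence, the cycle sustains itself without any further genuine wrongdoing.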

Our tendency to produce behavioural norms and enforce them through punishment is another effective mechanism for promoting social and cooperative behaviour. However, the perceived strength of these rules – and the fact that “moral” norms tend to be perceived to be more important than, and to override, other norms – can lead to a kind of objectivism about morality.

We tend to see the moral rules we live by as being the “right” ones, and see others as being “wrong.” Even if another moral system is quite effective at promoting social and cooperative behaviour in its own right, the fact that it differs from ours can cause us to lash out and attempt to “correct” its adherents by punishing their perceived moral transgressions. Religious and ideologically inspired conflict ensues.

Not all of our moral inclinations are bad for us. But some are. Like our sweet/fatty tooth, we can now see how our evolved moral psychology operates, and we can decide whether it’s leading us to behave in ways that advance harmonious social life or whether it is proving corrosive and causing conflict. If the latter, we can decide to go on a moral diet.

Evolution got us to where we are today. But there’s nothing that says we have to take the evolved heuristics of old and run with them into the future. If we decide that we value social and cooperative living, that we value harmony and want to avoid unnecessary conflict, then we can choose to inhibit some of our evolved moral inclinations.

We already do that to some degree through our existing moral systems. Norms that encourage hospitality to outsiders, norms that encourage us to suppress our innate biases, and institutions that do the punishing for us rather than letting us spiral into feuds are all examples of how we have overruled our evolved inclinations.

However, when it comes to our tendency towards moral objectivism, or to popular calls for harsher retribution in sentencing, our evolved moral inclinations still carry a lot of weight. Perhaps it’s time to cut those moral calories and choose to dine on leaner and more harmonious fare.


1 Comment

Mark Sloan · 27th March 2014 at 12:10 pm

Tim, glad to see you post again.

Not much to disagree with here.

However, we should be careful not to throw out the baby (strong biological motivation to act unselfishly in in-groups) with the bathwater (unfairness toward out-groups). (Of course, you did not advocate that, but some readers might think it was implied.)

Most of our evolutionary history was spent in small groups whose survival often depended on successful competition with out-groups. So our in-group versus out-group markers and the universal ‘moral foundations’ based on them (in Jonathan Haidt’s terminology) of loyalty (to an in-group), respect for authority (an in-group’s authority), and purity (an in-group’s definition) can be highly motivating for self-sacrifice and punishment of ‘wrong-doers’. (In the US, the conservative Republican party is well known for shamelessly exploiting these ‘moral foundations’, though I doubt they understand the science behind their power.)

Given this evolutionary history, I see fairness between in-groups and out-groups as a critical moral norm for defining moral codes most likely to achieve social goals such as increased overall well-being.

That is, Peter Singer makes a serious error (with awful consequences if actually implemented) in advocating, as I understand him, abolishing out-groups and counting everyone (and even other animals?) as worthy of equal consideration within the same in-group. The abolishment of in-groups that merit special consideration, such as families, communities, and even nations, would be, I think, a disaster due to the reduction in biology-based motivation to act morally (unselfishly).

But by extending fairness, an in-group cooperation strategy, to also apply to interactions between in-groups and out-groups, we might do well and be much more likely to actually accomplish Singer’s goals than by eliminating the in-group versus out-group distinction.

By fairness I mean rules for interactions between in-groups and out-groups, in the sense of John Rawls’ fairness.
