Where Ethics Meets Science: Which Part of the Brain Controls Morality?

Every culture we know has developed some form of moral code. Although such norms often differ from those of the European tradition, they still serve as rules that govern social behavior. The omnipresence of morality provokes questions about its ultimate origin. Where can it be found? Does it lie in the brain?

Morality seems to make sense only in relation to mankind: we use its concepts to evaluate our own deeds but find it odd to qualify the actions of animals as good or bad. However, other primates also display behavior that could be read as a manifestation of human-like morality. For example, when two males from the same chimpanzee group clash, the females try to reconcile the feuding individuals. Wherever possible, they even prevent the fight altogether by taking stones out of the males' hands. Chimpanzees seem to sense that internal conflicts weaken the group.

The Brain’s Moral Center

Primates have the most developed prefrontal cortex of all animals, with humans taking the top spot. This is the youngest region of the brain in evolutionary terms. Its main functions include decision making, foreseeing the consequences of behavior, and restraining impulsive reactions. Experimental research and case studies indicate that the prefrontal cortex is actually the sought-for center of morality.

In the 19th century, Phineas Gage was among the thousands of workers building the American railroads. One day, he suffered an accident at work when blowing up rocks: an uncontrolled explosion sent a metal bar through his skull. Although he did survive and retained his intellectual capacities and physical strength, his behavior changed completely. Previously a gentle and good-natured man, he became an impulsive and brusque individual who refused to observe social norms and seemed to have lost his “moral sense.” His astonished friends maintained that he wasn’t himself any more, and the railroad company ultimately fired him. Decades later, science revealed the reason for such a stunning transformation: the metal bar had pierced right through Phineas Gage’s prefrontal cortex.

Radical behavioral changes were also observed in patients who underwent lobotomies, a controversial neurosurgical procedure that involves severing the connections between the prefrontal cortex and the rest of the brain. Lobotomy was once assumed to improve the condition of patients suffering from severe schizophrenia, and some of them did experience relief, but their behavior became similar to Gage's. They sank into apathy, lost their social inhibitions, and no longer observed the rules of social conduct.


The Difficulty of Moral Judgments

Morality makes an interesting subject of experimental research, in which participants are typically confronted with a lifelike scenario entailing a difficult choice between two options: a moral dilemma.

Imagine that an enemy army has entered your village under orders to kill all its residents. You have hidden in the basement of a large building together with the other people, and you can already hear the soldiers approaching to search the building upstairs. Suddenly, your child begins to cry, so you cover the child's mouth with your hand to muffle the sound. If you withdraw your hand, the soldiers will hear the crying, and they will kill you, your child, and all the other people hiding in the basement. This means you must smother your own child to save yourself and the others. Would you do it?

There is no good decision in such a dramatic situation; all one can do is choose the lesser evil. Some respondents ultimately side with the residents of the village, believing that one person can be sacrificed to save many—this kind of moral judgment is referred to as utilitarian. Many people, however, have qualms about it because of compassion, and the necessity of smothering one’s own child with bare hands makes the decision even harder. 

The dilemma above was actually used in a real study that included subjects with a damaged prefrontal cortex. Interestingly, they chose the good of the community and opted for the utilitarian judgment much more often than the other participants. Still, the difference appeared only for dilemmas that aroused strong emotions. The subjects with the damaged "center of morality" were free from the influence of impulses and hence able to calculate the profits and losses of either decision. Consequently, they reached the obvious conclusion that it was simply more "profitable" to save the entire group of people. The participants with undamaged brains found the choice much harder because their feelings struggled with their reason.

Such results correspond with the theory put forward by Jonathan Haidt, according to which humans don’t actually pass moral judgments in a controlled and well-thought-out manner. Instead, our decisions are made automatically and emotionally, and their rational justification usually follows later.


What Happens When We Mete Out Justice?

In a classic game of trust, two people work together toward a mutual financial goal. At the end, one of them can either sweep up all the money they have earned or divide it in half and share it with the other. One study examined what happens in the brain of a betrayed subject who is left empty-handed but receives permission to punish the other player by deducting some money from the final sum. Interestingly, it turned out that administering the punishment activates the brain's reward system. This suggests that we perceive personally punishing cheaters as morally right, or at least pleasant.

Does this mean that humans are righteous to the core? Sadly, not at all. In one study, subjects were asked to inflict pain on people of various races. The brain regions responsible for empathy were less active when a subject watched the pain of a person with a different skin color, and the tendency proved especially prominent in subjects with an unconscious racial bias. Evidently, human morality can be selective and is still strongly rooted in the atavistic dichotomy known as "us versus them."

Neurotransmitters at Work: Serotonin and Morality

Like the rest of the brain, the centers responsible for moral behavior would not function effectively without neurotransmitters such as serotonin. Research has shown that serotonin levels are often considerably lower in people who commit antisocial or aggressive acts.

Serotonin plays a vital role in the regulation of mood (its levels are low in people with depression). It also inhibits sudden impulses and is responsible for harm aversion. In one study, subjects were given drugs that increase serotonin levels in the brain and were then confronted with classic moral dilemmas. The elevated concentration of the neurotransmitter made them opt for utilitarian decisions far less often. In other words, they were reluctant to harm one person even when doing so would save many.

The experiment also shows, albeit indirectly, how crucial it is to take care of one's own mental wellbeing. With high levels of serotonin, we feel a considerable aversion to making others suffer. This observation is of fundamental importance for the golden rule of ethics, which we probably all support, whether more or less consciously: do not treat others in ways that you would not like to be treated. And if negation isn't up your alley, the most famous affirmative version of this principle was aptly formulated by Jesus: "As you wish that others would do to you, do so to them."

Racial Empathy Grid — ChatGPT / Midjourney / Maciej Kochanowski

Published by Szymon Cogiel

He became a psychologist to better understand the characters in the books he writes. For as long as he can remember, he has been fascinated by man and his place in the world.

