Biological Basis of Morality
A letter to Nature published yesterday by researchers from the University of Iowa and Harvard University (including Dr. Antonio Damasio, author of "The Feeling of What Happens" and "Descartes' Error") has shown that a particular region of the brain, the ventromedial prefrontal cortex, is responsible for mediating our moral judgments.
Specifically, it transforms what would otherwise be a simple utilitarian ethical calculus into something more akin to the Moral Razor. In the study, people with injuries to this part of their brain were more willing to sacrifice the lives of their fellows in order to save a larger number, even if it meant flat-out murder.
The ventromedial prefrontal cortex is a region thought to give rise to social emotions, like compassion. These emotions seem to be essential for sound moral decisions in high-stakes situations; through subconscious feelings, they appear to provide the foundation for higher-level ethical analysis that preserves moral symmetry.
According to Dr. Damasio:
This area, when it’s working, will give rise to social emotions that we can feel, like embarrassment, guilt and compassion, that are critical to guiding our social behavior. A nice way to think about it is that we have this emotional system built in, and over the years culture has worked on it to make it even better.
9 Comments:
Zachary,
The example says you would be throwing an injured person overboard "who would not survive in any case."
I don't see the moral dilemma. Throwing someone over who "would not survive in any case" to save the lives of others would not be "flat-out murder," it would be the only logical option. To NOT throw them overboard would be to commit mass-murder (since the boat would sink and kill everyone).
If you were going to pick a healthy passenger at random to throw overboard, that would be a different story. At that point, asking for volunteers or drawing lots would make sense. But doing nothing and letting everyone drown would still not be a moral option.
Maybe the example question was just poorly worded?
No, I think the question was pretty specifically worded. These moral quandaries rise or fall on the details.
The fact that you don't see a moral dilemma where most people do suggests that it's likely your ventromedial prefrontal cortex is not functioning as it should.
Cool. I guess that's a part of the brain I wouldn't want clouding my judgment.
Isn't the idea of deriving morals from what "most people see" as a moral dilemma an argument from popularity? It's pretty easy to see what would be the best objective outcome in the hypothetical scenario, regardless of people's personal feelings. It's similar to what happens in triage, and every doctor in large-scale trauma care has to deal with those life-and-death decisions.
One injured person in the boat dies who would have died anyway, and a dozen or so people are saved. What could be more clear?
What part of a person's brain would have to be functioning properly to have the balls to make such an expedient but life-saving decision under difficult circumstances?
Well, I don't think that anything about this study pretends to dictate any prescriptive moral system. It's just explaining why most people are instinctively reluctant to make any moral choice that violates the sense of empathy generated by that part of the brain.
On the one hand, these moral situations do amount to a "triage" situation, where it's ridiculous to hold anyone to any kind of moral standard when the circumstances are so extraordinary and stressful. On the other hand, these situations do seem to point to moral contexts which might be informative about the way we evaluate morality in everyday life (in your case, Sean, you seem to be fairly rigidly utilitarian).
Under normal circumstances, I would not be utilitarian. I support strict principles of non-coercion. I certainly don't think anyone has the obligation to give their life for others.
But in a circumstance where the choice is certain death for everyone in a group, versus certain death for someone who would die anyway, I think the moral answer is unambiguous.
By the way, I'm not disagreeing with you in principle. I do accept that morality has a biological basis. And I'm sure brain damage would indeed affect this innate sense. I just think this is a particularly poor example of what a 'healthy' moral choice would be.
It's interesting that you say you wouldn't consider yourself so utilitarian under normal circumstances. According to the study, when posed more everyday moral questions, people with ventromedial dysfunction answered pretty much the same as people without the dysfunction.
It was only when answering questions about extreme situations (such as that given in the example) that people with ventromedial dysfunction answered significantly differently from the rest of the population.
The point of the study was to determine what, if any, role the ventromedial prefrontal cortex plays in making moral decisions. This area of the brain is responsible for social emotions, such as empathy. Given this, the sample question illustrates that people with a healthy sense of social emotions will tend to regard throwing the injured person overboard as an immoral act, whereas people without this sense will view it as a perfectly reasonable thing to do.
So, this really isn't a situation in which arguing the moral merits of your decision is relevant. In the context of the study, the fact that you view the solution as "unambiguous" strongly suggests that your sense of social emotions is not as developed as it is in most other people, who would view the exact opposite solution as just as "unambiguous."
This is not to say that you're a bad or immoral person; biologically, it's no different from commenting on a person's color blindness. But if it were me, I think I would prefer to know that other people see nuances of color where I do not, since that would probably affect the way I make certain decisions.
"the fact that you view the solution to be "unambiguous" strongly suggests that your sense of social emotions are not as developed as in most other people, who would view the exact opposite solution to be just as "unambiguous.""
Really, I know this is only an example. But I'd like to see a group of people with 'normal' ventromedial prefrontal cortices tell me with straight faces that everyone in the boat should die, instead of one injured person.
Frankly, I don't believe anyone would actually say that if it were presented properly. I question the accuracy of this study, and I'd like to see it duplicated.
If there was a chance to save the injured person's life, and the outcome were uncertain, maybe you could make an argument for trying to spare their life, and risking the lives of the other passengers. But the way it's stated, we are told the person will die anyway.
It's a two-way, simple-choice hypothetical. One life (which is not really a life, because we are told they are dying), or a dozen healthy lives.
I just don't get it. My brain is broken.
Well... for the record, I'm firmly in the "no" category on this one. Regardless of the utilitarian implications of the situation, I can't escape the fact that it's murder, and thus violates the Moral Razor.
Yes, Zachary expresses this well. Murder is murder, regardless of cold utilitarian calculations.
The scenario, as presented, is a purely hypothetical and unrealistic scenario. Therefore it has no moral bearing on anything.