People are more moral than they think


People may say that when an opportunity arises, they would choose to act morally. The question is, would they really do so? Are people actually as moral as they think they are? That is what researchers set out to test.

Rimma Teper, Michael Inzlicht, and Elizabeth Page-Gould of the psychology department at U of T Scarborough have just published a study on human morality in Psychological Science, a journal of the Association of Psychological Science.

The study was conducted on 67 U of T student volunteers, divided into three groups, who each had to take a 15-question math test. The first group was promised $5 if they answered 10 or more questions correctly. They were also told that a “glitch” in the software would cause the correct answer to show on the screen if they hit the space bar, and that only they would know if they hit it. The second group was told about this same opportunity to cheat, and then asked to predict whether they would cheat or not. The third group took the test without any chance to cheat.

During the study, electrodes measured each participant’s heart rate, breathing rate, and level of palm sweat, all of which increase when a person experiences heightened emotions. Those facing the real dilemma were in the most heightened emotional state. These emotions about the moral dilemma apparently influenced them to do the right thing and refrain from cheating, leading the researchers to conclude that emotions influence moral behaviour.

The volunteers asked only to predict their actions were calmer, and were more likely to say they would cheat, the study found. The participants who took the test with no opportunity to cheat were calm as well. On average, people in the predictor group said they would cheat on five of the 15 questions. In the moral dilemma group, members cheated on only one question on average, even though their physiological responses were higher than those of the other two groups.

However, “If the stakes were higher—say, the reward was $100—the emotions associated with that potential gain might override the nervousness or fear associated with cheating,” said Teper in a statement.

A few previous studies have found just the opposite: people do the right thing less often than they predict.

“This time, we got a rosy picture of human nature,” said Inzlicht, an associate professor of psychology, in a journal news release. “But the essential finding is that emotions are what drive you to do the right thing or the wrong thing.”

Emotions appear to be the “missing link” between moral reasoning and moral actions, particularly fear, guilt, and love. Fear tends to be the predominant emotion, causing people to change their minds at the last second. In this study, the hypothesis is that the fear of getting caught made the subjects refrain from cheating.

While emotions caused the students to make the moral decision not to cheat, emotions could easily play the other way in other situations, Teper said.

For example, one might decide to confront someone and tell them the truth, only to decide at the last second to lie instead.

It turns out people are not good predictors of what they will do when actually placed in the hot seat. When people are contemplating how they’ll act, “They don’t have a good grasp of the intensity of the emotions they will feel,” says Teper, “so they misjudge what they’ll do.”

“I think the take-home message of the study is emotions, whether they’re moral emotions or they’re self-serving emotions, are really what will drive the decision you make,” the researchers concluded.

“We have to be careful when looking at what people say they might do and what people think is right because it might not always translate to real-life behaviour.”

As for future research, Teper says, “We might try to turn this effect around and see how emotion leads people to act less morally than they forecast.”