The frustrating barrier to moral progress.
I hold some controversial moral beliefs.
I believe people shouldn’t eat animals. I believe in open borders. I believe capitalists taste better with a side of potatoes.
But when I try to convince some people of ethical beliefs, I face an interesting problem: unapologetic disregard.
I’m what philosophers call a moral realist. In other words, I think there are objective standards of right and wrong, and I’m prepared to defend my views rationally within that framework. For example, I believe that people shouldn’t eat animals because it causes needless suffering and harms the environment, both of which are facts that most people deem morally salient.
However, sometimes I meet people who profess to not care about morality whatsoever. These people claim to be true moral skeptics.
While I doubt their conviction (since they probably do act with regard to morality most of the time), I find it particularly annoying to debate with them.
They’re like the dorm-room philosophy bro who smokes weed and always says “Yeah, but how do you know you’re not just, like, in the Matrix, dude?”
I find it very strange that we all agree on mathematical precepts, but when it comes to the (arguably more important) questions of right and wrong, the consensus suddenly evaporates.
If I encounter someone who believes that 2 + 2 = 5, I can rightly chalk them up as an imbecile, a willing ignoramus, or a child who doesn’t know any better. But if I meet someone who believes that eating animals is alright, then I’m simply encountering a difference of opinion? That’s maddening to me.
I believe that there are universal principles of right and wrong, just as there are mathematical principles that can be discovered to be True with a capital T.
That’s not a popular belief, especially because it contradicts our innate sense that personal preference plays some role in what we choose to believe. I like kale, and you like spinach. I may believe it’s wrong to eat animals, but my friends don’t, and most people think that’s alright.
But actually it’s not alright, and the most frustrating part is that their erroneous beliefs can cause massive suffering in the world.
If we hope to actually change people’s minds on the issues that matter most, we have to persuade people that moral beliefs aren’t based purely in opinion. Morality isn’t like spinach and kale.
Unfortunately, there aren’t a lot of tools people can use to be morally persuasive. I can persuade people of physical truths about the universe by using scientific evidence and logic, but the toolkit for the work of moral persuasion is comparatively weak.
The most powerful method of moral persuasion, however, might actually be a feeling that most people think of as just a quirk of psychology.
Cognitive dissonance is the feeling you get when you believe one thing but do another. It’s the feeling of hypocrisy — the feeling of not acting in accordance with your values.
Cognitive dissonance is not just a quirk of our brains that can be disregarded. Sometimes it is the determining factor in the decisions we make as moral agents.
Returning to the example of eating animals: if you’re a person who is forced to confront the horrors of factory farming, the damning ecological effects of waste runoff, or the disgusting nature of inhumane slaughter on a regular basis, you’re much more likely to feel cognitive dissonance when you eat a Whopper.
Simply confronting the consequences of our moral beliefs can be enough to spark a guilty feeling of hypocrisy, and that can lead to momentous personal change.
The trouble is that people are pretty good at ignoring their cognitive dissonance when it’s not too powerful. What’s more, they can undergo a process of cognitive reconciliation that subdues the dissonance.
For instance, we may believe that it’s immoral for Jeff Bezos to have tens of billions of dollars to spare when he could easily spend that money to eliminate malaria. However, from Jeff’s perspective, he may not feel such intense cognitive dissonance because he’s likely gone through a process of cognitive reconciliation. He may now believe that his money could be better spent, or that he’s entitled to every cent of his wealth, or that people dying of preventable diseases is not his problem.
But to truly change the hearts and minds of the people we need to win over, we have to make their cognitive dissonance alarm bells ring so loudly that they can’t be ignored.
Many people, especially those who believe in moral truths, can be morally persuaded with logic and reason alone. But for those who don’t put stock in logos, we have to rely on pathos to change their minds: an appeal to their emotions rather than their reason.
For these people, it is necessary to have them confront the consequences of their moral choices and make them realize their complicity. It is not enough to express detached condemnation — we need to actively make them acknowledge their guilt. Only then will their cognitive dissonance win out over their cognitive reconciliation and force them to change their behavior.
Until then, we’ll be stuck with people who passively stymie the progress of our moral development as a species, and we’ll be mired in arguments with people who have already plugged their ears.