February 16, 2018
Musa al-Gharbi
Constructive Disagreement

Three Strategies for Navigating Moral Disagreements

This blog is now available in audio format. Check it out on our podcast, “Heterodox Out Loud: the best of the HxA blog.” Narration begins at 1:20.

In America, Western Europe, and by now many other places in the world, we have this idea of people as fundamentally rational. On this account, our profound cognitive abilities are designed to help us discover objective truths about the world through logical argument and empirical observation. Contemporary research in cognitive science, psychology, and related fields paints a very different picture.

For instance, we like to think that in the event of disagreement, presenting facts or statistics, or appealing to rational standards, could help establish common ground that we can build from. In reality, unless and until common ground is already established, appeals to these kinds of allegedly objective criteria often polarize people more.

And when threatened or cornered by the evidence, rather than conceding, we often simply kick debates into the moral sphere, where claims become much more difficult to falsify. In these instances, not only does empirical evidence lose most of its force, but even arguments appealing to rivals’ own perceived interests can backfire.

Now, often in elite media circles or academic circles, these tendencies are discussed as the products of ignorance, a lack of cognitive sophistication, or the result of naïve dogmatism. “Those people” resist facts and logic, “they” are driven by prejudices and superstitions. “We”, on the other hand, are open-minded. “Our” beliefs are derived from evidence and facts. We are committed to “the truth” rather than being driven by ideology.

But here again, in many respects, basically the opposite seems to be true:

In fact, the more intelligent, educated, or rhetorically skilled people are, the less likely they are to change their minds when confronted with evidence or arguments that challenge their priors. There are two big tendencies driving this phenomenon.

The first is that, in virtue of knowing more about the world, being better at arguing, and so on, people are simply better equipped to find ways of punching holes in inconvenient facts, or to find reasons to justify ‘sticking to their guns.’ Indeed, one becomes more likely to really enjoy arguing – and to engage in political research and argument as a hobby – as one grows more intellectually and rhetorically capable.

The second is that, perhaps surprisingly – but also somewhat intuitively, if you think about it – highly educated or intelligent people tend to be far more ideological than the general public. They are more likely to be partisan, to be obsessed with some moral-political cause, or to use some intellectual framework or idealized model to interpret the world.

And while educated people may be less likely to discriminate against others on the basis of factors like race, they are significantly more likely to be prejudiced against people who think differently than them, or hold different ideological commitments.

Given that moralized debates are nearly impervious to all the tactics we’ve been trained to rely on – appeals to incentives and disincentives, appeals to logic or facts, etc. – I am going to detail three strategies that the literatures on moral psychology and cultural cognition suggest for making moralized disagreements more tractable, and debates more productive.

#1: LOWER THE PERCEIVED STAKES OF THE DISAGREEMENT OR CONFLICT

The more people see as “riding on” their being right, the less they’re going to be willing to change. So the first thing you should do, if you want to avoid having a conflict escalate into the moral sphere, or to help bring it back ‘down to earth,’ is to lower the (identity, reputational, normative, practical) costs of your opponent admitting that they may be wrong, or that you might be right. There are a few aspects to this:

Never sling pejorative labels. Don’t impute bad motives, either.

Someone need not be a bad person (selfish, sexist, racist, etc.) to disagree with you. They don’t have to be ignorant, stupid, brainwashed or crazy either. Given how complicated and uncertain many issues and phenomena are, for virtually any topic you can imagine, there is room for reasonable disagreement.

When it is insinuated or outright alleged that the real source of the dispute is some negative attribute the other person has, the conversation is unlikely to be productive.

Why not?

Well, first you are taking for granted that your interlocutor is certainly mistaken. If the only reason they hold their position is racism, then it is beyond the realm of possibility that they can be correct – in full, or even in part – right?

This suggests that you, yourself, may be arguing in bad faith: you want them to change their mind, and to consider that you might be right, but you are not willing to do the same. The person you are arguing with will generally return that sort of energy in kind.

When people sling these labels around they are also setting a very high reputational cost for agreement. If the only way someone can acknowledge a given fact is to implicitly concede that they were brainwashed or ignorant — that their former position was racist, etc. – then as a rule, people are just not going to do that. And it really doesn’t matter what the facts are. Because then it’s not about the facts, it’s about them. And how they see themselves, and how they are seen by others.

So always, always criticize positions rather than people when you can. And never associate holding a position with possessing some kind of mental or character defect.

People should not have to abandon their worldview to agree with you.

Acknowledging the reality of climate change doesn’t mean I have to vote Democrat in the next election, or that I should’ve done so in the last. Or on the flip side, recognizing that a particular state intervention seems to be less effective or efficient than the market doesn’t mean I have to take a neoliberal approach to other issues. One can support religious exemptions without being a conservative or even a believer.

People become much more likely to consider these sorts of possibilities when their broader sense of identity is not being threatened. Therefore, it is imperative to create a climate where your interlocutor can acknowledge a fact, or endorse a position, without abandoning — or feeling like they have betrayed — their identity commitments. More on this soon.

The facts of a case should be agreed upon first. Implications and applications come second.

Often we lump these together, “because climate change is real, we have to have strict regulations forcing the production of more efficient products, or radically increasing the costs of waste.”

It is unwise to argue with a skeptic in this way:

If the extent to which people contribute to climate change is already controversial to the person you are arguing with, and they think accepting climate change means they also have to accept massive social programs and coercive government interventions, then the whole package becomes a much tougher sell.

To stick with this example: first, work towards agreement about details like the reality of climate change, the extent to which people are driving it, how serious of a problem climate change represents, etc. Then talk about what to do about it, or how best to address it. Is there a viable market solution to mitigating the risk? Might state intervention be necessary? Might a more ecumenical approach be more effective (or even necessary)? Start small and build out.

To decrease the stakes of a disagreement, lower its visibility

With regard to digital forums, one way to lower the stakes is sometimes to continue the conversation in a less public venue – such as private email or direct messaging (or even a phone call or face-to-face meeting, depending on the situation).

In very public environments there is much more pressure to conform to one’s group, to virtue signal, etc. It is also far more embarrassing to admit you were wrong to the whole world than to a single person.

People are generally much more reasonable in more intimate settings. Therefore, one (often easy) way to lower the stakes of a debate is to decrease its visibility. This can also help foreclose the possibility of mob effects (and prevent derailments by others jumping into the conversation who are perhaps more toxic, extreme, less invested in the conversation or relationship, etc.).

Don’t demand too much from the conversation

Often, people go into conversations with unrealistic expectations of what can or will be achieved. There is an expectation that one side will be converted to the other’s way of thinking, or that they’ll both be swayed a bit and meet somewhere in the middle. This creates needless pressure to see one’s ‘opponent’ give some ground first, lest one ‘lose’ the argument. This is a bad way to go about most conversations.

Realistically speaking, in any one-off interaction with someone, it is highly unlikely they will undergo any major change in their views. In many cases, ‘meeting in the middle’ may be virtually impossible — just as some differences may not be reconcilable — because they don’t turn on questions that can be logically or empirically resolved in any decisive fashion (often these differences are the product of divergent ideological or identity commitments, major life experiences, etc.).

In cases of deep disagreement, the initial and primary goal should be simply to clearly understand where one’s interlocutor is coming from, and to be well-understood oneself.

In many cases, it is a major accomplishment just to walk away from a conversation knowing — in a concrete rather than merely an abstract way — that those on the ‘other side’ of a given issue are not necessarily stupid, crazy, ignorant or evil (see: asymmetrical motive attribution) — that there can be morally and intellectually defensible disagreement on the matter.

Indeed, clearing this initial benchmark is not only important for one’s own learning and growth, it can also facilitate converting critics and skeptics — to the extent that this is still desirable once one has a clearer understanding of another’s point of view (see strategies 2 and 3 for more on this point).

#2: APPEAL TO YOUR INTERLOCUTOR’S OWN IDENTITY, VALUES, NARRATIVES, FRAMES OF REFERENCE WHEN POSSIBLE

There is a style of argumentation, fashionable especially in academic and many media circles, of beginning claims with a statement along the lines of “as an African American…” or “as a Muslim…”

In principle, this approach is intended to signal one’s credibility and motives for staking out a position, while exposing some of the limits of one’s perspective, or some of the biases one may have. Many think that being upfront about arguing “as a black Muslim,” etc., can help build trust or understanding between people.

In practice, however, what it does is make the differences between people more salient — and pushes both parties towards positions that are in line with what they are “supposed” to think as a member of whatever groups they identify with. In other words, it polarizes people and renders them less willing to compromise — because the disagreement then is not a logical or empirical dispute, but an identity conflict.

What is much more effective is to begin by appealing to superordinate goals or identities: “We’re both Americans, concerned about what’s best for our country, right?” or “We’re both parents, trying to look out for our children’s future…” etc.

Even more effective? Arguing your position from the ‘other side.’

Speak to people in their own language

If you are trying to convince an atheist that they should follow some course of action because it’s in the Qur’an, that’s not going to be so effective. If you are justifying your position by appealing to the performativity of race, or gender fluidity… really, anywhere outside of a university setting… then you’re just not going to be compelling to most people in most circumstances.

If you want someone to consider your empirical claims, assuming the facts are truly on your side, it’s actually a lot easier to be convincing if you cede the “home court” advantage. Otherwise, one thing you may be arguing about, in addition to the facts, is the framing.

So, for instance, if you are a conservative talking to a progressive, try to explain why, as a progressive, they might be able to find your position compelling.

Some important caveats:

1. You certainly don’t want to argue that THE correct interpretation of progressivism, Islam, or whatever ideology you are engaging with is to embrace the particular conception you’ve derived. That’s presumptuous, especially coming from someone who isn’t part of the group. Your assertion should be much more humble, something like “here is one way to possibly embrace this view which seems consistent with your other commitments.”

2. This requires some legwork. What you don’t want to do is end up caricaturing their position: that, of course, would be offensive. It would show, more than anything, how little you understand them – and how little effort you invested in grasping their position. It seems transparently manipulative to boot, and this will likely kill any goodwill they came into the conversation with.

3. On that front: don’t be disingenuous. Never say things you don’t believe in an attempt to come at things ‘from their perspective.’ Don’t pretend to be something you aren’t. That’s unethical and disrespectful — and if discovered, it will totally destroy whatever credibility you might have going forward. The conversation will be effectively dead.

So let’s say you are a progressive who anticipates arguing with a conservative about a particular issue. If you want to engage conservatives’ frames, you will actually have to do some research on conservative views about the matter. What are the arguments they deploy against your position? Is there anything you can find to agree with, or things you hadn’t considered that now seem pretty important? These can be great starting points for building zones of agreement. What, specifically, do you find troubling about their position? Why? Are there conservative dissenters who actually share your position on this matter? (For instance, there are conservative arguments for guaranteed basic income, single-payer healthcare, criminal justice reform, environmental protection, recognition of gay marriage, and constraining capitalism – for starters.) How do they make their case? What is the language they use?

Is it worth the effort?

Yes!

What I’ve just described may sound very demanding and intimidating — but truly, it is one hell of a ride. It will not feel like work: if you do a deep dive into a radically alternative worldview – with an open mind – that mind will. be. blown.

The exploration might at times be disorienting, frustrating, or triggering – but you will learn a lot. You might not abandon your own commitments, but you’ll definitely come to see things in a dramatically different way. At the very least, you will discover that your rivals are not crazy, stupid or evil – they have legitimate reasons for holding the positions they hold on many issues. That in itself – really internalizing that – can be huge.

And I should add, you don’t have to aim for fluency with another view, just competency.

Think of it like travelling in a foreign country: if you are not perfect in observing local customs, or if you fumble around with the language a bit — but people can see that you are sincerely trying, and you put in some effort – they aren’t usually offended by your mistakes. They think it’s charming that you are working to meet them, rather than expecting them to speak your language or conform to your ways of doing things. And so they’ll usually let little things slide, or even help you out with regards to language or customs, etc.

For that matter, it’s likely that the people you interact with won’t speak their native language perfectly; they might not engage fully in all customs, traditions, etc. What this means with regards to engagement across moral or political differences is the following: you don’t have to aim to be an expert on critical race theory, Christianity or whatever other alien frame your interlocutor appeals to — because they likely aren’t hardcore ideologues themselves.

Nonetheless, research shows that people become much more willing to reconsider or even change their views, to accept controversial facts, etc., when these are presented in terms of their own values, commitments, and frames of reference (see here, here, here, here, or here to get started on that literature).

Again, one reason why this works is because it lowers the stakes of the disagreement: it underscores that they don’t have to abandon their identity or commitments to work with you. However, it also shows good faith on your end — you value their perspective enough to actually put in some time to understand it, and to really listen and to seek out common ground.

#3: LEAD BY EXAMPLE. MODEL CIVILITY, FLEXIBILITY, INTELLECTUAL HUMILITY, AND GOOD FAITH IF YOU WANT OTHERS TO DO THE SAME

Basically, it boils down to the golden rule. In a good-faith conversation, both parties should be alive to the possibility that they may be wrong – in part or even in full – and both parties should enter prepared to change their minds. It is unreasonable to expect or demand that others change their minds in response to arguments or proof if you are not sincerely prepared to do the same.

A good exercise is to sometimes ask yourself, “Why do I believe this? What would cause me to change my view on this? What don’t I know about this topic that might be important?” If you don’t think there is anything that could cause you to change your position on a topic, this is a sign that you might not be engaging in good faith.

Put another way, ask what you want out of the conversation. Be honest with yourself: if the goal is not to follow the truth wherever it leads, but instead to advocate for a particular position, then the conversation is not likely to be as productive. And you may be depriving yourself of an opportunity to learn, grow, and build relationships in the process.

The reality of the matter is that most of us are not specialists on the issues we tend to argue with people about (and even if we were, it wouldn’t necessarily put us beyond legitimate criticism or doubt! Experts are people too, after all). In most cases, at best we may have read a couple of articles or something – and not scientific articles, mind you, but typically articles in the popular press.

For instance, if you are trying to argue with someone about climate change, ask yourself – how familiar are you, really, with climate science? Have you read journal articles or scientific books on the matter? Have you taken courses on climate science?

Here’s a fun fact: climate change deniers tend to be, on average, more knowledgeable about climate science than those who embrace climate change as a serious threat. This, too, is somewhat intuitive when you think about it: most of us are perfectly happy to defer to the apparent scientific consensus. So we don’t read the literature; we just say, “Well, the scientists believe this, and I trust them.”

However, those who are disinclined to trust the consensus position, and who strike an oppositional stance, know they’ll have to justify it. They know their position is going to be unpopular, and they know they’ll be bashed as ignorant science deniers. So they are more likely to actually read stuff – including actual scientific literature. They will be more motivated to get familiar with the big issues, to identify apparent weak points, gaps, and contradictions in the literature, and to identify dissenters and their arguments.

What then becomes frustrating to a lot of deniers is that they’ll engage with climate change die-hards who clearly have less exposure than they do to actual climate science – but who talk down to them as though they know more. When very often, they don’t.

The advice: don’t appeal to what “science says” in an argument if you are not personally familiar with that science. In fact, I suggest that you actually admit that you haven’t read the literature if you haven’t. It shows honesty, humility, and good faith.

Even tactically speaking, it can help shift the burden of proof: if your opponent claims that they have read the literature, and that’s why they hold the position they hold – great! Take the opportunity to ask them for some things to read from journals or other credible sources. If they were bluffing, it will become immediately obvious.

But if they are serious, and do provide you with credible content, you should read it when you can and reply! After all, if you are passionate enough about an issue to argue about it, you should be passionate enough to do some extra research, right? At the very least, familiarity with these sources and arguments will prepare you for future discussions with other people. But you might even learn something!

If a question is put to you that you don’t have an answer to, don’t be afraid to say, “I don’t know. That’s a good question. Let’s look into that together. Where do you think we should start?” This shows that you are sincerely open-minded and engaging in good faith. It creates an opportunity to build out from shared space (if you both consult the same source), to continue the conversation, and to learn more about the issue at hand.

Emphasize when your interlocutor makes a good point. If you learn something interesting through the conversation, say so! If they put forward a compelling idea you hadn’t thought about, acknowledge it! If you think your initial position might have been mistaken, own up to it. Don’t think in terms of scoring points. You don’t ‘win’ an argument by getting the other person over to your own side – you win by getting a better handle on the truth.

Don’t let your emotions get the best of you

One major contribution of Jacques Derrida, Michel Foucault, Judith Butler, et al. has been to underscore the power that words and symbols can have over us. However, much of this influence results from our ceding power to them in the moment of encounter. We have some measure of control over if, and how, we allow words and symbols to affect us – and with discipline and practice, we can gain more.

This matters because although emotions often do convey important information, they frequently mislead as well (just like all of our other faculties). Sometimes our initial emotional reaction is not the right one – as becomes clear with a little time and distance. Often our reactions are the result of hearing what we want to hear, or otherwise misperceiving or misinterpreting a claim. In the heat of the moment, people can also use clumsy language that could (and otherwise would) be more careful or precise – but which need not derail a conversation. Asking “what do you mean by that?” or “why do you say that?” can often go a long way towards clearing up misunderstandings or defusing an initial threat response.

Other times, of course, people are trying to get ‘under someone’s skin’ or put them off balance. In these instances it is especially important to be attentive to – and in control of – one’s emotions. Don’t take the bait! Keep focused on what matters, and try to steer the conversation in a more productive direction. Granted, this will not always be possible. If one’s interlocutor seems committed to engaging in bad faith, consider disengaging. Either way, one need not (and probably should not) give people the reactions they are angling for when they’re ‘trolling.’

In closing, however, I should warn against a peculiar danger that menaces those of us who study cognition or psychology. It’s the temptation to think something like this:

“I understand biases and am aware of them – therefore I have accounted for them or may even be immune to them.”

And its corollary:

“Other people, however, are not so self-aware. This is the problem: they need to be made aware of how biased they are.”

There is an obvious analog in religion that may be worth drawing out a bit: in principle, knowledge of one’s own weakness, fallibility, and flaws should inspire greater humility and grace towards others. In practice, however, a deep sensitivity to sin often renders people more judgmental. We all know the type: “Rather than being attentive to my own shortcomings, I’m going to really rub yours in. Because at least I know I’m a sinner; you don’t seem to be sufficiently aware. If anything, I’m trying to help you by showing you how broken you are.”

This is precisely the sort of deflective mindset that must be guarded against at all times! Studying these phenomena doesn’t make us ‘better’ than anyone else – these mental tendencies apply just as much to ‘us’ as to ‘others.’ Indeed, I regularly catch myself slipping into less-than-ideal patterns of communication, especially in my private life or on social media — and then I (usually) try to correct course when I can.

In short: the only way that knowledge of these problems becomes helpful is through the actions we take in response: Actively seek out disconfirmation. Stay alive to the possibility you may be wrong. Try to see things from other people’s perspective. Or to stick with the religious metaphor, practice what you preach!
