August 16, 2018
By Musa al-Gharbi

Vox’s Consistent Errors on Campus Speech, Explained

The Free Speech Project (FSP), based out of Georgetown University, attempts to document “incidents in which Free Speech has been challenged or compromised in recent years, and collect analysis from various points of view of the struggle to sustain First Amendment Values.”

This is a great initiative. In fact, Heterodox Academy recently ran a post emphasizing the importance of projects like this: if we want to understand the scope of the problem – and how it is evolving – we need to be looking at behaviors, rather than whether or not people say they support free speech in polls. Many who strongly support civil liberties in the abstract are perfectly happy to declare “states of exception” against those they detest in the “real world.”

So far, FSP’s “Free Speech Tracker” has pulled together 137 incidents of (attempted) free-speech suppression. Nearly 2/3 of these (88) occurred on college campuses.

Surveying just over 90 incidents from this database in a recent Medium post, FSP Director Sanford Ungar came to the following conclusions:

  1. Most of the incidents on campus do seem to target people on the right.
  2. However, of those targeting right-leaning speakers, the majority of incidents seem to concern the same four people: Milo Yiannopoulos, Ben Shapiro, Charles Murray and Ann Coulter.
  3. Many on the left were also targeted in campus incidents.

Self-consciously speaking just a little beyond his data, Dr. Ungar suggests that if conservatives focused more on inviting right-leaning intellectuals, rather than visitors who have “built a brand of disruption,” they would likely face fewer incidents of suppression. He also posits that, regarding the myriad laws being passed to help “fix” institutions of higher learning, these “cures” often seem to be worse than the “disease” they are trying to address.

Many in Heterodox Academy share these sentiments. In fact, HxA Research Associate and NYU Law student Nick Philips has argued both of these points in recent publications: campus conservatives must check their own trolling; attempting to legislate away the socio-political tensions within universities is probably a bad idea. FYI: Nick is a conservative. These do not have to (and should not) be partisan issues.

A few things are critical to note here:

Dr. Ungar’s analysis had nothing to say about the overall prevalence of these incidents. Its purpose was just to look at who is targeted and under what circumstances. Even for this very narrow set of questions, he emphasized that his analysis was “preliminary,” “not necessarily” representative, and that “much data collection and analysis remain to be done.” Additionally:

“The selection of incidents documented on the Free Speech Tracker is by no means comprehensive and never will be. The results thus far are anecdotal, and despite efforts to avoid it, there could be a selection bias in the incidents chosen. And the classification of people whose expression is compromised as being on the ‘right’ or the ‘left’ may well be subjective and arbitrary.”

In light of these realities, it would seem absurd to infer anything about how common free speech incidents are overall on the basis of Dr. Ungar’s preliminary data… Yet that’s exactly what Vox recently tried to do.

What does “the data” say?

On August 3rd, Vox Senior Reporter Zack Beauchamp published a story on campus free speech.

The essay drew from two sources: Ungar’s aforementioned Medium post, and a recent essay by Acadia University political scientist Jeffrey Sachs discussing faculty firings.

With regard to the former, remember: Dr. Ungar selected a sample of about 90 incidents from his full dataset of 137. Within this sample, there were 60 incidents that happened on campus. But 60 was not the total number of incidents on campus – not even in the FSP dataset, let alone in the country as a whole. And Dr. Ungar cautioned against making strong inferences on the basis of his data, even for the questions he was trying to answer – let alone for totally unrelated questions his data do not speak to (such as, “how prevalent are these incidents?”).

Nonetheless, Beauchamp concludes (on the basis of what seems to be a fundamental misunderstanding of the data), that there really is no free speech problem at universities. This is a strange takeaway, given that Dr. Ungar himself reached basically the opposite conclusion in the very report that Beauchamp based his story on:

“A preliminary analysis of more than 90 incidents… offers evidence that speech and ideas are being suppressed across the political spectrum, and not only on college campuses, but also in U.S. civil society at large.”

Why did Dr. Ungar draw such a starkly different conclusion? Because he understands that the 90 incidents he surveyed, and even the full 137 in the FSP database, are just scratching the surface on the total number of incidents nationwide. Again, he repeatedly emphasized that the FSP database was nowhere near comprehensive (indeed, he was analyzing preliminary data for the project in his post).

One could compile hundreds of additional incidents on college campuses from the last few years – starting with the sources Beauchamp refers to in his own essay:

Sachs’ database on faculty firings includes 58 incidents to date, some of which are not captured in the FSP database. Heterodox Academy’s Guide to Colleges includes myriad (good and bad) incidents from the U.S. News & World Report “Top 150 Universities” and “Top 50 Liberal Arts Schools.”

If he really wanted to get a sense of the prevalence of these incidents, Beauchamp could then turn from FSP to other prominent free speech organizations that do work on campus, such as PEN America, or the Foundation for Individual Rights in Education (FIRE).

Let’s just drill into the latter for a moment: FIRE has a database on campus disinvitations containing 137 incidents just since 2014 (which would nearly double the FSP incidents on its own) – and nearly 400 disinvitations overall. Additionally, they have a list of literally hundreds of legal cases they were/are engaged in – primarily involving violations of the First Amendment rights of faculty or students. And of course, they only have the capacity to take up a fraction of the cases brought to them (FIRE’s president, Greg Lukianoff, said they typically get more than 1,000 direct requests for help in any given year).

Editor’s note: Upon review, Beauchamp did mention FIRE’s disinvitation database in his original post — although his use of this data was also problematic. And the broader point about underutilizing FIRE’s resources stands. More details here.

And of course, all of us are only looking at situations that involve terminations, make the news, or end up in court – such that Sachs, HxA, FSP, FIRE or PEN America could pick them up in the first place. We are not able – even collectively – to capture all publicly available incidents. We will never be able to capture other, likely far more prevalent, incidents of suppression of speech or ideas that do not end up in major media outlets, in courtrooms, etc. As a result, the default assumption should be that the problem is likely worse than the available data suggest (maybe not by much… but also, maybe by a lot).

In short: Beauchamp apparently failed to grasp that Dr. Ungar’s Medium post focused on a subset of the total FSP data. He also misread the purpose of the essay, making inferences about the overall prevalence of free speech incidents on campus which the data did not speak to at all – and made sweeping claims on the basis of these data despite Dr. Ungar himself advising against this. Beauchamp did not give due diligence to the phenomenon he was trying to explain (i.e. the overall prevalence of campus incidents nationwide) – indeed, he did not even make full use of the sources he cited in his own essay, let alone do basic research beyond them.

And that’s just the first study mentioned in the piece. Representation of the second was no better:

Who is more likely to be fired?

Beauchamp then transitions to the second study.

Two points:

With regard to not putting “too much weight on the Georgetown findings,” this is 100% correct for the questions the FSP analysis was trying to answer (again, “who is targeted?” and “under what circumstances?”). But we cannot put any weight on Dr. Ungar’s analysis if we are trying to understand how common these incidents are overall. Again, the FSP data simply do not speak to this point yet – and Dr. Ungar made no suggestion that they do.

Now, on to the second study: Sachs’ data do not show that “left-wing professors are more likely to be dismissed for their speech than conservative ones.”

His data do show that most professors fired for political speech seem to be on the left. So one could say, for instance, something like, “a professor fired for political reasons is more likely to be on the left than the right.”

However, this is a very different statement than what Beauchamp did say, which was “left-wing professors are more likely to be dismissed for their speech than conservative ones” (his emphasis).

The difference between these two statements should be very clear, because it is a very important difference. In fact, Sachs’ data show the opposite of what Beauchamp claimed: conservative professors are actually more likely to be fired for political reasons.

There is about a 10:1 ratio of liberals to conservatives in social research fields, which is where most of the professors in Sachs’ data seemed to hail from. But for the sake of charity, we can go with the left-right ratio of the academy overall, which is a little better (roughly 5:1).

And even if we just restrict ourselves to 2017, which was the year most radically skewed against the left in Sachs’ data (i.e. the most convenient year for Beauchamp’s thesis), the ratio of left-to-right firings was 19:6, or roughly 3:1.

Therefore, even being as generous as we can be in terms of the fields we include (all of them) and the years we look at (restricting ourselves to 2017), we see that a conservative professor is, on average, nearly twice as likely to be fired for political speech as a liberal professor – in the very data that Beauchamp cited.
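To make the arithmetic explicit, here is a minimal sketch of that base-rate adjustment, using the roughly 5:1 faculty ratio and the 19:6 firing counts for 2017 cited above (the absolute faculty headcounts below are invented for illustration – only the ratio matters):

```python
# Base-rate adjustment for the 2017 firing counts discussed above.
# Faculty headcounts are illustrative, reflecting only the ~5:1 left-right ratio.
left_faculty, right_faculty = 5_000, 1_000   # assumed 5:1 split
left_fired, right_fired = 19, 6              # 2017 counts from Sachs' data (per the text)

left_rate = left_fired / left_faculty        # firings per left-leaning professor
right_rate = right_fired / right_faculty     # firings per right-leaning professor

print(f"left per-capita rate:  {left_rate:.5f}")   # 0.00380
print(f"right per-capita rate: {right_rate:.5f}")  # 0.00600
print(f"relative risk (right vs. left): {right_rate / left_rate:.2f}")  # ~1.58, i.e. 'nearly twice'
```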

To his credit, this reality is not lost on Sachs (who is, in fact, pretty great). In the very essay Beauchamp links to, Sachs advises:

“…the professoriate leans significantly to the left as well, so we should expect left-leaning speech to make up the bulk of terminations. As with the skewed findings of FIRE’s Disinvitation Database, we are not talking about a population where political ideology is uniformly distributed. It is possible for liberals to constitute the majority of faculty terminations and also for conservatives to be terminated at an equal or higher rate.” (Sachs’ own emphasis)

In short, it is simply not the case that “left-wing professors were more likely to be dismissed for their speech than conservative ones.” Sachs does not argue that. His data do not show that.

Two studies were mentioned in the piece; neither had its conclusions accurately represented (advice for readers: if someone in the media is making bold claims on the basis of some study, read the original study. It’s probably being exaggerated and misrepresented… or else likely suffers from the “piranha problem”). But unfortunately, we’re still not done:

Heterodox Academy is… what?

In the opening paragraph of Beauchamp’s essay, HxA is described as an “online magazine” that is “premised on the idea that political correctness is sweeping the American university, threatening both higher education and the broader right to free speech.”

Literally every single component of this description is wrong. I’m going to dwell on this for a bit, because caricatures like this are a common way that certain people try to discredit our organization and dismiss our cause.

Let’s start with the fact that we’re not an online magazine. We’re an academic consortium of more than 2,000 faculty, grad students and (now) university administrators committed to increasing viewpoint diversity, mutual understanding and constructive disagreement in institutions of higher learning. This description is literally the first thing people are confronted with if they follow the link that Beauchamp himself provided.

So it is curious that he would get this wrong.

What do we do, specifically? We carry out fundamental research, develop tools (Open Mind, Guide to Colleges, Campus Expression Survey) and provide platforms (blog, podcast, conferences) to better understand and address problems facing universities – specifically those resulting from insufficient viewpoint diversity, constructive disagreement, or mutual understanding.

We are not “premised” on fighting political correctness. We came about after a sociologist (Chris Martin), a law professor (Nick Rosenkranz) and a set of psychologists (Jon Haidt, Lee Jussim, Phil Tetlock, Jose Duarte, Jarrett Crawford + sociologist Charlotta Stern) published independent-but-roughly-contemporaneous papers about how ideological homogeneity and insularity were undermining work in their respective fields. After discovering each other’s essays, they started corresponding and decided: rather than tackling this problem independently in psychology, sociology and law — why don’t we pool our resources and efforts to improve the quality and impact of social research more broadly? So began HxA — in the summer prior to the waves of unrest that would sweep through elite U.S. colleges throughout 2015.

In response to what seemed to be growing divides — both within campuses and between academics and the general public — we expanded our mandate to include effective teaching, learning and engagement as well. However, we have always been focused near-exclusively on institutions of higher learning.

In contrast to groups like FIRE, PEN America or the ACLU, we are not at all concerned, as an institution, with the “broader right to free speech.” Nor are we trying to advance free speech or viewpoint diversity for their own sake. As an organization, we view both as instrumentally good (in the service of improving the quality and impact of research and pedagogy). Hence, we’ve repeatedly emphasized that there are appropriate limits to both free speech and viewpoint diversity within the academic context.

Frankly, “political correctness” is not even a big focus of our institution. So far in 2018 we’ve run one essay on the topic: It was a straightforward summary of a peer-reviewed psychology study attempting to empirically measure political correctness and its relative prevalence.

In short: nothing in Beauchamp’s description of HxA was accurate. Giving him the benefit of the doubt, we can assume this misrepresentation was born of ignorance… but even this charitable assumption would suggest a lack of due diligence in investigating and fairly describing an organization that he planned to position as the counterpoint – or in this case, “straw man” – for his essay (Beauchamp’s colleague, Nicole Hemmer, did a much better job of explaining who we are and what we’re about here, for instance).

0 for 2

What makes Beauchamp’s essay particularly disturbing is that it follows hot on the heels of another widely-circulated Vox report on this very issue, which also suffered from serious problems: a March essay by Matt Yglesias arguing that “everything we think about the political correctness debate is wrong.”

Strong claim, right? What was its basis? Basically, Yglesias looked at data from the General Social Survey (GSS) showing that contemporary young people are actually more likely to allow a platform for militarists, communists, homosexuals and anti-theists than any previous cohort.

He interpreted this to mean that contemporary youth are more tolerant of… basically everyone… than they ever have been before. The argument basically runs:

  1. People have said contemporary college kids are unwilling to listen to perspectives they disagree with.
  2. The GSS shows young people today are more likely to allow homosexuals, communists, militarists (people who support military rule over democracy), and anti-theists (people who reject religion) to speak.
  3. Therefore, kids these days are, in fact, more tolerant.

Here’s the problem:

Given that what we are trying to test is whether or not kids are willing to listen to people they disagree with, making the leap from the second premise to the third requires an assumption that contemporary youth disagree with militarists, homosexuals, communists and anti-theists at roughly the same levels as they did in 1972 (such that “increased respondent willingness to let members of x group speak” can be confidently equated with “increased respondent willingness to listen to views they find personally disagreeable”).

But we cannot make this inferential leap because the requisite assumption is false:

Contemporary young people are much more likely to champion LGBTQ causes (for instance, roughly 3 out of 4 Millennials support same-sex marriage) – and as reported by Yglesias’ own publication, they are significantly more likely to identify as LGBTQ themselves — than previous cohorts of youth. Hence contemporary young adults are significantly more likely to simply agree with a homosexual speaker.

Similarly, young people today are more supportive of communism and socialism (hence more likely to *agree* with a communist speaker) than previous cohorts. They are less trusting of democracy and liberalism – and more receptive towards a military coup – than any other generation of living Americans, or indeed any previous cohort of young people on record (hence more likely to *agree* with a militarist). They are less personally religious, and more skeptical towards organized religion, than any other contemporary or previous cohort (hence more likely to *agree* with an anti-theist).

This is an obvious confound:

If we are trying to use “allowing x person to speak” as a longitudinal measure of tolerance, then it is necessary to account for shifting views about the causes these speakers are ostensibly advocating – or else we could just be measuring changing attitudes on the issues, rather than rising or falling levels of tolerance.
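As a toy illustration of this confound (all numbers below are hypothetical and chosen only to show the mechanics – they are not GSS figures), consider two cohorts answering whether a given controversial speaker should be allowed to speak. The raw “allow” rate can rise even while willingness to platform a speaker one disagrees with falls:

```python
# Hypothetical respondents: (agrees_with_speaker, would_allow_speech).
# All numbers are invented purely to illustrate the confound.

def allow_rates(cohort):
    """Return (raw allow rate, allow rate among those who DISAGREE with the speaker)."""
    raw = sum(allow for _, allow in cohort) / len(cohort)
    disagreers = [allow for agrees, allow in cohort if not agrees]
    return raw, sum(disagreers) / len(disagreers)

# Older cohort: few agree with the speaker; half of the disagreers would still allow the speech.
older = [(True, True)] * 10 + [(False, True)] * 45 + [(False, False)] * 45

# Contemporary cohort: most agree with the speaker, but disagreers are now less permissive.
newer = [(True, True)] * 70 + [(False, True)] * 10 + [(False, False)] * 20

print("older cohort: raw=%.2f, among disagreers=%.2f" % allow_rates(older))   # 0.55, 0.50
print("newer cohort: raw=%.2f, among disagreers=%.2f" % allow_rates(newer))   # 0.80, 0.33
```

The headline “allow” rate jumps from 55% to 80%, but tolerance in the relevant sense – willingness to put up with a speaker one disagrees with – actually drops from 50% to 33%.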

And just to be clear, we are just running with the conventional usage of the word here (which is, mercifully, the same as the academic meaning): tolerance is a willingness to “put up with” views or behaviors that one finds annoying, offensive, ridiculous, dangerous or even outright harmful.

In 1972, homosexuals, anti-theists, militarists and communists were groups that fell firmly into the category of those who need to be “tolerated.” Today? Not so much.

Indeed, given the extent to which views on these issues have shifted among young people since 1972 (or for that matter, even since 2002), it is likely that the increased willingness to allow homosexuals, anti-theists, militarists, etc. to speak is entirely a product of young adults growing more sympathetic towards these speakers, rather than more tolerant of them.

Again, this is not a linguistic quibble; it is a substantive point:

Remember, Yglesias argues that contemporary youth are actually more willing to put up with views they disagree with than they have been in the past. In order to actually establish whether this claim is true or not, we have to look at groups that contemporary young people are equally or less sympathetic towards than they have been in previous years. If we observe an increase in willingness to let these people speak, it can legitimately be interpreted as an increase in tolerance.

Let’s give it a shot:

On average, kids these days don’t seem to be into “anti-American Muslims” any more than they were a decade ago (in fact, about a third hold extremely negative views of Islam or Muslims). Are they any more tolerant of “anti-American Muslims” on the GSS? Nope! As Yglesias points out, the same majority would still refuse to grant them a platform.

We can try again:

Contemporary young adults are significantly less likely to endorse “racist” views than any other U.S. age cohort. Well, are they more likely to give the racists a platform? No. They are less willing today than they ever have been to allow it. This is actually far more significant than it may initially seem, because the sphere of what counts as “racist” has also radically expanded – from David Duke in the ’70s to things like “microaggressions” today. In other words, not only are contemporary youth more willing to censor those they deem racist than previous cohorts, but they are likely to brand a much wider range of speech as “racist” (and therefore, worthy of censorship).

And so, for the only two groups in the GSS where we have not seen radical gains in sympathy among contemporary youth (anti-American Muslims and racists) – we see that willingness to allow platforms to these speakers has held steady or declined.

Given these realities, the graphs Yglesias presented don’t actually provide evidence for the argument he’s trying to make. If anything, these data provide evidence that kids today may be growing less tolerant of those they actually disagree with (April Kelly-Woessner demonstrates this, using the exact same GSS variables, in a book chapter summarized here).

The same holds for the other effects Yglesias reports:

The GSS shows that liberals are more “tolerant” than others, right? Not so much when one considers that people on the left are (obviously) more sympathetic to communism than people on the right, more likely to identify as atheists, to identify as LGBTQ or to champion LGBTQ causes, etc. The fact that liberals are more willing to allow homosexuals, anti-theists, communists, etc. to speak indicates nothing about their tolerance towards actual political or ideological opponents.

Ditto should one control for the differences in political orientation, religiosity and the like between more educated v. less educated Americans (indeed, there is evidence that highly-intelligent and educated people tend to be significantly more ideological, and more ideologically intolerant, than others. See here).

Across the board, were one to control for the fact that contemporary youth, liberals, and educated people are far more likely than most Americans to personally sympathize with the “controversial” views measured by the GSS — and even more so now than they have been in the past — virtually all of the effects upon which Yglesias’ argument is based simply evaporate. Indeed, as April Kelly-Woessner and others have shown, “the data” may even end up trending in the other direction.

The fact that his article is premised on such a big oversight is striking given the very bold claim Yglesias is trying to make on the basis of this data (that “Everything We Think About the Political Correctness Debate is Wrong”) – and ironic in light of his conclusion that the “PC debate would benefit from more facts and rigor.”

Indeed, it would.

Conclusion

Beauchamp and Yglesias’ essays don’t provide meaningful evidence (let alone knock-down proof) that there is no free speech crisis at universities. But of course, this does not entail that there actually is such a crisis. In order to even have that debate, there has to be clarity on what is meant by “crisis” to begin with: a crisis in what sense? For whom? In virtue of what? It often seems as though people are talking past one another on this point.

Here is what we can conclude from the essays surveyed here:

Coverage on campus speech by Vox writers seems to regularly suffer from bias. In particular, data are interpreted in such a way as to advance authors’ preferred narratives on the issue (i.e. “it’s largely a Republican hoax”). There is disregard for apparent confounds, and occasionally, glaring errors in the presentation and analysis of the cited data.

If this sounds like some kind of sweeping condemnation of the publication as a whole, I don’t mean it to be. Ezra Klein and his team are bright people who do a lot of good work on a range of issues.

It may be that explicitly partisan and advocacy-oriented sites like Vox (or the Weekly Standard) are more susceptible to these kinds of errors than publications like The Atlantic or the New York Times, which (however imperfectly) at least try to provide a platform for writers, and to cultivate readers, from both sides of the political spectrum. But the difference is probably not huge: on balance, mainstream U.S. media suffer basically the exact same kinds of bubbles and distortions as academic social research fields. And they’re suffering the same kind of credibility crisis as a result of their perceived partisanship (for instance, compare these recent Pew polls on attitudes towards universities v. the media).

Perhaps then, these problems can be addressed in a similar way. Maybe what we need is something like Heterodox Academy for the media:

A coalition of journalists, pundits and analysts – at different media institutions (and different types of media institutions) – who hold different ideological and political commitments, but are united by a shared dedication to 1) engage in good faith with those “on the other side” of issues they care about, 2) adhere to disciplinary best practices even more rigorously for issues they are personally invested in, and 3) follow the facts wherever they lead, whatever the implications for one’s own “tribe.”

Critically: this does not entail abandoning one’s identity or commitments. For instance, a lot of my recent research explores how academics and analysts have tried to describe and understand Trump. Many clearly detest him. However, this should not have stopped so many from recognizing that he had a good chance of winning in 2016, nor should it prevent them from spotting and criticizing bad research on his supporters today. Indeed, precisely those who don’t want Trump to win again should be most ruthless on this issue – constructing the best argument possible for the conclusion they least prefer.

Maybe someone at Vox could try something like this for their next essay on campus free speech. I bet it’d be a really great piece.
