Liberal and conservative thinkers seem to agree: online “echo chambers” are driving Americans further apart.
That now-ubiquitous term was first popularized by Cass Sunstein in his 2001 book Republic.com, which argued that by allowing people to personalize their information diets, the internet was fostering an extreme form of fragmentation that was harmful to democracy. Without a common set of shared experiences or consistent exposure to diverse viewpoints, reasoned discourse would become almost impossible. Eventually, he posited, our consumption of “the Daily Me” would lead to polarization and extremism as we wallowed in information feedback loops of our own making, demonizing opponents and reinforcing our own beliefs.
Sunstein and a like-minded circle of writers and intellectuals continued to refine the argument at a time when the term “cyber-balkanization” could strike a chord (and an AOL account was a primary means of getting online). Soon, others began looking around at the political landscape and made a more specific claim. In a post-Gingrich era of Congressional polarization, right-wing talk radio and, especially since 2000, the rise of Fox News, “echo chambers” came to refer to a particular form of conservative media bubble abetted by the rise of new technologies. Liberals frequently criticized this bubble, arguing that it was making the kinds of compromises that are necessary in our political system difficult to sustain. Illustrating how dire many felt the situation to be, critics soon began talking about “epistemic closure” and the way in which the conservative media bubble had fostered a self-enclosed belief system – up to and including beliefs about factual matters.
Even more recently, attention has shifted to the role of social media in creating a new orthodoxy of “political correctness.” For some commentators, the culprits in this case are on the left: young activists harnessing the power of the retweet on behalf of an emerging brand of social justice. For others, it’s the hidden algorithms that match content to our preferences and identities, perhaps without our knowledge.
Anyone who has followed the prevailing trends in political commentary over the past few years has encountered some form of these arguments. But what if these concerns are exaggerated? What if, on average, most people have relatively balanced encounters with political content? And what if social media, far from encouraging extremism and groupthink, can actually lead to moderation?
In a relatively short period of time, a new wave of scholarship on these questions has upended the conventional wisdom on echo chambers and ideological bubbles. What this research shows is that, first and foremost, most people are not political junkies: The people who constantly refresh the Drudge Report and tweet out favorable poll results are not like you and me. And while social media certainly has the potential to let people construct a homogenous information cocoon for themselves, more often than not it does the opposite by exposing people to unexpected viewpoints. The ways that researchers came to these conclusions show how social science is increasingly turning to large datasets and clever research designs to unearth how media consumption habits play out in the real world.
The first clue that the “social” in social media might not operate the way its critics assume came in a 2012 study by Solomon Messing and Sean Westwood. Their insight was that people don’t merely look to partisan cues – say, an “MSNBC” or “Fox News” logo – to determine whether a news article is worth reading. It also matters who else is reading it. The researchers created a mock version of Facebook, complete with real news articles, and presented it to subjects in an online interface. The articles were shown either alongside partisan cues or an indicator of the number of people who “recommend” the article – much as you’d see with the number of “likes” or shares of an article on social media. Other research subjects saw articles with both types of cues. The result? People’s choices of what to read were a function of social rather than partisan considerations. What’s popular swamps what you think you’ll agree with. And since people’s online social networks are often more ideologically diverse than they realize, the result of these dynamics is that both liberals and conservatives are exposed to more challenging content on social media than is often implied by surveys of media preferences.
Something similar seems to be happening on Twitter. Pablo Barberá, now an assistant professor at the University of Southern California, collected millions of tweets and used people’s follow patterns to estimate their ideological positions in relation to those of high-profile political figures in their social networks. (If I follow Barack Obama and Hillary Clinton, I’m probably to the left of someone who follows Ted Cruz and Ann Coulter.) This is a powerful method: More than just another social media metric, it can predict important offline behavior, like party registration. Barberá found that most people’s networks actually tend to become more moderate over time, meaning that they are increasingly exposed to more diverse information. Only for people with the most homogenous networks to begin with – a small fraction of the population – did he find evidence of increasing polarization.
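The intuition behind follow-based scaling can be sketched in a few lines. This is a deliberately simplified stand-in, not Barberá’s actual Bayesian ideal-point model: it just averages hand-assigned scores for the political elites a user follows, and every account name and score below is invented for illustration.

```python
# Toy sketch of follow-based ideology estimation. NOT Barberá's actual
# statistical model; the accounts and scores are illustrative only.

# Hypothetical ideal points for well-known accounts
# (negative = left, positive = right; values are made up).
ELITE_SCORES = {
    "BarackObama": -1.0,
    "HillaryClinton": -0.8,
    "tedcruz": 1.0,
    "AnnCoulter": 1.2,
}

def estimate_ideology(follows):
    """Average the known scores of followed elites; None if no signal."""
    scores = [ELITE_SCORES[f] for f in follows if f in ELITE_SCORES]
    if not scores:
        return None
    return sum(scores) / len(scores)

left_user = estimate_ideology(["BarackObama", "HillaryClinton"])
right_user = estimate_ideology(["tedcruz", "AnnCoulter"])
```

In this toy version, a user following only left-leaning accounts lands on the negative side of the scale and vice versa; the real model instead infers both users’ and elites’ positions jointly from the full follow graph.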
What seems to be going on is that Twitter and other social networks foster weak ties – friends of friends, or followers of followers. The ideas and arguments shared or retweeted by this wider network tend to be more diverse than the ones in our immediate circle. And if we see something that makes us think twice, we’re more likely to actively choose to see similar content in the future – for instance, by following the source directly. Tweet by tweet, our networks of strong ties become more diverse.
Yet as compelling as the evidence in such studies (and others) is, the echo-chamber narrative is irresistible. An interactive feature from the Wall Street Journal illustrates two hypothetical Facebook “news feeds” using the content from actual partisan sources. As expected, the left-wing feed about Hillary Clinton is full of pieces trumpeting her latest gains in swing-state polls, while the right-wing feed is full of leaked audio in which Clinton purportedly insults supporters of Bernie Sanders. Missing from both? According to the methodology, “content … shared by Facebook users more broadly across the political spectrum,” including … The Wall Street Journal.
This is a problem because most people aren’t consistent partisans, and even the strongest partisans don’t have uniform preferences for political information. My own research illustrates this point. I was able to combine data tracking people’s web visits with survey responses about their political preferences. It turns out that the vast majority of participants had relatively centrist media diets: They tended to visit CNN.com or MSN.com, which are at the center of the political spectrum, much more than left-of-center sources such as The Huffington Post – regardless of whether they are liberal, moderate, or conservative themselves. One reason for this pattern is that people often rely on bookmarks, “most visited” lists, and other shortcuts to learn about the world around them. Another reason is that there are hyper-partisans out there who only read about politics from their preferred perspective, constantly seeking out reinforcement. There just aren’t that many of them, but they drive a hugely disproportionate amount of traffic to partisan news sites. Sure enough, if I zoom out to look at domain-level statistics, I find a huge spike in the number of visits to conservative websites like Breitbart and The Blaze – the “echo chamber” we’ve all read about. But those visits are driven by just a handful of people in my data.
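The way a handful of heavy users can dominate domain-level traffic is easy to see with a toy calculation. The visit counts below are invented, not the study’s data; they simply show how a skewed distribution lets the top few users account for nearly all partisan-site visits.

```python
# Illustrative only (invented numbers, not data from the study):
# partisan-site visits per user in a hypothetical panel.
visits = [120, 95, 60, 3, 2, 2, 1, 1, 1, 1]

total = sum(visits)
# Share of all partisan-site traffic driven by the three heaviest users.
top_three_share = sum(sorted(visits, reverse=True)[:3]) / total
```

Here three of ten users generate over 90 percent of the visits, so a domain-level spike can coexist with a mostly centrist audience.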
A central irony of concerns about echo chambers is that they span the political spectrum. The Wall Street Journal interactive was widely shared on Twitter and Facebook, crossing my feed via the furrowed brows of both liberals and conservatives in my network. To be fair, they’re not wrong that there is something to worry about: Polarization is real, and it is true that dedicated partisans can have outsize influence (especially on social media) regardless of their number. But social media may be part of the solution rather than the source of the problem.