Don’t Like Online Outrage? Look Inward

While social media platforms can employ algorithms and other tools to help improve the level of public debate, the best way to decrease outrage and polarization is for everyone involved to be responsible for their own online behavior, three Duke experts said Wednesday.

Speaking to journalists during a digital media briefing, the three scholars discussed civility, the powers and limits of big platforms like Facebook and Twitter, and the many misperceptions people have about those on the ‘other side’ of the political divide.

Watch the briefing on YouTube.

Here are excerpts:



On civility

“Civil disagreement is a human activity like any other. It requires practice to do it well. Simply, we’ve fallen out of practice. Sociologists will tell us we’ve sorted ourselves across the country into our own political groups. That means we have fewer opportunities to disagree civilly, and when we do … we are increasingly deficient in the intellectual virtues to engage in productive, respectful discourse. For instance, open-mindedness, intellectual charity, intellectual humility.”


On how to get back to where people disagree civilly

“It shouldn’t happen online. To disagree better, we need to practice in person, in the flesh and blood. We need to do it over a period of time, and we need to get to know these other people. We need to know their biographies. Where they come from. Their families. Activities outside politics. Build that relationship. Then when you come to discuss politics – and there’s disagreement, as there often is – it will be set against this greater context.”

“It needs to happen in person. And that’s hard to do. And social media obviously makes it impossible.”


On responsible online behavior

“Inevitably, it will come down to decisions of individual will. When (someone) is urging us to join in the scapegoating of another person, to be part of the mob, that’s a choice we have to make. It’s a moral choice. We can either practice intellectual charity and humility, or not. That aspect is unavoidable.”


On what to do when someone is unwilling to accept fundamental truths

“Open-mindedness is a virtue up to a point. There are certain things about which we won’t say there are two sides, or both sides. In my class, we’re not going to give anyone a seat at the table who would dispute the fundamental equality of all humankind.”

“We can be firm. Open-mindedness requires us to listen to the facts. I try to put things in moral terms. Rather than shame the other side for giving in to the dissemination of false information, I ask them to put truth above victory. Value the truth more than seeing your side proven right, or saving face.”



On how getting out of echo chambers doesn’t actually moderate views

“A lot of the most popular ideas we have about how to solve political polarization on social media, there’s very little evidence to support them. The idea of stepping outside your echo chamber – most of us, myself included, thought taking people outside their echo chamber, exposing people to those on the other side, would increase moderation. When we studied this … we actually discovered that instead of becoming more moderate, people tended to become more extreme.”

“Many of us would like social media to be the kind of idealized public sphere where we have a competition of ideas and the best ones rise to the top. This is not what’s happening. We’re seeing increases in incivility, we’re seeing extremity and really unruly behavior. So the important question to ask is, ‘Why do we use social media?’”

“In an era of increasing social isolation underscored by things like the COVID pandemic, we’re increasingly realizing that social media is the primary tool to understand our place in society. When we think of social media as really a tool to create our identities, it explains how extremism can result from, say, social outcasts who are status seeking.”

“It also helps us explain why moderates are so invisible online. People have very little motivation to share their views if they’re only going to be attacked by extremists who dominate the platforms.”


On changing social media behavior

“I would love for all of us to spend more time together offline. But we need to recognize social media is here to stay.”

“The important question to my mind is, ‘How can we incentivize better behavior on social media?’ We spend so much time talking about the negatives – misinformation, algorithms and radicalization, echo chambers – these things are very hard to counteract. I think we have the most leverage by trying to generate a bottom-up movement, by trying to change user behavior.” 


On the role of social media platforms in weeding out falsehoods

“We need to ask, what incentivizes people to share fake news? We need to take the user’s perspective. What we find is right now, the way most algorithms are designed, they reward engagement. They reward people who share something that gets a lot of likes and a lot of comments. So it’s unsurprising that divisive content gets up-ranked in this way. So some of the extremists we’ve studied are out to get new followers by spreading misinformation.”

“So one thing I think social media companies can do is create better incentives for civil behavior.”


On people who don’t care about truth

“The first thing we need to do is to recognize some assumptions may not be true. There’s a widespread phenomenon called false polarization. We tend to exaggerate the extremity of the other side and underestimate the extremity of our own side. We’ve seen this over time in the U.S. and in other countries.”

“The scale of the problem might not be as big as we think it is.”

“We need to be careful not to make misinformation into a self-fulfilling prophecy.”

“There are moderates. You just have to find them.”


On convincing social media platforms and users to do better

“We were polarized long before social media came along. Probably a lot of us would agree social media seems to be heightening polarization to some degree, at least in some ways, but we were a very polarized nation before. Even if some of the actions that social media companies are often lobbied to take … we wouldn’t see a sea change in behavior right away.”

“There is a role for the platforms. But we need to be more introspective. We all have a role in producing this outcome. Each time we log on … we make a choice about what to engage with. … Why am I doing this? Why am I sharing this? Do I really want everyone to see my strong feelings on this issue?”

“A recent study indicates about 74 percent of tweets about politics are made by about 7 percent of users with disproportionately extreme views. What that means is we need more moderates to engage. Moderates who have nuanced positions on issues of our day.”


On popular political misperceptions

“Republicans have much more favorable attitudes towards immigrants as a group than many Democrats would realize. Democrats have much more favorable attitudes towards the police and rural people than most Republicans would realize.”

“Another really interesting study looked at misperceptions. It turns out the average Republican thinks the average Democrat is young, a minority and lives in a city. The average Democrat thinks the average Republican makes more than $200,000, is an evangelical Christian and lives in a rural area.”

“The irony of this kind of assessment is the average American – the average Democrat and the average Republican – is white, middle-class and lives in a rural or suburban area.”

“There are profound gaps. When we correct those gaps … we see appreciable gains in inter-group attitudes. So there is hope.”

“I don’t want to paint too rosy a picture. America is still deeply divided. But there is room for overlap.”



On using tech tools to weed out misinformation

“For dealing with the issues on the platforms, it’s really difficult to automatically remove bad content. A lot of that has to be done manually. The platforms currently go through this manual fact-checking process. So I don’t think it’s going to be an easy solution by any stretch to try and remove content that’s clearly false, in an algorithmic way. That’s a very challenging technical problem that I don’t think will be solved anytime soon.”

“There are other solutions to address polarization. On that front, the things we can do intentionally include more diverse content so people are looking at a wider variety of news articles for example. The ways the platforms are engineered can intentionally try to avoid this homogenization issue, avoid this echo chamber issue. But it is challenging.”

“It’s a big challenge and an open area of research.”


On lessening personalized media consumption

“On a personal level there are two main things you can do. You can go in and look at your settings. A lot of these platforms like Facebook and Twitter … you can go and look at your profile. You can see the tags you’ve been assigned and un-assign tags that you think aren’t relevant or are potentially polarizing. You are in control of the people you are following. You can go and look and see if you’re following people with a variety of opinions.”

“And if you’re seeking out news … you can choose what platforms you’re looking at to find news. So just considering a wide range of options and intentionally reading people you disagree with is a really useful exercise.”

“You have the technology side but also the personal side. And as a society there are things the platforms can do to make those tools more transparent, more available. Facebook can send out emails reminding people these personalization settings exist.”

“There’s not necessarily an incentive to do so because the revenue they get is often tied to personalized ads. So it’s a challenge.”


On using technology to lessen outrage

“One of the big challenges with trying to have a technological solution … is you have to be able to measure it. With social media, it’s clicks, likes, read time, duration on the website. Those things are very coarse. They’re not going to capture a lot of the nuance.”

“The real solutions lie at the personal level. We as a society need to figure out how we can personally combat polarization.”

“The problem is fundamentally social. We can get help, but at the end of the day, you’re the one clicking on news articles, you’re the one watching the news.”

“Ultimately, people change their own minds. We can help by providing a wide variety of content.”

“We can look at technological solutions to figure out what that menu is, what the variety of content is, but at the end of the day people are making their own choices. All we can do is help and encourage. Ultimately, it’s a personal choice.”


Meet the experts:

Christopher Bail
Christopher Bail is a professor of sociology, public policy and data science at Duke University, where he directs the Polarization Lab. He is author of the new book, “Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing.”

Allison Chaney
Allison Chaney is an assistant professor of marketing at Duke’s Fuqua School of Business. Chaney studies the development of machine learning methods and the impacts of these methods on individuals and society.

John Rose
John Rose is an instructor in the Kenan Institute for Ethics at Duke University, where he teaches classes on topics including political polarization, conservatism and happiness, and researches the tradition of virtue ethics.

Duke experts on a variety of other topics related to the coronavirus pandemic can be found here.

Follow Duke News on Twitter: @DukeNews