Polarization and social media: how it works and what we can fix
Dr. Lisa Schirch lays out how algorithmic extremism accelerates polarization, and how reworking platform design could flip social media to build cohesion instead
Below are lightly edited extracts from a February 2024 episode of the ‘Society Builders’ podcast, hosted by Dr. Duane Varan, interviewing Dr. Lisa Schirch.
Social media is a lot like a coliseum, with gladiators fighting in the center. Most people on social media are just watching. I think it's often just 1% of people who are behaving really badly, arguing with each other in a very dehumanizing way, but it's contagious.
So while most people might not be arguing online, the fighting affects them and how they see the world. The platforms are designed as gladiator arenas where people come to fight, and the whole design helps everyone else watch that fighting.
We can design social media in a different way. It doesn't need to amplify the fighters by putting them on a stage in the middle of all of us.
Companies make more money the longer each of us stays on their platform, because the longer we're there, the more ads they can show us. They are financially rewarded for keeping us there. And what keeps us there are the fights, the arguments, and emotional content. False, deceptive, and hateful content keeps people there longer. This is the neuroscience part of it; it's often referred to as the attention economy.
Politicians in Europe actually figured this out: if they just posted a regular campaign ad, they would not get very much engagement, but if they used inflammatory, emotionally charged language, then the algorithm on Facebook would show it to lots of people. And so politicians told the company: you're making us more polarized, because you've incentivized us to be outrageous in our political ads.
We call it algorithmic extremism: algorithms that reward extremist content. It's turning up the heat in all of our conversations.
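To make that mechanism concrete, here is a minimal sketch of engagement-based ranking in Python. Nothing here is any platform's actual code; the post fields and weights are invented for illustration. The structural point is that when the ranking objective is predicted engagement, and outrage reliably predicts engagement, inflammatory content rises to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_comments: float  # model's guess at replies and arguments provoked
    predicted_shares: float    # model's guess at reshares
    outrage_score: float       # 0..1, how inflammatory the language is

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments (fights) count most, and outrage is
    # rewarded because it predicts more time spent on the platform.
    return (3.0 * post.predicted_comments
            + 2.0 * post.predicted_shares
            + 5.0 * post.outrage_score)

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed is just posts sorted by expected engagement, highest first.
    # Nothing in the objective asks whether content is true or good for us.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Here is my policy proposal.", 2.0, 1.0, 0.1),
    Post("THEY are destroying everything!", 40.0, 25.0, 0.9),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```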
We cannot solve problems when we're completely emotionally engaged. How do we move people to the frontal cortex, the front of the brain, where they can solve problems, think rationally, and make sense of complex information?
This is degrading the IQ of humanity, and it's making it harder for us all to solve our other problems: migration, the climate crisis, pollution, water shortages, all the many things facing societies all over the world.
When I am mediating between two people who disagree, the very first step is asking each person to share the experiences that led them to the current conflict, so that each tells their own story.
We have sort of skipped this step on social media. Some of us have had really important experiences that led us to shift our beliefs.
Social media platforms should be reminding us not only of the rules, the edges of the road, but of the middle of the road, the norms. What do we aim for? What do we aspire to here?
Those norms could be to learn from each other, to share experiences, to understand different points of view. If we had pop-ups reminding us of these things, I think it would be better. Like: can you share a little bit more about your story?
I use that line all the time on social media, where nobody has offered their story but I'm inviting them to. I'm asking questions; I'm showing curiosity about why someone else believes something different than I do. And asking questions can really take down the tone of a hostile disagreement.
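One way a platform could build in that kind of invitation is a pre-post nudge. Below is a hypothetical Python sketch: the word list, threshold, and function names are all invented, and a real platform would use a trained toxicity classifier rather than keyword matching, but it shows the shape of the affordance: when a draft looks hostile, prompt curiosity before publishing.

```python
# Hypothetical pre-post nudge; the marker list, threshold, and names are
# invented for illustration. A real platform would use a trained
# toxicity model, not a word list.
HOSTILE_MARKERS = {"idiot", "liar", "stupid", "traitor"}
NUDGE = "Can you share a little bit more about your story?"

def hostility_score(draft: str) -> float:
    # Stub classifier: the fraction of words that are hostile markers.
    words = [w.strip(".,!?").lower() for w in draft.split()]
    if not words:
        return 0.0
    return sum(w in HOSTILE_MARKERS for w in words) / len(words)

def pre_post_check(draft: str, threshold: float = 0.15) -> str | None:
    # Return a nudge to show before publishing, or None to post as-is.
    return NUDGE if hostility_score(draft) >= threshold else None

print(pre_post_check("You are a liar and an idiot."))        # shows the nudge
print(pre_post_check("I see it differently; here is why."))  # None
```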
We need to be involved, and we really need to engage with the tech companies to say: we want something different. We want a different product, one that serves humanity and serves our societies.
And we need to let our governments know they need to be involved, too. There should be tax credits for companies that produce social cohesion and help societies hold together, and taxes on companies that are divisive, so that the whole incentive structure rewards new tech tools that help society make decisions together.
Some platforms, like Remesh and Polis, already help people make decisions together. They highlight where there is common ground. They help people see each other's point of view and really listen to each other.
At scale, we could build the kinds of design affordances found in Polis and Remesh into all of our social media, so that every conversation brings out the best of humanity and we're learning from each other.
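As one illustration of such an affordance, here is a simplified Python sketch of bridging-based ranking in the spirit of Polis's group-informed consensus: a statement scores well only if every opinion group tends to agree with it. The groups and votes here are invented for the example; Polis itself derives opinion groups by clustering participants' actual votes.

```python
# Simplified bridging-based ranking. votes_by_group maps an opinion
# group to its votes on one statement: +1 = agree, -1 = disagree.
def bridging_score(votes_by_group: dict[str, list[int]]) -> float:
    # Score a statement by its lowest agreement rate across groups,
    # so only statements that every group supports rank highly.
    rates = []
    for votes in votes_by_group.values():
        agrees = sum(1 for v in votes if v > 0)
        rates.append(agrees / len(votes))
    return min(rates)

statements = {
    "We should fix the potholes on Main Street.":
        {"group_a": [1, 1, 1, -1], "group_b": [1, 1, -1, 1]},
    "The other side is acting in bad faith.":
        {"group_a": [1, 1, 1, 1], "group_b": [-1, -1, -1, -1]},
}

ranked = sorted(statements, key=lambda s: bridging_score(statements[s]),
                reverse=True)
print(ranked[0])  # the common-ground statement surfaces first
```

Contrast this with the engagement sketch earlier: the divisive statement that one group loves and the other hates would dominate an engagement-ranked feed, but here it scores zero.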
We don't have to have perfect harmony, but we do have to be able to appreciate the humanity of the other. That's what I'm working for: designing social media that enables us to humanize each other, to keep enjoying all the good things about social media, while adding a benefit to society, a benefit to holding us together.
Dr. Lisa Schirch is a professor at the Keough School of Global Affairs at the University of Notre Dame, and a Senior Research Fellow at the Toda Peace Institute, where she has led the technology and peacebuilding program since 2015. She is a Co-Chair of the Council on Tech and Social Cohesion. The author of 11 books, she currently researches the role of technology in strengthening social cohesion, including advocating for pro-social design governance. Her most recent book is ‘Social Media Impacts on Conflict and Democracy: The Tech-tonic Shift.’