Improving Civil Discourse Online
Jon Garfunkel
Sep 2024
Recently, the Berkman Klein Center’s Applied Social Media Lab (ASML) held a webinar entitled “Beyond Discourse Dumpster Fire” about the problems perceived in online conversations. The premise was spurred by a research paper by Bursztyn et al. (“When Product Markets Become Collective Traps: The Case of Social Media”), which asserted that “large shares of consumers are trapped in an inefficient equilibrium and would prefer the product not to exist” -- and perhaps also by anecdotal observations about the toxicity of the online world.
The Bursztyn paper looked at college students assessing Instagram and TikTok; it did not look at more text-based deliberative communities, such as Facebook groups or Slack (the word “discourse” does not appear in the paper). The webinar also considered a separate research paper, “Durably reducing conspiracy beliefs through dialogues with AI.” Good news: it worked.
The solution from ASML, however, was in a completely different direction. The future, to them, is...
Face-to-face conversations!
The lab acquired the technology behind Living Room Conversations and launched a website called frankly.org.
Ditching the virtual realm for the physical is nothing new. In local elections, I’ve observed candidates forgoing online engagement altogether, declaring it “too nasty” or offering some other dismissal. And many people already devote much of their civic energy to doing just that: board meetings, religious service attendance, PTA volunteering, Rotary club lunches, poker nights with friends. People go online to reach out casually, without a commitment.
Certainly, many people still get enough value out of social media to keep using it. The numbers are astonishing -- 37% of the entire world is on Facebook, and 70% of the US. How many are active in groups? A Pew Research survey -- from the simpler times of 2011 -- sought to assess the role of digital technologies in civic groups: “Some 65% of those who are social network site users say they read updates and messages on these sites about the groups in which they are active and 30% say they have posted news about their groups on their SNS page.” I would expect those numbers to have held steady or risen since.
So we need to look at the behavior of adults in real-world communities that use Facebook groups. These are the questions we might ask:
How much value are people getting out of the group?
What specific sort of toxic behaviors exist, and how are they dealt with currently?
What could be improved to get more value out of the group?
I’ve proposed these sorts of questions to digital democracy researchers over the years, including in the MetaGov community. It seems ripe for research.
I don’t have the means to do that research myself. But, in my years of being involved in discussion groups and social media, and administering a few along the way, I have some suggestions that at least ought to be considered:
As an ordinary user, I would like to mute individuals in a group, and I would like to publish my mute list so others can follow it. (Facebook appears to offer only “block,” which prevents another user from seeing your content and you from seeing theirs; Twitter long ago introduced a “mute” feature for simple one-way filtering.) A rough sketch of how a shareable mute list might work appears after this list.
Distinguish vibrant conversations held in good faith from those that aren’t -- often called attention-seeking, outrage-baiting, engagement farming, etc. On Facebook, the more comments a conversation attracts, the higher it is ranked on the group page sorted by “Newest activity.” Group administrators & leaders have no ability to “downgrade” such posts. (The easiest remedy might be for admins to respond and close the discussion with an explanation of why they are doing so -- though many admins prefer a “light touch” and don’t want to shut down discussion.) A sketch of an admin-adjustable ranking also follows this list.
Above all, find ways to protect people -- public officials or dissenters -- from being “ganged up on.” In certain heated exchanges between two people, perhaps prevent anyone other than those two from joining in and “piling on.”
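To make the first suggestion concrete, here is a minimal sketch -- in Python, with hypothetical names, not any platform’s actual API -- of a mute list that a user could publish and that others could subscribe to:

```python
from dataclasses import dataclass, field

@dataclass
class MuteList:
    """A user's personal mute list, publishable so others can follow it."""
    owner: str
    muted: set[str] = field(default_factory=set)
    subscriptions: list["MuteList"] = field(default_factory=list)

    def mute(self, user_id: str) -> None:
        self.muted.add(user_id)

    def follow(self, other: "MuteList") -> None:
        """Subscribe to someone else's published mute list."""
        self.subscriptions.append(other)

    def is_muted(self, user_id: str) -> bool:
        """Hidden if the author is muted by me or by any list I follow."""
        return user_id in self.muted or any(
            user_id in sub.muted for sub in self.subscriptions
        )

def visible_posts(posts: list[dict], mutes: MuteList) -> list[dict]:
    """Filter a group feed down to posts whose authors aren't muted."""
    return [p for p in posts if not mutes.is_muted(p["author"])]
```

Unlike a block, this is strictly one-way: the muted user’s experience is unchanged, and only the follower’s own feed gets filtered.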
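And for the second suggestion, a similarly hypothetical sketch of ranking group posts by recent activity while letting an admin apply a downweight to outrage-bait, rather than having to delete it outright:

```python
from datetime import datetime, timezone

def rank_posts(posts: list[dict], admin_downweights: dict[str, float]) -> list[dict]:
    """Order a group feed by recent activity, scaled by any admin downweight.

    Each post dict is assumed to have: "id", "comment_count", and
    "last_activity" (a timezone-aware datetime). admin_downweights maps a
    post id to a factor in (0, 1]; 1.0 means no adjustment.
    """
    now = datetime.now(timezone.utc)

    def score(post: dict) -> float:
        hours_idle = (now - post["last_activity"]).total_seconds() / 3600
        activity = post["comment_count"] / (1 + hours_idle)  # busier & fresher ranks higher
        return activity * admin_downweights.get(post["id"], 1.0)

    return sorted(posts, key=score, reverse=True)
```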
The common refrain from many would-be social network reformers has been this:
They’re too big! It’s too hard to change them! They profit from engagement whether it’s good or bad! We can create our own platform instead -- and fight for identity portability so that we can easily recreate our communities in new spaces!
But this misses a number of things. First off, identity inertia isn’t as much of a burden as it once was. In local groups, you can easily invite people to join a community on another platform like Slack.
Additionally, there’s a perception that ordinary users are “powerless” against the tech giants. That’s a bit of a dodge, for a couple of reasons. First off, politicians are eager to use Facebook; and Facebook has been under pressure (from both the left and the right) over its perceived imbalances on censorship, perceived harm to minors, and other injustices. So there is some leverage the states could apply toward making Facebook (and others) a more enjoyable & safer experience across the board.
Additionally, academic research centers and other policy institutes are exactly the sort of organizations able to align advocacy. Consider NYU’s Jonathan Haidt, who has for years built up a body of research on the harm that social media and cell phones do to adolescents. He co-founded one movement (Let Grow) and has inspired others (OK to Delay) through a sense of mission and purpose. They are, of course, targeting parents & school districts & governments -- but this will indirectly influence big tech. If we see toxic online discourse as a serious problem worth solving, we should be working with research institutes to press for ways to fix it.