Giant step forward for content moderation

…and therefore for us (or all of us who use social media)

I’m certainly not the first to quote her, but yesterday Evelyn Douek at Harvard Law School tweeted, “Americans want platforms to be places of open expression, but also misinfo removed, but also tech companies have too much power & can’t be trusted to make content decisions, but also the gvt would be worse.”

Exactly. Making social media a safe place for everybody while protecting everybody’s freedom of expression, including that of people whose “free speech” spells harm to others, is a little like “A Note from Your University about Its Plans for Next Semester”…

“After careful deliberation, we are pleased to report we can finally announce that we plan to re-open campus this fall. But with limitations. Unless we do not. Depending on guidance, which we have not yet received…. In this time, more than ever, it is time for strong, decisive action…. It is our decision to delay our decision so we can decide on our decision at a later decided time.”

Whether we’re talking about Covid-19 or the Internet, making the right decisions decisively and right now, decisions that serve everybody’s wellbeing simultaneously, is, so far, that “riddle wrapped in a mystery [and so on]” of which Winston Churchill spoke.

Which is a long way of saying social media content moderation is just plain hard. It has also long been a puzzle, for the open-expression-vs.-misinformation reasons spelled out by Douek as well as the fact that, too often, one person’s “free speech” is another person’s harm (often that of the person at whom the “free speech” is directed, of course). Not to mention the challenge of simply sifting outright hate speech, trolling, harassment, etc., out from civil or neutral speech, something algorithms still struggle with.

Enter the Trust & Safety Professional Association, launched today! TSPA is a membership organization designed to support the content moderation community, the people all over the world who do that hard work of setting and enforcing content moderation policy for the social media platforms.

This is a great development for the field and everybody. TSPA will provide support and training for Trust & Safety people and—through its new research arm, the Trust & Safety Foundation Project—educate the public about the work of Trust & Safety, including content moderation by employees and outsourcing firms around the world. I mean, social media has now been around for at least a decade and a half, with content moderators laboring way too much in the background. So this is both major progress and way overdue.

A global first at a crucial time

Based in San Francisco, TSPA was conceived in Washington, D.C., where it was developed largely by the Internet Education Foundation (IEF). It’s an outcome of CoMo at Scale, a series of content moderation conferences in the US and Europe that, for the first time in social media history, gathered Trust & Safety leaders from multiple social media services so they could—with each other, user advocates, policymakers and the news media—talk about their platforms’ rules and practices. This was unprecedented. I went to the May 2018 one in Washington, D.C., organized by the IEF, and was struck both by the level of participation and by how forthcoming the platform executives were.

The association launches at quite a moment — a point when policymakers (even heads of state) in this and other countries are questioning whether the platforms should be the ones defining and protecting freedom of expression online; when the U.S. law that allows them to moderate content is being challenged more than ever; during a crucial election year in many platforms’ home country (with Facebook just now allowing U.S. users to opt out of political ads); and when the public discussion about how to protect content moderators themselves from hateful, violent and criminal content is finally taking off.

Last week I wrote about this pivotal moment and, here on Medium, proposed five things that need to happen for social media to serve us better. I left one out: I didn’t know this fantastic development — TSPA and the TSF Project — was happening until the IEF sent me a preview yesterday. So this is really the sequel to my last post.

Filling in the user care picture

This launch is a 6th essential step, though not in any linear sense. It’s part of Nos. 1 and 2: cross-platform collaboration at a new level (which TSPA embodies) and together solving the most fundamental riddle of social media content moderation. It’s also essential in its own right, a logical complement to Nos. 3 and 4: building out a global association of Internet helplines and making the Oversight Board for content moderation appeals cross-platform. The latter was started by Facebook but is now an independent entity currently focused only on Facebook content (“only” should be in quotes, given Facebook’s 2+ billion users, but this too, like TSPA, needs to be cross-industry). No. 5 proposes internal “Pro-Social Media Teams” that liaise with their counterparts at platforms across the industry and internationally; the work of such teams would logically include liaising with TSPA.

In fact, with the Trust & Safety Professional Association’s launch, a more complete picture of social media user care is truly beginning to take shape, with independent bodies for supporting sound content moderation on the platforms (and educating the public about it), for appealing content moderation decisions the platforms have made and for providing help to users confronted with harmful content.

Together, those three services—consciously and conscientiously supported by the platforms—have tremendous potential for solving the puzzle of global social media user care and getting us to the point where our media serve us and our societies.

