The Big Tech ‘fix’: Not either-or but a bit of both +

Anne Collier
May 7, 2019 · 10 min read

It wasn’t because of the headline — “Zuckerberg and Warren want to fix Big Tech in different ways — and neither of them will work” — that I was excited to read this article by researchers in Science, Technology & Society at Harvard. The headline itself sounded too much like more of the same reflexive punditry we’ve been hearing since the US’s Big Data wakeup call of 2018.

I was excited to see its authors saying that the solution to today’s privacy, safety and ethics concerns around “Big Tech” is not either-or. It’s neither the tech fix of the Mark Zuckerberg school of thought nor the antitrust fix of Sen. Elizabeth Warren’s. [The authors — Maciej Kuziemski, Nina Frahm and Kasper Schioelin — are also fair: They acknowledge that Facebook CEO Zuckerberg has shown openness to regulatory support in fixing tech’s problems and that Senator Warren is campaigning for her party’s presidential nomination.]

‘Tech solutionism’

But the tech determinism they boil down to Zuckerberg’s approach is, unfortunately, what many of us call for too, when we believe the platforms alone created this “monster” and so must fix it alone. The problem is that they can’t all by themselves — whether by writing killer algorithms, hiring even millions of human moderators or redesigning products — though all those things are good. Like “Zuckerberg,” we are leaving ourselves out of the equation too much. Besides, with tech constantly changing, both blaming tech and believing in tech-only fixes “leaves us in a constant and somewhat perplexed mode of risk control,” the authors write. I agree. I’ve been watching that happen for 20+ years in youth Internet safety, and I don’t feel it serves our children.

Just touching on regulation

Then there’s the Warren formula: the “antitrust fix.” Though some degree of regulation is needed, the antitrust “fix” isn’t a pure solution either. For one thing, it would probably create as many problems as it solved. It would break up US-based global companies and possibly increase competition to a degree, but what would that leave us with? Does competition have greater value than collaboration in these times? Would smaller US-based Internet companies automatically be more ethical and less proprietary with our data? And would they be able to compete with China-based global companies such as the $75 billion startup ByteDance, which owns TikTok, or the nearly $500 billion Tencent, the world’s biggest videogame company, which also owns WeChat, a rival to WhatsApp and Facebook Messenger in some markets? These are companies far more subject to the control of a government known all over the world for its censorship and surveillance powers. Just last week, a commentary in the New York Times asked if we should “worry about how China uses apps like TikTok.” [For more on this, see the sidebar below.]

Besides the geopolitical challenge, it’s delusional to think there are once-and-for-all regulatory fixes. Kuziemski, Frahm and Schioelin cite UC Santa Cruz science and technology scholar Donna Haraway in suggesting that we need to “stay with the trouble” — in this case, the trouble being technology that keeps inconveniently evolving. Government action and laws will have to keep evolving with the technology — or include expiration dates!

We’re leaving out us

Those are just a few examples of why neither tech nor government regulation alone is the ultimate solution. It’s not either-or but partly both-and: tech reform and design as well as innovative regulation (emphasis on “innovative”), plus something glaringly missing. We, the citizens, are missing from the equation, the authors write. Let’s consider that for a moment; then I’ll suggest how we might build out our part.

Both the “technical fix” and the “antitrust fix” leave the citizens — the people without whose content and data this unruly media environment wouldn’t exist — “completely powerless,” Kuziemski, Frahm and Schioelin write. Strangely, I’d add, so do we citizens ourselves. Collectively, we’re not seeing it yet. Too often we, not just policymakers and companies, speak of ourselves and our children as mere consumers and, worse, as potential victims of the media in which we’re participating.

“Instead of concentrating on the power of corporations, we should be more concerned with the power of the people to mobilize their own visions and concerns about technologies,” the authors write. It will help to move from old adversarial approaches that exclude key views and expertise to cross-functional approaches that fold in all the stakeholders’ (at the very least the citizenry’s, tech’s and policy’s) needed perspectives. This may sound idealistic, but, for solving the problems of this complex networked world, multiple inputs are simply necessary — and increasingly so.

3 parts to our part

So what might the citizenry’s part of the equation look like? So far I see three parts:

  • Conscious citizenship: the kind Kuziemski, Frahm and Schioelin are describing, citing other thinkers’ descriptions (they’re helping to move the needle toward a definition of “digital citizenship” that is what I call “citizen-sourced”)
  • Forums for citizens to gather, reflect, deliberate, shape and advance their roles and powers. An example is the emerging forum called “All Tech Is Human,” which will likely have both digital and in-person versions. [Disclosure: I serve as an adviser to ATIH.]
  • The infrastructure of citizen support: This would have many moving parts, both national and international, sourced from citizens, governments and corporations. I suggest we think of them as parts of a whole — figuring out how they work together and fill gaps in meeting citizens’ needs (which is what forums and deliberating bodies are for). The infrastructure will include formal and informal bodies by which the citizens 1) represent their interests, 2) exercise their powers and 3) surface solutions that feed into the appropriate recipients in both governments and corporations — e.g., Australia’s eSafety Commissioner, Article19.org’s Social Media Councils, Facebook’s Oversight Board (which I believe needs to become cross-platform) and the UK government’s proposed independent regulator.

A number of infrastructure pieces are emerging in an ad hoc sort of way: a) the authors give some examples in Europe, such as the European Citizens’ Initiative; b) Tarleton Gillespie, principal researcher at Microsoft Research, recently put forth a number of ideas in Wired and the Georgetown Technology Law Review, including user-serving ombudsmen inside companies and an independent Experts Advisory Panel; c) Facebook has started developing a global independent “Oversight Board” for content takedown appeals (which I wrote about here, with some predictions for it); and d) national helplines (many of them independent NGOs or parts of NGOs) and content deletion centers around the world are answering takedown calls from citizen users in their countries and providing context and takedown requests to the platforms — all of which can contribute and pool (anonymous, aggregated) data on citizen users’ needs.

‘Technologies of humility’

Many of these are parts of what I call the new and developing “middle layer” of moderation and regulation that corporations and governments can’t do on their own. It supports and empowers citizens to report problems and participate in solutions.

We need citizens’ perspectives and wisdom too. Harvard science & society scholar Sheila Jasanoff, cited by Kuziemski, Frahm and Schioelin, calls for “technologies of humility,” which she describes as “institutionalized habits of thought” that embrace “the unknown and the uncertain” which so characterize life in this not-so-brave new world. Along with being as much about the process as the problem and focused as much on deliberation as analysis, the technologies of humility would engage the citizen “as an active, imaginative agent, as well as a source of knowledge, insight, and memory.” That points to one of the best descriptions of a “digital citizen” I’ve seen yet.

Anne Collier is founder and executive director of The Net Safety Collaborative, home of SocialMediaHelpline.com. She has been writing about youth and digital media at NetFamilyNews.org since before there were blogs and advising tech companies since 2009 (full-length bio here).

SIDEBAR: Facebook co-founder on the ‘antitrust fix’

In a commentary in the New York Times, Facebook co-founder Chris Hughes apologizes for saying it so late, but he throws in with the government-can-fix-this school of thought. He writes that “we are a nation with a tradition of reining in monopolies,” but I’m not sure tradition has the answers for this discontinuous time and this new kind of social institution called “platforms.” His commentary is largely US-centric, mentioning only at the very end Europe’s GDPR on the positive side and Chinese tech giants’ growing power on the downside. So the question remains: would breaking up this or any of the tech monopolies actually solve the geopolitical problems of surveillance and security breaches while presenting enough competition to tech giants based in authoritarian countries?

Hughes writes that Zuckerberg is a “good, kind person” but that “Mark’s power is unprecedented and un-American.” If he’s right (and with Zuckerberg’s 60% control of the company’s shares, it seems he is), then regulators need to break with antitrust tradition and 1) find a way to break up US CEOs’ power rather than the companies themselves, which have the resources to match overseas companies unaffected by US regulation, and 2) ensure US companies’ compliance with state-of-the-art privacy law and ethics that evolve with the technology — perhaps via a new regulatory body of the kind Hughes points to, not unlike what the British government calls for in its white paper. We don’t know yet. We — citizens, corporations, researchers and policymakers alike — can’t know until we can talk about this!

As sweeping (and persuasive) as Hughes’s commentary is — persuasive certainly emotionally, in this fraught time of maximum distrust of tech and tech companies — it may be just as myopic as its writer says Facebook is. We need more than government and an antitrust tradition now. We need multi-party, multi-sector global checks and balances. Can the citizens talk about that?

Related links

  • Nobel economics prize winner Jean Tirole suggests “participative antitrust,” an alternative to self-regulation and traditional government regulation “in which the industry or other parties propose possible regulations and the antitrust authorities issue some opinion, creating some legal certainty without casting the rules in stone.”
  • “My way or the highway: Zuckerberg and Warren want to fix Big Tech in different ways — and neither of them will work,” by Maciej Kuziemski, Nina Frahm and Kasper Schioelin
  • All Tech Is Human, an emerging provider of citizen forums both digital and physical — the next one in Seattle later this month
  • An incomplete solution: an article by Canadian scholars calling the UK government’s Online Harms White Paper “a responsible, if incomplete, attempt to address real social issues”
  • “We the people” not we the frogs: That’s a reference to the scary fable of the frog that jumps out of boiling water but is boiled alive when put in a pot of warm water that slowly comes to a boil. It’s used by Amy Webb, author of the just-published book The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity (thanks to Stephen Balkam, CEO of the Family Online Safety Institute, for bringing it to my attention). Her “big nine” refers to three Chinese giants (Alibaba, Baidu and Tencent) and six American ones: Google, IBM, Microsoft, Facebook, Amazon and Apple. She cautions against referencing the past to fix things now (including Senator Warren’s “fix”) and suggests we figure out, as quickly as possible, how to “incentivize” the American companies to “refine and recalibrate their [market-based] agendas so that people are at the center of what they’re building” and so that they “put safety before speed and build out their technologies in a way that…truly benefits everybody.” She writes about how the nine companies are currently incentivized (by government or market forces), how distracted the US ones are from the social good despite good intentions, and how little western governments understand the technology, the companies and the need. Also for further reading: another, more focused, just-published take on what’s happening is Coders: The Making of a New Tribe and the Remaking of the World, by Clive Thompson, which dives deep into the lives and world of computer programmers.
  • To adopt the “technologies of humility” mentioned above, Harvard scholar Sheila Jasanoff suggests we focus on four factors: “framing” (how we, people and policymakers, frame the problem); “vulnerability” (the vulnerability and resilience levels of the subjects of discussion, i.e., people); “distribution” (the distributive nature of impacts across “global societies and markets,” and, I would add, unintended consequences thereof); and “learning” that folds in multiple perspectives, never excluding the citizens’ (nor, I would add, children’s, where tech’s effects are concerned) — in her paper on citizen participation in the governance of tech and science.
  • Beyond Silicon Valley: Well-known child online safety expert John Carr in the UK recently blogged, “The world cannot continue to be corralled by a set of values that grew up on the West Coast of the USA…. There must be a better way.” Agreed, which is why the solution needs to be a worldwide ecosystem of participation from citizens (including those under 18), governments and companies — one that recognizes the value of multiple perspectives and seeks equal participation (with which I suspect he agrees).
  • From the “What, me worry?” Dept.: An academic commentator in the New York Times last week suggested why “We Should Worry about How China Uses Apps Like TikTok.” Yes, these concerns could just be the next wave of our rolling moral panic but also maybe not! So see a range of views in this quite balanced report in Metro.co.uk.
  • More on China: How WeChat might be affecting Australian elections is a big news story there, for example: “‘Uncharted territory’: WeChat’s new role in Australian public life raises difficult questions” and “How the CCP can control Aussie pollies on WeChat”
  • Algorithms as black boxes: “The Dark Secret at the Heart of AI” at MIT Technology Review — all the more reason why we need the tech perspective in the mix and, equally if not more important, diverse perspectives in tech
  • Algorithms for joy?: “Regulating these technologies requires an interdisciplinary approach involving legal, policy, social, and technical experts working closely with industry, government, and consumers to get them to work the way we want them to,” wrote MIT Media Lab director Joi Ito in Wired, adding: “What gets less attention rather than outright restriction is how we might optimize these platforms to provide joy, positive engagement, learning, and healthy communities for young people and families…. Can algorithms be optimized for learning, high-quality content, and positive intergenerational communication for young people?”
