One of the stickiest issues we face is how to fix the internet so it isn’t a democracy-threatening amplifier of disinformation and a tool to incite racist, fascist hate and violence. It’s an old problem. John Perry Barlow’s “A Declaration of the Independence of Cyberspace” argued against any curbs on speech online, appealing to the naïve yet foundational myth that speech, like markets, would regulate itself wisely out of enlightened self-interest. But there has never been a prelapsarian Internet with no garbage to take out. Email wouldn’t function without some spam controls, and platforms have had to learn, however imperfectly, to limit the spread of child pornography and unauthorized sharing of copyrighted material, because the legal costs of not doing so were significant. The harder job is deciding what speech is unacceptable when the scale of these platforms is global and vast and both Mammon and mischief drive what speech gets the most reach. Jillian C. York takes on the complexity of that challenge in Silicon Values: The Future of Free Speech Under Surveillance Capitalism (Verso, March 2).
York works as the International Activism Director for the Electronic Frontier Foundation, which has successfully defended digital rights to privacy and free speech in court. One of its founders was, in fact, John Perry Barlow, and some have criticized the organization for taking in millions of dollars from Google and Facebook. (Yasha Levine has gone so far as to accuse the EFF of being a corporate front presenting itself as a grassroots organization. My take is that it’s an organization that represents good causes and also often finds them aligned with the goals of Big Tech, which is happy to promote those goals through cash and digital mobilization. In other words, it’s messy.)
York’s discussion of free speech online comes at an interesting moment. The illusion that social media companies and the connections they facilitate are a force for global democracy, celebrated in the Arab Spring (which, in reality, depended on local activism and IRL courage more than on engineers in California), has been replaced by fear of capitalist mind-control or of “coastal elite” censorship of conservative speech. Both visions are exaggerated and oversimplified, and in the case of conservative agita, demonstrably false, though highly successful at recruiting the grievance vote and cowing social media platforms.
Into this complex situation York wades, holding tight to two principles: free speech matters, and we can’t count on tech corporations or the state to preserve it for us. She reviews the problems with our “new gatekeepers” and their uneven and reluctant attempts to match their push for growth with responsibility for its effects. Her years of work with civil society organizations around the world position her well to take a global view and to point out how Californian assumptions simply don’t apply everywhere. Her conclusions are blunt: valuing profits over mission leads to oppression, putting ignorant people in charge of difficult decisions leads to trouble, and people with power are much less likely to be censored than those without. As she puts it, decision-making is increasingly undemocratic.
Companies like Facebook may prize ingenuity when it comes to engineering, but thinking outside the box is no longer valued in policy circles. In fact, policy hires increasingly come from government, law enforcement, or the policy teams of other corporations, which has created a revolving door through which only a certain subset of people can enter.
As such, Silicon Valley policymaking has increasingly come to resemble that of government; and given the absence of any vestige of democratic participation, that means authoritarian government. (p. 92)
A following section header puts it succinctly: media corporations are “repression’s little helpers.”
She offers an intriguing look at how trying to control terrorist recruitment messages from ISIS simultaneously erased from the public record evidence of war atrocities in Syria. She also delves into how a small number of companies police sexuality in prudish ways that suppress non-conforming gender expression and endanger sex workers. She explores how unevenly problematic content gets attention, thanks to a lack of regional and linguistic knowledge and an unwillingness to expend the resources it would take to make global companies actually responsible for the damage they cause. Moderation cannot be done at scale. You cannot hire enough people to do it (and the work is horribly damaging), and you can’t automate it.
This is a problem that will not go away. Algorithms are simply incapable of encapsulating the human experience, regardless of what Silicon Valley would have us believe. And once companies have taken humans out of the loop and relinquished the reins to machines, there is no telling the sort of cultural norms they will propagate in the future. (p. 182)
York traces the problem of hate speech to the strategies developed during Gamergate that are now playing an enormous role in the global rise of right-wing racist and anti-democratic populist movements (such as the latest version of the Republican party in the US, which is currently doing its best to rewrite election laws to disenfranchise the majority). That movement also has strong representation in Silicon Valley, where leadership is “rife with white supremacist sympathizers” (p. 201). In Myanmar, genocide; in the US, an attack on the Capitol attempting to overturn an election. World-shaking decisions about who gets to speak and how far that speech reaches are being made by a small handful of wealthy people with no accountability.
So what is the solution? York’s final chapter addresses it in general terms. Content moderation is broken and needs an independent and comprehensive audit to recommend new rules and processes. The design of technology needs greater inclusion of non-white, non-American perspectives and workers. We can’t count on corporations or governments to repair the mess; we need to “listen and learn from one another, connect the dots, and build concrete tactics and solutions that address problems holistically” (pp. 226–27). We need to tackle the social injustices that have been exacerbated by big tech.
As is common in books about the trouble with tech, there’s a lot more detail about what’s gone wrong than about how to right it. But York encourages us not to give up hope.
In order to develop a better future, we must learn from the past, and recognize that the future is not yet written, that it doesn’t have to be this way. There are countless moments throughout history where a seemingly minor decision or discovery led us on a new and different trajectory or, inversely, where the failure to act in the face of an atrocity led to even greater suffering. Our willingness and ability to study, examine, and explore our mistakes and to imagine alternative futures is imperative.
The continued erosion of democracy, free expression, and human rights is not inevitable, but can only be stopped if we take immediate action. The future is ours to write.
If you would like some concrete ideas about how to write that future, I find Ethan Zuckerman’s podcast Reimagining the Internet consistently interesting. And I recommend a book coming out on the very same day as York’s, You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape by Whitney Phillips and Ryan M. Milner, which I hope to write about later this week.