It’s here! You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape by Whitney Phillips and Ryan M. Milner has been published this week (MIT Press, March 2) and it’s wonderful. Throughout the book, the authors use the natural world and the threats it faces as a metaphor for the network “pollution” that we all experience, pollution that hardens polarization, distributes misinformation, and knows no borders. This metaphor allows the authors to focus on how polluted information spreads and what to do about it rather than examining motives or assigning political blame; it also points to the unequal social burden of this pollution, similar to environmental racism. Polluted information is nothing new, but in recent years we’ve built a network system that amplifies and spreads it with great efficiency. This polluted environment (one the US is particularly responsible for building) is interconnected and shared by all of us, so restoring it to health requires a communal effort. You’re here. I’m here. What we do affects us all. The authors write:
One of the stickiest issues we face is how to fix the internet so it isn’t a democracy-threatening amplifier of disinformation and a tool to incite racist, fascist hate and violence. It’s an old problem. John Perry Barlow’s A Declaration of the Independence of Cyberspace argued against any curbs on speech online, appealing to the naïve yet foundational myth that speech, like markets, would regulate itself wisely out of enlightened self-interest; but there has never been a prelapsarian internet with no garbage to take out. Email wouldn’t function without some spam controls, and platforms have had to learn how to limit the spread of child pornography and unauthorized sharing of copyrighted material, however imperfectly, because the legal costs of not doing so were significant. The harder job is deciding what speech is unacceptable when these platforms operate at vast, global scale and both Mammon and mischief drive what speech gets the most reach. Jillian C. York takes on the complexity of that challenge in Silicon Values: The Future of Free Speech Under Surveillance Capitalism (Verso, March 2).
It’s been a moment. About five exhausting years of moments. A telescoping sequence of “oh my god” moments, from Muslim bans and kids in cages and Charlottesville and Pizzagate (ha ha, remember that goofiness? ha ha, so weird) to ever more urgent signs of imminent climate catastrophe, to a mishandled pandemic, to gamified delusional bonding online, to a mass uprising against state violence against black folks, to open insurrection by a flag-waving white supremacist alliance. The attack on the Capitol seemed like a bizarre season finale, broadcast live, one that got more and more real as the media created by the insurrectionists to memorialize their moment was scraped and assembled. It was so much worse than it first seemed, than we could at first believe. And it’s far from over.
Ever since I joined a discussion group for mysteries some fifteen years ago (4MysteryAddicts, once on Yahoo Groups and now on Groups.io), I’ve participated in the annual tradition of selecting and sharing our top ten mysteries of the year. Seeing what books others enjoyed reading is a great way of getting ideas for what to read next, which, if you’re addicted to mysteries, is a serious question. I’ll share my list here, as I’ve done most years, along with some novels and non-fiction that I also enjoyed reading. Making these lists provides a good opportunity to take stock and look back. Given what a horrid year 2020 was, the reading was surprisingly good.
I finished finishing a book recently. That is, I completed a draft of a mystery ages ago, let it sit in a quiet place for a while to ripen, and then took it out and decided it needed some fairly significant changes. Then it had to sit in a quiet place again. I made more changes, and I thought it was done, but as soon as I uploaded the chapters and saw it in a different layout (funny how that works) I discovered a dozen missing words, duplicate phrases, and other small mistakes that had hidden in plain sight, so I went through it all again. And when that was done, I made a cover, switched the setting to “public,” and uploaded a copy to the Internet Archive. Though I’m sure there are still some glitches, and possibly some glaring errors, it’s out there now, free for anyone who feels like reading a mystery.
. . . it just eats budgets, deepens inequality, and gathers enormous amounts of data (while often invading students’ privacy in new ways). Earlier this fall I read Justin Reich’s Failure to Disrupt: Why Technology Alone Can’t Transform Education, and I had the opportunity to ask the author some questions as part of Project Information Literacy’s Smart Talk series. The book is a very thorough and even-handed overview of how technology has been introduced into classrooms, first examining how these technologies approach learning (massive online courses, computer-aided personalized learning, and using technology to build learning communities) and then unpacking the problems that have hindered tech’s promise. These have only become more obvious since the pandemic sent students and teachers home and Zoom became a common verb for online interaction.
It’s always a red-letter day when Project Information Literacy comes out with a report. As a librarian, I always found them thought-provoking, reassuring (oh good, it’s not just our students who do that), and inspiring models of research methods, setting the bar for both qualitative and quantitative research on a scale most librarians can only dream of. These reports have always been great to share with faculty across the campus, who play a key role in helping students learn how to navigate information. It really helps to have solid research when you’re advocating for information literacy instruction with scholars who want evidence.
My attention span hasn’t been great lately. I snapped up some advance copies of books about tech and society (thank you Netgalley) but it’s all too easy to let my attention slip to checking the news or, worse, to Twitter, home of the social media paradox: the platform depends on attention while totally obliterating it. Tim Hwang’s new book, Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet, out in mid-October, confirms a suspicion I’ve had for a long time. Targeted advertising doesn’t work, but because it drives so much of what we think of as “the internet” today, its approaching failure threatens widespread damage to our entire information infrastructure.
While doing some research, my partner got into a trove of old newspaper content. He idly tried searching family names and found an amazing number of mentions: birth announcements that listed every sibling’s name, meetings and election of officers of local organizations, a new job, the minor injury of a child (now in his seventies) in an accident. There was even a short article about a local man who was mentioned in a textbook my father wrote. Something important to an individual was considered worth sharing with the wider public.
The New York Technical Services Librarians, an organization that has been active since 1923 – imagine all that has happened in tech services since 1923! – invited me to give a talk about bias in algorithms. They quickly got a recording up on their site and I am, more slowly, providing the transcript. Thanks for the invite and all the tech support, NYTSL!
The Bigot in the Machine: Bias in Algorithmic Systems
Abstract: We are living in an “age of algorithms.” Vast quantities of information are collected, sorted, shared, combined, and acted on by proprietary black boxes. These systems use machine learning to build models and make predictions from data sets that may be out of date, incomplete, and biased. We will explore the ways bias creeps into information systems, look at how “big data,” artificial intelligence, and machine learning often amplify bias unwittingly, and consider how these systems can be deliberately exploited by actors for whom bias is a feature, not a bug. Finally, we’ll discuss ways we can work with our communities to create a fairer and more just information environment.
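The core mechanism the abstract describes, a model inheriting bias from its training data, can be sketched in a few lines. This is a toy illustration with invented data (the groups, the hiring records, and the rates are all hypothetical, not from the talk): a naive model that learns from skewed historical decisions simply reproduces the skew.

```python
# Toy illustration (hypothetical data): a model trained on biased
# historical decisions reproduces that bias in its predictions.

# Historical hiring records: (group, qualified, hired). Group "B"
# candidates were hired at a lower rate even when equally qualified.
history = [
    ("A", True, True), ("A", True, True), ("A", True, True), ("A", True, False),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, False),
]

def hire_rate(records, group):
    """Fraction of qualified candidates from `group` who were hired."""
    outcomes = [hired for g, qualified, hired in records if g == group and qualified]
    return sum(outcomes) / len(outcomes)

# A naive "model" that predicts by matching historical rates per group
# inherits the disparity baked into its training data.
model = {g: hire_rate(history, g) for g in ("A", "B")}
print(model)  # {'A': 0.75, 'B': 0.25}
```

Nothing in the code looks at qualifications once the rates are learned; the disparity in past decisions becomes the disparity in future predictions, which is the sense in which these systems amplify bias unwittingly.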