Biased tech design prompts a writer to call for resistance

Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Sara Wachter-Boettcher
Norton
2017
240 pp.

Technically Wrong, Sara Wachter-Boettcher’s new book about the prejudices, slights, and snubs built into today’s technologies, ends with an observation about political protests. Reflecting on the “weird moment” we’re having right now in America, she wonders whether marching against injustice isn’t more pressing than writing a book about bad tech design. Ultimately, she decides both are important. “[A]lienating and biased technology doesn’t matter less during this time of political upheaval,” she argues. “It matters all the more.”

It’s this resolve that drives Wachter-Boettcher’s sweeping account of the failures, biases, and shortcomings of the technology products that pervade our lives: from invasive applications designed with too few kinds of users in mind to black-boxed algorithms that determine everything from the news we see to how we dole out criminal justice.

Much of the first part of the book focuses on how the choices designers make can exclude, alienate, or antagonize entire swaths of users. Sometimes, these choices constrain how users can identify within a system, such as when a site demands gender information but only offers male/female categories. Other times, they shape how designers conceive of their users in the first place, as when teams work from user profiles that represent a very narrow set of identities or life experiences.

Midway through the book, Wachter-Boettcher shifts her attention to data and algorithms, exploring problems that arise as users and activities are automatically tracked, tagged, and sorted. Here, she shows how blind spots can skew the decisions that issue from data-driven platforms in problematic ways. For example, an image-tagging system that is not tested on photos of people with a wide range of darker skin tones will inevitably favor white users.

When tech products aren’t designed with a wide range of users in mind, they fail to be inclusive. PHOTO: TETRA IMAGES/ALAMY STOCK PHOTO

Once these biases are baked into different products, they are hard to shake. When an algorithm internalizes a stereotype, she notes, “the bias sticks around—long after we realize it’s there.” Wachter-Boettcher attributes these problems to the homogeneous culture of tech companies, which tend to be dominated by straight white men (and, to a lesser extent, straight white women). When teams are composed of people from similar backgrounds, their perspectives are limited. This translates into products that fail to be inclusive.

It’s a compelling, if not entirely unique, argument. But what to do about it? Wachter-Boettcher’s prescription for change is a familiar one: Resist. “The only way the technology industry will set reasonable, humane standards…” she writes, “…is if we stop allowing it to see itself as special—stop allowing it to skirt the law, change the rules, and obfuscate the truth.”

The cases Wachter-Boettcher discusses are not new or revelatory. Most of the examples she cites have been discussed with more nuance elsewhere. Her discussion of “algorithmic inequity,” for example, leans heavily on previous investigative work by ProPublica and on Cathy O’Neil’s more detailed discussions in Weapons of Math Destruction.

But the virtue of the text is in aggregate. Collecting tech’s horror stories in one place is useful, especially because coverage of these cases—and the rightful outrage and critique they generate—too often feels disjointed, peppered across various websites, blogs, and Twitter threads. (At a conference earlier this year, a colleague and I jokingly wondered why there wasn’t a master list somewhere of all the racist, sexist, and otherwise biased practices that developers absolutely should avoid. This book comes close.)

Unfortunately, the book’s strength is also its weakness. In weaving together so many examples, Wachter-Boettcher inadvertently flattens an otherwise disparate terrain of offenses and oppressions, from the innocuous to the truly unjust. Faulty algorithms and inaccessible design aren’t simply blips that issue from a lack of diversity in tech spaces; rather, they are symptomatic of larger, systemic problems surrounding race, gender, class, and ability—problems that won’t disappear no matter how many checkboxes a site adds to its sign-up forms. At the same time, the call to “resist” feels increasingly futile when one considers the political, economic, and social power wielded by the handful of tech giants that are capable of swaying elections or silencing researchers and journalists they don’t like.

Wachter-Boettcher isn’t blind to the disproportionate power that many tech companies hold. She acknowledges that pushing back is “a hard problem” and that making the teams that produce tech more diverse probably isn’t a panacea—even diverse teams are susceptible to bad and biased data, after all. But pushback, she argues, is a place to start.

Although many of us—Wachter-Boettcher included—have been engaged in this fight for a while, some readers may be surprised by the depth and breadth of the tech world’s toxicity. Just as the current political climate has inspired many to pick up a sign and head out into the street for the first time, I hope Technically Wrong will inspire newcomers to start thinking more critically about the apps and algorithms around them.

About the author

The reviewer is at the Information School, University of Washington, Seattle, WA 98195, USA.