My name is danah boyd and I'm a Principal Researcher at Microsoft Research and the founder/president of Data & Society. Buzzwords in my world include: privacy, context, youth culture, social media, big data. I use this blog to express random thoughts about whatever I'm thinking.


When Good Intentions Backfire

… And Why We Need a Hacker Mindset


I am surrounded by people who are driven by good intentions. Educators who want to inform students, who passionately believe that people can be empowered through knowledge. Activists who have committed their lives to addressing inequities, who believe that they have a moral responsibility to shine a spotlight on injustice. Journalists who believe their mission is to inform the public, who believe that objectivity is the cornerstone of their profession. I am in awe of their passion and commitment, their dedication and persistence.

Yet, I’m existentially struggling as I watch them fight for what is right. I have learned that people who view themselves through the lens of good intentions cannot imagine that they could be a pawn in someone else’s game. They cannot imagine that the values and frames that they’ve dedicated their lives to — free speech, media literacy, truth — could be manipulated or repurposed by others in ways that undermine their good intentions.

I find it frustrating to bear witness to good intentions getting manipulated, but it’s even harder to watch how those who are wedded to good intentions are often unwilling to acknowledge this, let alone start imagining how to develop the appropriate antibodies. Too many folks that I love dearly just want to double down on the approaches they’ve taken and the commitments they’ve made. On one hand, I get it — folks’ life-work and identities are caught up in these issues.

But this is where I think we’re going to get ourselves into loads of trouble.

The world is full of people with all sorts of intentions. Their practices and values, ideologies and belief systems collide in all sorts of complex ways. Sometimes, the fight is about combating horrible intentions, but often it is not. In college, my roommate used to pound a mantra into my head whenever I would get spun up about something: Do not attribute to maliciousness what you can attribute to stupidity. I return to this statement a lot when I think about how to build resilience and challenge injustices, especially when things look so corrupt and horribly intended — or when people who should be allies see each other as combatants. But as I think about how we should resist manipulation and fight prejudice, I also think that it’s imperative to move away from simply relying on “good intentions.”

I don’t want to undermine those with good intentions, but I also don’t want good intentions to be a tool that can be used against people. So I want to think about how good intentions get embedded in various practices and the implications of how we view the different actors involved.

The Good Intentions of Media Literacy

When I penned my essay “Did Media Literacy Backfire?”, I wanted to ask those who were committed to media literacy to think about how their good intentions — situated in a broader cultural context — might not play out as they would like. Folks who critiqued my essay on media literacy pushed back in all sorts of ways, both online and off. Many made me think, but some also reminded me that my way of writing was off-putting. I was accused of using the question “Did media literacy backfire?” to stoke clicks. Some snarkily challenged my suggestion that media literacy was even meaningfully in existence, asked me to be specific about which instantiations I meant (because I used the phrase “standard implementations”), and otherwise pushed for the need to double down on “good” or “high quality” media literacy. The reality is that I’m a huge proponent of their good intentions — and have long shared them — but I wrote this piece because I’m worried that good intentions can backfire.

While I was researching youth culture, I never set out to understand what curricula teachers used in the classroom. I wasn’t there to assess the quality of the teachers or the efficacy of their formal educational approaches. I simply wanted to understand what students heard and how they incorporated the lessons they received into their lives. Although the teens that I met had a lot of choice words to offer about their teachers, I’ve always assumed that most teachers entered the profession with the best of intentions, even if their students couldn’t see that. But I spent my days listening to students’ frustrations and misperceptions of the messages teachers offered.

I’ve never met an educator who thinks that the process of educating is easy or formulaic. (Heck, this is why most educators roll their eyes when they hear talk of computerized systems that can educate better than teachers.) So why do we assume that well-intended classroom lessons — or even well-designed curricula — will play out as we imagine? This isn’t simply about the efficacy of the lesson or the skill of the teacher, but the cultural context in which these conversations occur.

In many communities in which I’ve done research, the authority of teachers is often questioned. Nowhere is this more painfully visible than when well-intended highly educated (often white) teachers come to teach in poorer communities of color. Yet, how often are pedagogical interventions designed by researchers really taking into account the doubt that students and their parents have of these teachers? And how do we as educators and scholars grapple with how we might have made mistakes?

I’m not asking “Did Media Literacy Backfire?” to be a pain in the toosh, but to genuinely highlight how the ripple effects of good intentions may not play out as imagined on the ground for all sorts of reasons.

The Good Intentions of Engineers

From the outside, companies like Facebook and Google seem pretty evil to many people. They’re situated in a capitalist logic that many advocates and progressives despise. They’re opaque and they don’t engage the public in their decision-making processes, even when those decisions have huge implications for what people read and think. They’re extremely powerful and they’ve made a lot of people rich in an environment where financial inequality and instability are front and center. Primarily located in one small part of the country, they also seem like a monolithic beast.

As a result, it’s not surprising to me that many people assume that engineers and product designers have evil (or at least financially motivated) intentions. There’s an irony here because my experience is the opposite. Most product teams have painfully good intentions, shaped by utopic visions of how the ideal person would interact with the ideal system. Nothing is more painful than sitting through a product design session with design personae that have been plucked from a collection of clichés.

I’ve seen a lot of terribly naive product plans, with user experience mockups that lack any sense of how or why people might interact with a system in unexpected ways. I spent years tracking how people did unintended things with social media, such as the rise of “Fakesters,” or of teenagers who gamed Facebook’s system by inserting brand names into their posts, realizing that this would make their posts rise higher in the social network’s news feed. It has always boggled my mind how difficult it is for engineers and product designers to imagine how their systems would get gamed. I actually genuinely loved product work because I couldn’t help but think about how to break a system through unexpected social practices.

Most products and features that get released start with good intentions, but they too get munged by the system, framed by marketing plans, and manipulated by users. And then there’s the dance of chaos as companies seek to clean up PR messes (which often involves non-technical actors telling insane fictions about the product), patch bugs to prevent abuse, and throw bandaids on parts of the code that didn’t play out as intended. There’s a reason that no one can tell you exactly how Google’s search engine or Facebook’s news feed works. Sure, the PR folks will tell you that it’s proprietary code. But the ugly truth is that the code has been patched to smithereens to address countless types of manipulation and gamification (from SEO to bots). It’s quaint to read the original “PageRank” paper that Brin and Page wrote when they envisioned how a search engine could ideally work. That’s so not how the system works today.
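For a sense of just how quaint that original vision was, the idea at the heart of the PageRank paper fits in a few lines: a page’s importance is the probability that a “random surfer” lands on it. Here’s a minimal toy sketch of that idealized algorithm — emphatically not Google’s production system, which has been patched beyond recognition:

```python
# Toy sketch of the original PageRank idea (Brin & Page):
# a page's rank is the chance a random surfer ends up there, where the
# surfer follows a link with probability d and jumps to a random page
# otherwise. This illustrates the idealized model, nothing more.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - d) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += d * share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for p in pages:
                    new_rank[p] += d * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# Ranks sum to ~1.0; "c" scores highest since both "a" and "b" link to it.
```

The clean elegance of this model is exactly what got gamed — link farms, SEO, bots — and why the deployed system had to become something else entirely.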

The good intentions of engineers and product people, especially those embedded in large companies, are often dismissed as a sheen over a capitalist agenda. Yet, like many other well-intended actors, makers often feel misunderstood and maligned, assumed to have evil thoughts. And I often think that when non-tech people start by assuming that they’re evil, we lose a significant opportunity to address problems.

The Good Intentions of Journalists

I’ve been harsh on journalists lately, mostly because I find it so infuriating that a profession that is dedicated to being a check to power could be so ill-equipped to be self-reflexive about its own practices.

Yet, I know that I’m being unfair. Their codes of conduct and idealistic visions of their profession help journalists and editors and publishers stay strong in an environment where they are accustomed to being attacked. It just kills me that the culture of journalism makes those who have an important role to play unable to see how they can be manipulated at scale.

Sure, plenty of top-notch journalists are used to negotiating deception and avoidance. You gotta love a profession that persistently bangs its head against a wall of “no comment.” But journalism has grown up as an individual sport; a competition for leads and attention that can get fugly in the best of configurations. Time is rarely on a journalist’s side, just as nuance is rarely valued by editors. Trying to find “balance” in this ecosystem has always been a pipe dream, but objectivity is a shared hallucination that keeps well-intended journalists going.

Powerful actors have always tried to manipulate the news media, especially State actors. This is why the fourth estate is seen as so important in the American context. Yet, the game has changed, in part because of the distributed power of the masses. Social media marketers quickly figured out that manufacturing outrage and spectacle would give them a pathway to attention, attracting news media like bees to honey. Most folks rolled their eyes, watching as monied people played the same games as State actors. But what about the long tail? How do we grapple with the long tail? How should journalists respond to those who are hacking the attention economy?

I am genuinely struggling to figure out how journalists, editors, and news media should respond in an environment in which they are getting gamed. What I do know from twelve-step programs is that the first step is to admit that you have a problem. And we aren’t there yet. And sadly, that means that good intentions are getting gamed.

Developing the Hacker Mindset

I’m in awe of how many of the folks I vehemently disagree with are willing to align themselves with others they vehemently disagree with when they have a shared interest in the next step. Some conservative and hate groups are willing to be odd bedfellows because they’re willing to share tactics, even if they don’t share end goals. Many progressives can’t even imagine coming together with folks who have a slightly different vision, let alone a different end goal, to share tactics. Why is that?

I’m writing these essays not because I know the solutions to some of the most complex problems that we face — I don’t — but because I think that we need to start thinking about these puzzles sideways, upside down, and from non-Euclidean spaces. In short, I keep thinking that we need more well-intended folks to start thinking like hackers.

Think just as much about how you build an ideal system as how it might be corrupted, destroyed, manipulated, or gamed. Think about unintended consequences, not simply to stop a bad idea but to build resilience into the model.

As a developer, I always loved the notion of “extensibility” because it was an ideal of building a system that could take unimagined future development into consideration. Part of why I love the notion is that it’s bloody impossible to implement. Sure, I (poorly) comment my code and build object-oriented structures that would allow for some level of technical flexibility. But, at the end of the day, I’d always end up kicking myself for not imagining a particular use case in my original design and, as a result, doing a lot more band-aiding than I’d like to admit. The masters of software engineering extensibility are inspiring because they don’t just hold onto the task at hand, but have a vision for all sorts of different future directions that may never come to fruition. That thinking is so key to building anything, whether it be software or a campaign or a policy. And yet, it’s not a muscle that we train people to develop.
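For readers who haven’t lived this: the object-oriented structures I’m gesturing at often look like programming against an interface, so that futures you haven’t imagined can be bolted on without rewriting what already works. A hypothetical sketch (the names here are invented for illustration):

```python
# Hypothetical sketch of extensibility: publish() depends only on an
# abstract Formatter interface, so new output formats can be added
# later without touching the publishing logic at all.

from abc import ABC, abstractmethod


class Formatter(ABC):
    @abstractmethod
    def render(self, title: str, body: str) -> str:
        """Turn a title and body into a finished document."""


class PlainFormatter(Formatter):
    def render(self, title: str, body: str) -> str:
        return f"{title}\n{'=' * len(title)}\n{body}"


class HtmlFormatter(Formatter):
    # Added long after publish() was written; nothing else changed.
    def render(self, title: str, body: str) -> str:
        return f"<h1>{title}</h1><p>{body}</p>"


def publish(formatter: Formatter, title: str, body: str) -> str:
    # The extension point: any Formatter works, including ones
    # that don't exist yet.
    return formatter.render(title, body)


print(publish(PlainFormatter(), "Status", "All systems go."))
print(publish(HtmlFormatter(), "Status", "All systems go."))
```

The hard part, of course, is that the extension point you actually need is the one you didn’t think to design — which is exactly why extensibility is an ideal more than an achievement.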

If we want to address some of the major challenges in civil society, we need the types of people who think 10 steps ahead in chess, imagine innovative ways of breaking things, and think with extensibility at their core. More importantly, we all need to develop that sensibility in ourselves. This is the hacker mindset.

This post was originally posted on Points. It builds off of a series of essays on topics affecting the public sphere written by folks at Data & Society. As expected, my earlier posts ruffled some feathers, and I’ve been trying to think about how to respond in a productive manner. This is my attempt.

Flickr Image: CC BY 2.0-licensed image by DaveBleasdale.
