Be Careful What You Code For

Most people who don’t code don’t appreciate how hard it is to do right. Plenty of developers are perfectly functional, but to watch a master weave code into silken beauty is utterly inspiring. Unfortunately, most of the code that underpins the tools that we use on a daily basis isn’t so pretty. There is a lot of digital duct tape.


I’m a terrible programmer. Don’t get me wrong — I’m perfectly capable of mashing together code to get a sorta-kinda-somewhat reasonable outcome. But the product is inevitably a Frankensteinesque monstrosity. I’m not alone. This is why I’m concerned about the code that is being built. Not all code is created equal.

If you want to understand what we’re facing, consider what this would mean if we were constructing cities. In the digital world, we are simultaneously building bridges, sewage systems, and skyscrapers. Some of the bridge builders have civil engineering degrees, some of our sewage contractors have been plumbers in past lives, but most of the people building skyscrapers have previously only built tree houses and taken a few math classes. Oh, and there aren’t any inspectors to assess whether or not it’s all going to fall apart.

Code is key to civic life, but we need to start looking under the hood and thinking about the externalities of our coding practices, especially as we’re building code as fast as possible with few checks and balances.

Area One: Environmental Consequences

Let’s play a game of math. Almost 1 billion people use Gmail. More than that are active on Facebook each month. Over 300 million are active on Twitter each month. All social media — including Facebook and Twitter — send out notifications to tell you that you have new friend requests, likes, updates, etc. Each one of those notifications is roughly 50KB. If you’re relatively active, you might get 1MB of notifications a day. That doesn’t seem like much. But if a quarter of Gmail users get that, this means that Google hosts over 90 petabytes of notifications per year. All of that is sitting live on servers so that any user can search their email and find past emails, including the new followers they received in 2007. Is this really a good use of resources? Is this really what we want when we talk about keeping data around?
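If you want to check that figure, the back-of-envelope arithmetic looks something like this (a rough sketch using the same assumed numbers: roughly a billion accounts, a quarter of them receiving about 1MB of notifications a day):

```python
# Back-of-envelope sketch of the notification math above. The inputs are
# the article's rough assumptions, not measured figures.
gmail_users = 1_000_000_000        # ~1 billion accounts
active_fraction = 0.25             # a quarter of users are "relatively active"
notifications_per_day_mb = 1       # ~20 notifications x ~50KB each

daily_mb = gmail_users * active_fraction * notifications_per_day_mb
yearly_pb = daily_mb * 365 / 1_000_000_000   # MB -> PB, decimal units

print(f"~{yearly_pb:.0f} PB of notifications per year")   # ~91 PB
```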

The tech industry uses crazy metaphors. Artificial intelligence. Files and folders. They often have really funny roots that make any good geek giggle. (UNIX geeks, did you know that the finger command is named as such because the word was slang for a “snitch” in the 1970s? You probably had a dirtier idea in mind.)


We don’t know who started calling the cloud the cloud, but he (and it’s inevitably a he) did us all a disservice. When the public hears about the cloud, they think about the fluffy white things in the sky. “What were the skies like when you were young? They went on forever… And the skies always had little fluffy clouds.” Those clouds giveth. They offer rain, which gives us water, which is the source of life.

But what about the clouds we techies make? Those clouds take. They require rare earth metals and soak up land, power, and water. Many big companies are working hard to think about the environmental impact of data centers, to think about the carbon implications. (I’m proud to work for one of them.) Big companies still have a long way to go, but at least they’re trying. But how many developers out there are trying to write green code? At best, folks are thinking about the cost-per-computation, but most developers are pretty sloppy with code and data. And there’s no LEED certification for code. Who is going to start certifying green code!?

In the same sense, how many product designers are thinking about the environmental impact of every product design decision they make? Product folks are talking about how notifications might annoy or engage users, but not about their environmental impact. And for all those open data zealots, is the world really better off having petabytes of data sitting on live servers just to make sure it’s open and accessible just in case? It’s painful to think about how many terabytes of data are sitting in open data repositories that have never been accessed.

And don’t get me started about the blockchain or 3D printing or the Internet of Things. At least bitcoin got one thing right: this really is about mining.

Area Two: Social Consequences

In the early 2000s, Google thought that I was a truck driver. I got the best advertisements. I didn’t even know how many variations of trucker speed there were! All because I did fieldwork in parts of the country that only truckers visit. Consider how many people have received online advertisements that clearly got them wrong. Funny, huh?

Now…Have you ever been arrested? Have you ever been incarcerated?

Take a moment to think about the accuracy of our advertising ecosystem — the amount of money and data that goes into making ads right. Now think about what it means that the same techniques that advertisers are using to “predict” what you want to buy are also being used to predict the criminality of a neighborhood or a person. And those who work in law enforcement and criminal justice have less money, weaker oversight mechanisms, and fewer technical skills.

Inaccuracy and bias are often a given in advertising. But is it OK that we’re using extraordinarily biased data about previous arrests to predict future arrests and determine where police are stationed? Is it OK that we assess someone’s risk at the point of arrest and give judges recommendations for bail, probation, and sentencing? Is it OK that local law enforcement agencies are asking tech vendors to predict which children are going to commit a crime before they’re 21? Who is deciding, and who is holding them accountable?

We might have different political commitments when it comes to policing and criminal justice. But when it comes to tech and data analysis, I hope that we can all agree that accuracy matters. Yet, we’re turning a blind eye to all of the biases that are baked into the data and, thus, the models that we build.


Take a moment to consider that 96% of cases are pled out. Those defendants never see a jury of their peers. At a minimum, 10% — but most likely much more — of those who take a plea are innocent. Why? Last I saw, the average inmate at Rikers waits ~600 days for their trial to begin. Average. And who is more likely to end up not making bail? Certainly not rich white folks.

Researchers have long known that whites are more likely than blacks to use and sell drugs. And yet, who is arrested for drugs? Blacks. 13% of the US population is black, but over 60% of those in prison are black. Mostly for drug crimes.

Because blacks are more likely to be arrested — and more likely to be prosecuted and serve time — guess what our algorithms tell us about who is most likely to commit a drug crime? About where drug crimes occur? Police aren’t sent by predictive policing tools to college campuses. They’re sent to the hood.
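To make that feedback loop concrete, here is a toy sketch (not any vendor’s actual system; every number is invented) of what happens when a model is trained on arrest records rather than on underlying behavior:

```python
# Toy feedback-loop sketch. Two neighborhoods, A and B, have identical
# underlying offense rates, but A starts with more recorded arrests because
# it was policed more heavily in the past. The "model" is deliberately
# crude: patrol in proportion to recorded arrests.
true_offense_rate = {"A": 0.05, "B": 0.05}   # identical behavior
population = {"A": 10_000, "B": 10_000}
arrests = {"A": 120, "B": 40}                # biased historical record

for year in range(5):
    total = arrests["A"] + arrests["B"]
    patrol_share = {n: arrests[n] / total for n in arrests}

    # More patrols in a neighborhood means more of its (identical) offenses
    # get observed and turned into new arrest records.
    for n in arrests:
        offenses = population[n] * true_offense_rate[n]
        arrests[n] += int(offenses * 0.5 * patrol_share[n])

    print(year, {n: round(s, 2) for n, s in patrol_share.items()})

# A keeps roughly 75% of the patrols every single year, even though A and B
# behave identically: the model reproduces the bias it was fed.
```

The two neighborhoods behave identically, yet the model keeps sending three-quarters of the patrols to the one with the biased history, and every new arrest it generates confirms its own prediction.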

Engineers argue that judges and police officers should know the limits of the data they use. Some do — they’re simply ignoring these expensive, taxpayer-funded civic technologies. But in a world of public accountability, where police are punished for not knowing someone was a risk before that person shoots up a church, many feel obliged to follow the recommendations for fear of reprisal. This is how racism gets built into the structures of our systems. And civic tech is implicated in this.

I don’t care what your politics are. If you’re building a data-driven system and you’re not actively seeking to combat prejudice, you’re building a discriminatory system.

Solution: Audits and Inspection

Decisions made involving tech can have serious ramifications far outside the mind’s eye of the people developing it. We need to wake up. Our technology is powerful, and we need to be aware of the consequences of our code.

Before our industry went all perpetual beta, we used to live in a world where Test or Quality Assurance meant something. Rooted in those domains is a practice that can be understood as an internal technical audit. We need to get back to this. We need to be able to answer simple questions like:

  • Does the system that we built produce the right output given the known constraints?
  • Do we understand the biases and limitations of the system and the output?
  • Are those clear to the user so that our tool doesn’t enable poor decision-making or create inaccurate impressions?
  • What are the true social and environmental costs of the service?

We need to start making more meaningful trade-offs. And that requires asking hard questions.

Audits don’t have to be adversarial. They can be a way of honestly assessing the limitations of a system and benchmarking for improvement. This approach is not without problems and limitations, but, if you cannot understand whether a model is helping or hurting, discriminating or resulting in false positives, then you should not be implementing that technology in a high-stakes area where freedom and liberty are at stake. Stick to advertising.
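What might one of those audit checks look like in practice? Here is a minimal sketch (the field names and data are hypothetical, and a real audit needs far more than a single metric) of comparing false positive rates across groups before a risk model ever reaches a courtroom:

```python
from collections import defaultdict

def false_positive_rates(records):
    """Share of people in each group who were flagged high-risk but did not
    reoffend. Each record is a dict with hypothetical 'group',
    'predicted_high_risk', and 'reoffended' fields."""
    flagged_wrongly = defaultdict(int)
    did_not_reoffend = defaultdict(int)
    for r in records:
        if not r["reoffended"]:
            did_not_reoffend[r["group"]] += 1
            if r["predicted_high_risk"]:
                flagged_wrongly[r["group"]] += 1
    return {g: flagged_wrongly[g] / n for g, n in did_not_reoffend.items() if n}

# Tiny invented example: run the model's past predictions against known outcomes.
records = [
    {"group": "A", "predicted_high_risk": True,  "reoffended": False},
    {"group": "A", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": True},
]
print(false_positive_rates(records))   # {'A': 0.5, 'B': 0.0}
```

If one group’s rate is consistently higher, the tool is flagging innocent people unevenly, and that is exactly the kind of finding an internal audit should surface before deployment, not after.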

Technology can be amazingly empowering. But only when it is implemented in a responsible manner. Code doesn’t create magic. Without the right checks and balances, it can easily be misused. In the world of civic tech, we need to conscientiously think about the social and environmental costs, just as urban planners do.
