
There was a bomb on my block.

I live in Manhattan, in Chelsea, on 27th Street between 6th and 7th, the same block where the second IED was found. It was a surreal weekend, but it has become increasingly depressing as the media moves from providing information to stoking fear, the exact response that makes these events so effective. I’m not afraid of bombs. I’m afraid of cars. And I’m increasingly afraid of American media.

After hearing the bomb go off on 23rd and getting flooded with texts on Saturday night, I decided to send a few notes that I was OK and turn off my phone. My partner is Israeli. We’ve been there for two wars and he’s been there through countless bombs. We both knew that getting riled up was of no help to anyone. So we went to sleep. I woke up on Sunday, opened my blinds, and was surprised to see an obscene number of men in black with identical body types, identical haircuts, and identical cars. It looked like the weirdest casting call I’ve ever seen. And no one else. No cars, no people. As always, Twitter had an explanation so we settled into our PJs and realized it was going to be a strange day.

Flickr / Sean MacEntree

As other people woke up, one thing quickly became apparent — because folks knew we were in the middle of it, they wanted to reach out to us because they were worried and scared. We kept shrugging everything off, focusing on getting back to normal and reading the news for updates about how we could maneuver around our neighborhood. But ever since a suspect was identified, the coverage has gone into hyperventilation mode. And I just want to scream in frustration.

The worst part about having statistical training is that it’s hard to hear people get anxious about fears without putting them into perspective. ~100 people die every day in car crashes in the United States. That’s 33,804 deaths in a year. Thousands of people are injured every day by cars. Cars terrify me. And anyone who says that you have control over a car accident is full of shit; most car deaths and injuries are not the harmed person’s fault.

The worst part about being a parent is having to cope with the uncontrollable, irrational, everyday fears that creep up, unwarranted, just to plague a moment of happiness. Will he choke on that food? What if he runs away and gets hit by a car? What if he topples over that chair? The best that I can do is breathe in, breathe out, and remind myself to find my center, washing away those fears with each breath.

And the worst part about being a social scientist is understanding where others’ fears come from, understanding the power of those fears, and understanding the cost of those fears to the well-being of a society. And this is where I get angry, because this is where control and power lie.

Traditional news media has a lot of say in what it publishes. This is one of the major things that distinguishes it from social media, which propagates the fears and anxieties of the public. And yet, time and time again, news media shows itself to be irresponsible, motivated more by the attention and money that it can obtain by stoking people’s fears than by a moral responsibility to help ground an anxious public.

I grew up on the internet. I grew up with the mantra “don’t feed the trolls.” I always saw this as a healthy meditation for navigating the internet, for focusing on the parts of the internet that are empowering and delightful. Increasingly, I keep thinking that this is a meditation that needs to be injected into the news ecosystem. We all know that the whole concept of terrorism is to provoke fear in the public. So why are we not holding news media accountable for opportunistically aiding and abetting terroristic acts? Our cultural obsession with reading news that makes us afraid parallels our cultural obsession with crises.

There’s a reason that hate is growing in this country. And, in moments like this, I’m painfully reminded that we’re all contributing to the culture of hate. When we turn events like what happened this weekend in NY/NJ into spectacle, when we encourage media to write stories about how afraid people are, when we read the stories of how the suspect was an average person until something changed, we give the news media license to stoke up fear. And when they are encouraged to stoke fear, they help turn our election cycle into reality TV and enable candidates to spew hate for public entertainment. We need to stop blaming what’s happening on other people and start taking responsibility.

In short, we all need to stop feeding the trolls.

Be Careful What You Code For

Most people who don’t code don’t appreciate how hard it is to do right. Plenty of developers are perfectly functional, but to watch a master weave code into silken beauty is utterly inspiring. Unfortunately, most of the code that underpins the tools that we use on a daily basis isn’t so pretty. There is a lot of digital duct tape.

CC BY-NC 2.0-licensed photo by Dino Latoga.

I’m a terrible programmer. Don’t get me wrong — I’m perfectly capable of mashing together code to get a sorta-kinda-somewhat reasonable outcome. But the product is inevitably a Frankensteinesque monstrosity. I’m not alone. This is why I’m concerned about the code that is being built. Not all code is created equal.

If you want to understand what we’re facing, consider what this would mean if we were constructing cities. In the digital world, we are simultaneously building bridges, sewage systems, and skyscrapers. Some of the bridge builders have civil engineering degrees, some of our sewage contractors have been plumbers in past lives, but most of the people building skyscrapers have previously only built tree houses and taken a few math classes. Oh, and there aren’t any inspectors to assess whether or not it’s all going to fall apart.

Code is key to civic life, but we need to start looking under the hood and thinking about the externalities of our coding practices, especially as we’re building code as fast as possible with few checks and balances.

Area One: Environmental Consequences

Let’s play a game of math. Almost 1 billion people use Gmail. More than that are active on Facebook each month. Over 300 million are active on Twitter each month. All social media — including Facebook and Twitter — send out notifications to tell you that you have new friend requests, likes, updates, etc. Each one of those notifications is roughly 50KB. If you’re relatively active, you might get 1MB of notifications a day. That doesn’t seem like much. But if a quarter of Gmail users get that, it means that Google hosts over 90 petabytes of notifications per year. All of that sits live on servers so that any user can search their email and find past messages, including the new followers they received in 2007. Is this really a good use of resources? Is this really what we want when we talk about keeping data around?
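If you want to check that arithmetic yourself, here’s a rough back-of-the-envelope sketch. The inputs are just the figures from the paragraph above (a billion users, ~50KB per notification, a quarter of users getting ~1MB a day), not measurements of anything real.

```python
# Back-of-the-envelope check of the numbers above (assumptions, not measurements)

GMAIL_USERS = 1_000_000_000          # "almost 1 billion people use Gmail"
ACTIVE_FRACTION = 0.25               # "a quarter of Gmail users"
NOTIFICATIONS_PER_DAY_MB = 1         # ~20 notifications at ~50KB each
DAYS_PER_YEAR = 365

total_mb = GMAIL_USERS * ACTIVE_FRACTION * NOTIFICATIONS_PER_DAY_MB * DAYS_PER_YEAR
total_pb = total_mb / 1_000_000_000  # MB -> PB, decimal units

print(f"~{total_pb:.0f} petabytes of notifications per year")  # ~91 PB
```

Even if these guesses are off by half, the point stands: notifications alone add up to tens of petabytes sitting on live storage, indefinitely.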

The tech industry uses crazy metaphors. Artificial intelligence. Files and folders. They often have really funny roots that make any good geek giggle. (UNIX geeks, did you know that the finger command got its name because that word was slang for a “snitch” in the 1970s? You probably had a dirtier idea in mind.)

CC BY 2.0-licensed photo by Pattys-photos.

We don’t know who started calling the cloud the cloud, but he (and it’s inevitably a he) did us all a disservice. When the public hears about the cloud, they think about the fluffy white things in the sky. “What were the skies like when you were young? They went on forever… And the skies always had little fluffy clouds.” Those clouds giveth. They offer rain, which gives us water, which is the source of life.

But what about the clouds we techies make? Those clouds take. They require rare earth metals and soak up land, power, and water. Many big companies are working hard to think about the environmental impact of data centers, to think about the carbon implications. (I’m proud to work for one of them.) Big companies still have a long way to go, but at least they’re trying. But how many developers out there are trying to write green code? At best, folks are thinking about the cost-per-computation, but most developers are pretty sloppy with code and data. And there’s no LEED-certified code. Who is going to start certifying LEED code!?

In the same sense, how many product designers are thinking about the environmental impact of every product design decision they make? Product folks are talking about how notifications might annoy or engage users but not the environmental impact of them. And for all those open data zealots, is the world really better off having petabytes of data sitting on live servers just to make sure it’s open and accessible just in case? It’s painful to think about how many terabytes of data are sitting in open data repositories that have never been accessed.

And don’t get me started about the blockchain or 3D printing or the Internet of Things. At least bitcoin got one thing right: this really is about mining.

Area Two: Social Consequences

In the early 2000s, Google thought that I was a truck driver. I got the best advertisements. I didn’t even know how many variations of trucker speed there were! All because I did fieldwork in parts of the country that only truckers visit. Consider how many people have received online advertisements that clearly got them wrong. Funny, huh?

Now…Have you ever been arrested? Have you ever been incarcerated?

Take a moment to think about the accuracy of our advertising ecosystem — the amount of money and data that goes into making ads right. Now think about what it means that the same techniques that advertisers are using to “predict” what you want to buy are also being used to predict the criminality of a neighborhood or a person. And those who work in law enforcement and criminal justice have less money, weaker oversight mechanisms, and fewer technical skills than advertisers do.

Inaccuracy and bias are often a given in advertising. But is it OK that we’re using extraordinarily biased data about previous arrests to predict future arrests and determine where police are stationed? Is it OK that we assess someone’s risk at the point of arrest and give judges recommendations for bail, probation, and sentencing? Is it OK that local law enforcement agencies are asking tech vendors to predict which children are going to commit a crime before they’re 21? Who is deciding, and who is holding them accountable?

We might have different political commitments when it comes to policing and criminal justice. But when it comes to tech and data analysis, I hope that we can all agree that accuracy matters. Yet, we’re turning a blind eye to all of the biases that are baked into the data and, thus, the models that we build.

CC BY-NC 2.0-licensed photo by Thomas Hawk.

Take a moment to consider that 96% of cases are pled out. Those defendants never see a jury of their peers. At a minimum, 10% — but most likely much more — of those who take a plea are innocent. Why? Last I saw, the average inmate at Rikers waits ~600 days for their trial to begin. Average. And who is more likely to end up not making bail? Certainly not rich white folks.

Researchers have long known that whites are more likely than blacks to use and sell drugs. And yet, who is arrested for drugs? Blacks. 13% of the US population is black, but over 60% of those in prison are black. Mostly for drug crimes.

Because blacks are more likely to be arrested, more likely to be prosecuted, and more likely to serve time, guess what our algorithms tell us about who is most likely to commit a drug crime? About where drug crimes occur? Police aren’t sent by predictive policing tools to college campuses. They’re sent to the hood.
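To see how circular this is, consider a toy simulation. This is not any real predictive-policing product; the two neighborhoods, their offense rates, and the patrol shares are invented purely to illustrate the feedback loop between where police are sent and what the arrest data then says.

```python
# Toy illustration (not any real system): if arrests, not offenses, are the
# training signal, a "predictive" model just sends police back to wherever
# police already were.
import random
random.seed(0)

offense_rate = {"neighborhood_a": 0.10, "neighborhood_b": 0.10}  # identical underlying behavior
patrol_share = {"neighborhood_a": 0.8, "neighborhood_b": 0.2}    # historically uneven policing
arrests = {"neighborhood_a": 0, "neighborhood_b": 0}

for _ in range(10):  # each round: arrests happen where patrols are, then patrols follow arrests
    for hood in arrests:
        # an arrest requires an offense to occur *and* a patrol to be there to see it
        arrests[hood] += sum(
            random.random() < offense_rate[hood] * patrol_share[hood]
            for _ in range(1000)
        )
    total = sum(arrests.values())
    patrol_share = {h: arrests[h] / total for h in arrests}  # "predictive" allocation

print(arrests)  # neighborhood_a dominates, despite identical offense rates
```

The two neighborhoods behave identically, but because the model only ever sees arrests, the historically over-policed one keeps “predicting” more crime, and the allocation never corrects itself.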

Engineers argue that judges and police officers should know the limits of the data they use. Some do — they’re simply ignoring these expensive, taxpayer-funded civic technologies. But in a world of public accountability, where police are punished for not knowing someone was a risk before that person shoots up a church, many feel obliged to follow the recommendations for fear of reprisal. This is how racism gets built into the structures of our systems. And civic tech is implicated in this.

I don’t care what your politics are. If you’re building a data-driven system and you’re not actively seeking to combat prejudice, you’re building a discriminatory system.

Solution: Audits and Inspection

Decisions made involving tech can have serious ramifications that were never in the mind’s eye of the people doing the development. We need to wake up. Our technology is powerful, and we need to be aware of the consequences of our code.

Before our industry went all perpetual beta, we used to live in a world where Test or Quality Assurance meant something. Rooted in those domains is a practice that can be understood as an internal technical audit. We need to get back to this. We need to be able to answer simple questions like:

  • Does the system that we built produce the right output given the known constraints?
  • Do we understand the biases and limitations of the system and the output?
  • Are those clear to the user so that our tool cannot enable poor decision-making or inaccurate impressions?
  • What are the true social and environmental costs of the service?

We need to start making more meaningful trade-offs. And that requires asking hard questions.

Audits don’t have to be adversarial. They can be a way of honestly assessing the limitations of a system and benchmarking for improvement. This approach is not without problems and limitations, but if you cannot understand whether a model is helping or hurting, discriminating or producing false positives, then you should not be implementing that technology in a high-stakes area where freedom and liberty are on the line. Stick to advertising.
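As one concrete example of what such an audit could check: who bears the errors? The sketch below compares false positive rates across groups. The records, group labels, and field names are all made up; the point is only that this question can be asked of any scoring system before it is deployed.

```python
# A minimal sketch of one audit question, using invented example data:
# does the model produce more false positives for one group than another?
from collections import defaultdict

# each record: (group, model_flagged_high_risk, actually_reoffended) -- hypothetical labels
records = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, True),
]

false_positives = defaultdict(int)
negatives = defaultdict(int)

for group, predicted_positive, actual_positive in records:
    if not actual_positive:                 # only people who did NOT reoffend can be false positives
        negatives[group] += 1
        if predicted_positive:
            false_positives[group] += 1

for group in sorted(negatives):
    rate = false_positives[group] / negatives[group]
    print(f"{group}: false positive rate = {rate:.0%}")
```

If those rates diverge wildly, that is exactly the kind of finding an audit should surface before the system gets anywhere near bail, sentencing, or patrol decisions.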

Technology can be amazingly empowering. But only when it is implemented in a responsible manner. Code doesn’t create magic. Without the right checks and balances, it can easily be misused. In the world of civic tech, we need to conscientiously think about the social and environmental costs, just as urban planners do.