
Panicked about Kids’ Addiction to Tech? Here are two things you could do


(This was originally posted on NewCo Shift)

Ever since key Apple investors challenged the company to address kids’ phone addiction, I’ve gotten a stream of calls asking me to comment on the topic. Mostly, I want to scream. I wrote extensively about the unhelpful narrative of “addiction” in my book It’s Complicated: The Social Lives of Networked Teens. At the time, the primary concern was social media. Today, it’s the phone, but the same story still stands: young people are using technology to communicate with their friends non-stop at a point in their lives when everything is about sociality and understanding their place in the social world.

As much as I want to yell at all of the parents around me to chill out, I’m painfully and acutely aware of how ineffective this is. Parents don’t like to see that they’re part of the problem or that their efforts to protect and help their children might backfire. (If you want to experience my frustration in full color, watch the Black Mirror episode “Arkangel.”)

Lately, I’ve been trying to find smaller interventions that can make a huge difference, tools that parents can use to address the problems they panic about. So let me offer two approaches to “addiction” that work at different ages.

Parenting the Small People: Verbalizing Tech Use

In the early years, children learn values and norms by watching their parents and other caregivers. They emulate our language and our facial expressions, our quirky habits and our tastes. There’s nothing more satisfying and horrifying than listening to your child repeat something you say all too often. Guess what? They also get their cues about technology from the people around them. A child would need to be alone in the woods to miss that people love their phones. From the time that they’re born, people are shoving phones in their faces to take pictures, turning to their phones to escape, and obsessively talking on their phones while ignoring their kids. Of course kids want the attention that they see the phone taking away. And of course they want the device to be special to them.

So, here’s what I recommend to parents of small people: Verbalize what you’re doing with your phone. Whenever you pick up your phone (or other technologies) in front of your kids, say what you’re doing. And involve them in the process if they’d like.

  • “Mama’s trying to figure out how long it will take to get to Bobby’s house. Want to look at the map with me?”
  • “Daddy’s checking out the weather. Do you want to see what it says?”
  • “Mom wants to take a picture of you. Is that OK?”
  • “Papa needs a break and wants to read the headlines of the New York Times. Do you want me to read them to you?”
  • “Mommy got a text message from Mama and needs to respond. Should I tell her something from you too?”

The funny thing about verbalizing what you’re doing is that you’ll start to check your own decisions to grab that phone. Somehow, it’s a lot less comfy saying: “Mom’s going to check work email because she can’t stop looking in case something important happens.” Once you begin saying out loud every time you look at technology, you also realize how much you’re looking at technology, and what you’re normalizing for your kids. It’s like looking in a mirror and realizing what they’re learning. So check yourself and check what you’ve normalized. Are you cool with the values and norms you’ve set?

Parenting the Mid-Size People: Household Contracts

I can’t tell you how many parents have told me that they have a rule in their house that their kids can’t use technology until X, where X could be “after dinner” or “after homework is done” or any other marker. And yet, consistently, when I ask them if they put away their own phones during dinner or until after they’ve bathed, they look at me like I’m an alien. Teenagers loathe hypocrisy. It’s the biggest thing I’ve seen undermine trust between a parent and a child. And boy do they have a lot to say about their parents’ addiction to their phones. Oy vay.

So if you want to curb your child’s technology use, here’s what I propose: Create a household contract. This is a contract that sets the boundaries for everyone in the house — parents and kids.

Ask your teenage or tween child to write the first draft of the contract, stipulating what they think the rules should be for everyone in the house, what they’re willing to trade off to get technology privileges, and what they think parents should trade off. Ask them to list the consequences, for everyone in the house, of not abiding by the household rules. (As a parent, you can think through or sketch the terms you think are fair, but you should not present them first.) Ask your child to pitch to you what the household rules should be. You will most likely be shocked that their rules are stricter and more structured than you expected.

And then start the negotiation process. You may want to argue that you should have the right to look at the phone when it’s ringing in case it’s grandma calling, but then your daughter should have the right to look at her phone to see if her best friend is trying to reach her. That kind of thing. Work through the process, but have your child lead it rather than dictating it yourself. And then write up those rules and hang them up in the house as a contract that can be renegotiated at different times.

Parenting Past Addiction

Many people have unhealthy habits and dynamics in their life. Some are rooted in physical addiction. Others are habitual or psychological crutches. But across that spectrum, most people are aware when something that they’re doing isn’t healthy. They may not be able to stop. Or they may not want to stop. Untangling that is part of the challenge. When you feel as though your child has an unhealthy relationship with technology (or anything else in their life), you need to start by asking if they see things the same way you do. When parents feel that what their child is doing is unhealthy, but the child does not, the intervention has to be quite different than when the child is also concerned about the issue. There are plenty of teens out there who know that their psychological desire to talk non-stop with their friends, for fear of missing out, is putting them in a bad place. Help them through that process and work with them on the strategies they can develop to cope. Helping them build those coping skills for the long term will help them a lot more than just putting rules into place.

When there is a disconnect between a parent’s and a child’s views on a situation, the best thing a parent can do is try to understand why the disconnect exists. Is it about pleasure seeking? Is it about fear of missing out? Is it about the emotional bond of friendship? Is it about a parent’s priorities being at odds with a child’s priorities? What comes next is fundamentally about values in parenting. Some parents believe that they are the masters of the house and their demands rule the day. Others acquiesce to their children’s desires with no pushback. The majority of parents are somewhere in between. But at the end of the day, parenting is about helping children navigate the world and supporting them as they develop agency in a healthy manner. So I would strongly recommend that parents focus their energies on negotiating a path that allows children to buy in and understand why boundaries are being set. That requires communication and energy, not a new technology to police boundaries for you. More often than not, the latter sends the wrong message and backfires, not unlike the Black Mirror episode I mentioned earlier.

Good luck parents — parenting is a non-stop adventure filled with both joy and anxiety.

Beyond the Rhetoric of Algorithmic Solutionism

(This was originally posted on Medium)

If you ever hear that implementing algorithmic decision-making tools to enable social services or other high-stakes government decision-making will increase efficiency or reduce the cost to taxpayers, know that you’re being lied to. When implemented ethically, these systems cost more. And they should.

Whether we’re talking about judicial decision-making (e.g., “risk assessment scoring”) or modeling who is at risk for homelessness, algorithmic systems don’t simply cost money to implement. They cost money to maintain. They cost money to audit. They cost money to evolve with the domain that they’re designed to serve. They cost money to train their users to use the data responsibly. Above all, they make visible the brutal pain points and root causes in existing systems that require an increase in services.

Otherwise, all that these systems are doing is helping divert taxpayer money from direct services into the pockets of for-profit entities, under the illusion of helping people. Worse, they’re helping usher in a diversion of liability because, time and time again, those in powerful positions blame the algorithms.

This doesn’t mean that these tools can’t be used responsibly. They can. And they should. The insights that large-scale data analysis can offer are inspiring. The opportunity to help people by understanding the complex interplay of contextual information is invigorating. Any social scientist with a heart desperately wants to understand how to relieve inequality and create a fairer and more equitable system. So of course there’s a desire to jump in and try to make sense of the data out there to make a difference in people’s lives. But to treat data analysis as a savior to a broken system is woefully naive.

Doing so obfuscates the financial incentives of those who are building these services, the deterministic rhetoric that they use to justify their implementation, the opacity that results from having non-technical actors try to understand technical jiu-jitsu, and the stark reality of how technology is used as a political bludgeoning tool. Even more frustratingly, what data analysis does well is open up opportunities for experimentation and deeper exploration. But in a zero-sum context, that means that the resources to do something about what is learned are siphoned off to the technology. And, worse, because the technology is supposed to save money, there is no budget for using that data to actually help people. Instead, technology becomes a mirage. Not because the technology is inherently bad, but because of how it is deployed and used.

READ THIS BOOK!

Next week, a new book that shows the true cost of these systems is being published. Virginia Eubanks’ book “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor” is a deeply researched accounting of how algorithmic tools are integrated into services for welfare, homelessness, and child protection. Eubanks goes deep with the people and families who are targets of these systems, telling their stories and experiences in rich detail. Further, drawing on interviews with social services clients and service providers alongside the information provided by technology vendors and government officials, Eubanks offers a clear portrait of just how algorithmic systems actually play out on the ground, despite all of the hope that goes into their implementation.

Eubanks eschews the term “ethnography” because she argues that this book is immersive journalism, not ethnography. Yet, from my perspective as a scholar and a reader, this is the best ethnography I’ve read in years. “Automating Inequality” does exactly what a good ethnography should do — it offers a compelling account of the cultural logics surrounding a particular dynamic, and invites the reader to truly grok what’s at stake through the eyes of a diverse array of relevant people. Eubanks brings you into the world of technologically mediated social services and helps you see what this really looks like on the ground. She showcases the frustration and anxiety that these implementations produce; the ways in which both social services recipients and taxpayers are screwed by the false promises of these technologies. She makes visible the politics and the stakes, the costs and the hope. Above all, she brings the reader into the stark and troubling reality of what it really means to be poor in America today.

“Automating Inequality” is on par with Barbara Ehrenreich’s “Nickel and Dimed” or Matthew Desmond’s “Evicted.” It’s rigorously researched, phenomenally accessible, and utterly humbling. While there are a lot of important books that touch on the costs and consequences of technology through case studies and well-reasoned logic, this book is the first one that I’ve read that really pulls you into the world of algorithmic decision-making and inequality, like a good ethnography should.

I don’t know how Eubanks chose her title, but one of the subtle things about her choice is that she’s (unintentionally?) offering a fantastic backronym for AI. Rather than thinking of AI as “artificial intelligence,” Eubanks effectively builds the case that, in practice, AI often means “automating inequality.”

This book should be mandatory for anyone who works in social services, government, or the technology sector, because it forces you to really think about what algorithmic decision-making tools are doing to our public sector and the costs this imposes on the people who are supposedly being served. It’s also essential reading for taxpayers and voters who need to understand why technology is not the panacea it’s often purported to be. Or rather, how capitalizing on the benefits of technology will require serious investment and a deep commitment to improving the quality of social services, rather than a tax cut.

Please please please read this book. It’s too important not to.

Data & Society will also be hosting Virginia Eubanks to talk about her book on January 17th at 4PM ET. She will be in conversation with Julia Angwin and Alondra Nelson. The event is sold out, but it will be livestreamed online. Please feel free to join us there!