
Deception + fear + humiliation != education

I hate fear-based approaches to education. I grew up on the “this is your brain on drugs” messages and watched classmates go from being afraid of drugs to trying marijuana to deciding that all of the messages about drugs were idiotic. (Crystal meth and marijuana shouldn’t be in the same category.) Much to my frustration, adults keep turning to fear to “educate” the kids with complete disregard for the unintended consequences of this approach. Sometimes, it’s even worse. I recently received an email from a friend of mine (Chloe Cockburn) discussing an issue brought before the ACLU. She gave me permission to share it with you:

A campus police officer has been presenting programs to middle and high school assemblies about the dangers inherent in using the internet. As part of her presentation she displays pictures that students have posted on their Facebook pages. The idea is to demonstrate that anyone can have access to this information, so be careful. She gains access to the students’ Facebook pages by creating false profiles claiming to be a student at the school and asking to be “friended”, evidently in violation of Facebook policy.

An ACLU affiliate received a complaint from a student at a small rural high school. The entire assembly was shown a photo of her holding a beer. The picture was not on the complainant’s Facebook page, but on one belonging to a friend of hers, who allowed access to the bogus profile created by the police officer. The complainant was not formally punished, but she was humiliated, and she is afraid that she will not get some local scholarship aid as a result.

So here we have a police officer intentionally violating Facebook’s policy and creating a deceptive profile to entrap teenagers and humiliate them to “teach them a lesson”??? Unethical acts + deception + fear + humiliation != education. This. Makes. Me. Want. To. Scream.

Pew Research confirms that youth care about their reputation

In today’s discussions about privacy, “youth don’t care about privacy” is an irritating but popular myth. Embedded in this rhetoric is the belief that youth are reckless risk-takers who don’t care about the consequences of their actions. This couldn’t be further from the truth.

In my own work, I’ve found that teenagers care deeply about privacy in that they care about knowing how information flows and wanting influence over it. They care deeply about their reputation and leverage the tools available to help shape who they are. Of course, reputation and privacy always come back to audience. And audience is where we continuously misunderstand teenagers. They want to make sure that people they respect or admire think highly of them. But this doesn’t always mean that they care about how YOU think about them. So a teenager may be willing to sully their reputation as their parents see it if it gives them street cred that makes them cool amongst their peers. This is why reputation is so messy. There’s no universal reputation, no universal self-presentation. It’s always about audience.

The teenagers that I first started interviewing in 2004 are now young adults. Many are in college or in the army and their views on their reputation have matured. How they think about privacy and information flow has also matured. They’re thinking about a broader world. At the same time, they’re doing so having developed an understanding of these challenges through their engagement with social media. Are their ideas about these technologies perfect? Of course not. But they’re a whole lot more nuanced than those of most adults that I talk with.

Earlier today, Pew Research Center’s Internet and American Life Project released a report entitled “Reputation Management and Social Media,” which includes a slew of data that might seem counter-intuitive to adults who have really skewed, mythical views of youth and young adults. They found that young adults are more actively engaged in managing what they share online than older adults. In fact, 71% of the 18-29s interviewed in August-September of 2009 who use social network sites reported having changed their privacy settings (vs. 55% of those 50-64). Think about that. This was before Time Magazine put privacy on their front page.

Now, let’s be clear… Young adults are actively engaged in managing their reputation but they’re not always successful. The tools are confusing and companies continue to expose them without them understanding what’s happening. But the fact that they go out of their way to try to shape their information is important. It signals very clearly that young adults care deeply about information flow and reputation.

Reputation matters. This is why Pew found that 47% of 18-29s delete comments made by others on their profiles (vs. 29% of 30-49s and 26% of 50-64s). Likewise, 41% of them remove their name from photos (vs. 24% of 30-49s and 18% of 50-64s). While Pew didn’t collect data on those under 18, I’d expect that this age-wise trend would continue into that age bracket. Much of this is because of digital literacy – the younger folks understand the controls better than the older folks AND they understand the implications better. We spend a lot more time telling teenagers and young adults that there are consequences to reputation when information is put up online than we do listening to ourselves. This is also because, as always, youth are learning the hard way. As Pew notes, young adults have made mistakes that they regret. They’ve also seen their friends make mistakes that they regret. All of this leads to greater consciousness about these issues and a deeper level of engagement.

As always, this Pew report is filled to the brim with useful information that gives us a sense of what’s going on. Here are some of my favorite bullet points:

  • Young adults are still more likely than older users to say they limit the amount of information available about them online.
  • Those who know more, worry more. And those who express concern are twice as likely to say they take steps to limit the amount of information available about them online.
  • The most visible and engaged internet users are also most active in limiting the information connected to their names online.
  • The more you see footprints left by others, the more likely you are to limit your own.
  • Those who take steps to limit the information about them online are less likely to post comments online using their real name.
  • More than half of social networking users (56%) have “unfriended” others in their network.
  • Just because we’re friends doesn’t mean I’m listening: 41% of social networking users say they filter updates posted by some of their friends.
  • Young adult users of social networking sites report the lowest levels of trust in them.

This Pew report does more than inform us about privacy and reputation issues. Its data sends an important message: We need more literacy about these issues. Ironically, I think that the best thing that’s going to come out of Facebook’s ongoing screw-ups is an increased awareness of privacy issues. When youth see that they can do only one of two things with their interests – delete them or make them publicly visible to everyone – they’re going to think twice. Sure, many will still make a lot of that content publicly accessible. And others will be very angry at Facebook for not giving them a meaningful choice. But this is going to force people to think about these issues. And the more people think about it, the more they actively try to control what’s going on. (Of course, we need Facebook to stop taking controls away from people, but that’s a different story.)

Pew’s report also counters a lot of myths that I’ve been hearing. For example, the desire for anonymity isn’t dead. Facebook tends to proudly announce that its users are completely honest about their names. Guess what? Many youth don’t trust Facebook. And they’re not providing them with real names either. Just take a look at this screen shot that I grabbed from a publicly accessible Facebook profile. This image isn’t doctored and while some of the names reflect real ones, there’s a lot of obscuring going on.

If you care about youth, if you care about issues of privacy and reputation, PLEASE read the Pew report. It is an example of brilliant research and tremendous reporting.

Quitting Facebook is pointless; challenging them to do better is not

I’ve been critiquing moves made by Facebook for a long time and I’m pretty used to them being misinterpreted. When I lamented the development of the News Feed, many people believed that I thought that the technology was a failure and that it wouldn’t be popular. This was patently untrue. I was bothered by it precisely because I knew that it would be popular, precisely because people love to gossip and learn about others, often to their own detriment. It was hugely disruptive and, when it launched, users lacked the controls necessary to really manage the situation effectively. Facebook responded with controls and people were able to find a way of engaging with Facebook with the News Feed as a given. But people were harmed in the transition.

Last week, I offered two different critiques of the moves made by Facebook, following up on my SXSW talk. Both have been misinterpreted in fascinating ways. Even news agencies are publishing statements like: “Microsoft wants Facebook to be regulated as a utility.” WTF? Seriously? Le sigh. (For the record, I’m not speaking on behalf of my employer nor do I want regulation; I think that it’s inevitable and I think that we need to contend with it. Oh, and I don’t think that the regulation that we’ll see will at all resemble the ways in which utilities are regulated. I was talking about utilities because that’s how Facebook frames itself. But clearly, most folks missed that.) Misinterpretations are frustrating because they make me feel as though I’m doing a bad job of communicating what I think is important. For this, I apologize to all of you. I will try to do better.

With this backdrop in mind, I want to enumerate six beliefs that I’ll flesh out in this post, in light of discussions about how “everyone” is leaving Facebook:

  1. I do not believe that people will (or should) leave Facebook because of privacy issues.
  2. I do not believe that the tech elites who are publicly leaving Facebook will have any effect on the company’s numbers; they are unrepresentative and were not central users in the first place.
  3. I do not believe that an alternative will emerge in the next 2-5 years that will “replace” Facebook in any meaningful sense.
  4. I believe that Facebook will get regulated and I would like to see an open discussion of what this means and what form this takes.
  5. I believe that a significant minority of users are at risk because of decisions Facebook has made and I think that those of us who aren’t owe it to those who are to work through these issues.
  6. I believe that Facebook needs to start a public dialogue with users and those who are concerned ASAP (and Elliot Schrage’s Q&A doesn’t count).

As I stated in my last post, I think that Facebook plays a central role in the lives of many and I think that it is unreasonable for anyone to argue that they should “just leave” if they’re not happy. This is like saying that people should just leave their apartments if they’re not happy with their landlord or just leave their spouse because they’re not happy with a decision or just leave their job if they’re not happy with their boss. Life is more complicated than a series of simplified choices and we are always making calculated decisions, balancing costs and benefits. We stay with our jobs, apartments, and spouses even when things get messy because we hope to rectify problems. And those with the most to gain from Facebook are the least likely to leave, even if they also have the most to lose.

In the last few weeks, a handful of well known digerati have proudly announced that they’ve departed from Facebook. Most of these individuals weren’t that engaged in Facebook as users in the first place. I say this as someone who would lose very little (outside of research knowledge) from leaving. I am not a representative user. I barely share on the site for a whole host of personal and professional reasons. (And because I don’t have a life.) None of my friends would miss me if I did leave. In fact, they’d probably be grateful for the disappearance of my tweets. That means that me deciding to leave will have pretty much no impact on the network. This is true for many of the people who I’ve watched depart. At best, they’re content broadcasters. But people have other ways of consuming their broadcasting. So their departure is meaningless. These are not the people that Facebook is worried about losing.

People will not leave Facebook en masse, even if a new site were to emerge. Realistically, if that were enough, they could go to MySpace or Orkut or Friendster or Tribe. But they won’t. And not just because those sites are no longer “cool.” They won’t because they’ve invested in Facebook and they’re still hoping that Facebook will get its act together. Changing services is costly, just like moving apartments or changing jobs or breaking up in general. The deeper the relationship, the harder it is to simply walk away. And the relationship that Facebook has built with many of its users is very very very deep. When transition costs are high, people work hard to change the situation so that they don’t have to transition. This is why people are complaining, this is why they are speaking up. And it’s really important that those in power listen to what it is that people are upset about. The worst thing that those in power can do is ignore what’s going on, waiting for it to go away. This is a bad idea, not because people will walk away, but because they will look to greater authorities of power to push back. This is why Facebook’s failure to address what’s going on invites regulation.

Facebook has gotten quite accustomed to upset users. In “The Facebook Effect,” David Kirkpatrick outlines how Facebook came to expect that every little tweak would set off an internal rebellion. He documented how most of the members of the group “I AUTOMATICALLY HATE THE NEW FACEBOOK HOME PAGE” were employees of Facebook whose frustration with user rebellion was summed up by the group’s description: “I HATE CHANGE AND EVERYTHING ASSOCIATED WITH IT. I WANT EVERYTHING TO REMAIN STATIC THROUGHOUT MY ENTIRE LIFE.” Kirkpatrick quotes Zuckerberg as saying, “The biggest thing is going to be leading the user base through the changes that need to continue to happen… Whenever we roll out any major product there’s some sort of backlash.” Unfortunately, Facebook has become so numb to user complaints that it doesn’t see the different flavors of them any longer.

What’s happening around privacy is not simply user backlash. In fact, users are far less upset about what’s going on than most of us privileged techno-elites. Why? Because even with the New York Times writing article after article, most users have no idea what’s happening. I’m reminded of this every time that I sit down with someone who doesn’t run in my tech circles. And I’m reminded that they care every time I sit down and walk them through their privacy settings. The disconnect between average users and the elite is what makes this situation different, what makes this issue messier. Because the issue comes down to corporate transparency, informed consent, and choice. As long as users believe that their content is private and have no idea how public it is, they won’t take to the streets. Letting these issues fade from public view works to Facebook’s advantage. But it’s not to users’ advantage. Which is precisely why I think that it’s important that the techno-elite and the bloggers and the journalists keep covering this topic. Because it’s important that more people are aware of what’s going on. Unfortunately, of course, we also have to contend with the fact that most people being screwed don’t speak English and have no idea this conversation is even happening. Especially when privacy features are only explained in English.

In documenting Zuckerberg’s attitudes about transparency, Kirkpatrick sheds light on one of the weaknesses of his philosophy: Zuckerberg doesn’t know how to reconcile the positive (and, in his mind, inevitable) outcomes of transparency with the possible harms of surveillance. As is typical in the American tech world, most of the conversation about surveillance centers on the government. But Kirkpatrick highlights another outcome of surveillance with a throwaway example that sends shivers down my spine: “When a father in Saudi Arabia caught his daughter interacting with men on Facebook, he killed her.” This is precisely the kind of unintended consequence that motivates me to speak loudly even though I’m privileged enough to not face these risks. Statistically, death is an unlikely outcome of surveillance. But there are many other kinds of side effects that are more common and also disturbing: losing one’s job, losing one’s health insurance, losing one’s parental rights, losing one’s relationships, etc. Sometimes, these losses will be because visibility makes someone more accountable. But sometimes they will occur because of misinterpretation and/or overreaction. And the examples keep on coming.

I am all in favor of people building what they believe to be alternatives to Facebook. I even invested in Diaspora because I’m curious what will come of that system. But I don’t believe that Diaspora is a Facebook killer. I do believe that there is a potential for Diaspora to do something interesting that will play a different role in the ecosystem and I look forward to seeing what they develop. I’m also curious about the future of peer-to-peer systems in light of the move towards the cloud, but I’m not convinced that decentralization is a panacea for all of our contemporary woes. Realistically, I don’t think that most users around the globe will find a peer-to-peer solution worth the hassle. The cost/benefit analysis isn’t in their favor. I’m also genuinely afraid that a system like Diaspora will be quickly leveraged for child pornography and other problematic uses that tend to emerge when there isn’t a centralized control system. But innovation is important and I’m excited that a group of deeply passionate developers are being given a chance to see what they can pull off. And maybe it’ll be even more fabulous than we can possibly imagine, but I’d bet a lot of money that it won’t put a dent in Facebook. Alternatives aren’t the point.

Facebook has embedded itself pretty deeply into the ecosystem, into the hearts and minds of average people. They love the technology, but they’re not necessarily prepared for where the company is taking them. And while I’m all in favor of giving users the choice to embrace the opportunities and potential of being highly visible, of being a part of a transparent society, I’m not OK with throwing them off the boat just to see if they can swim. Fundamentally, my disagreement with Facebook’s approach to these matters is a philosophical one. Do I want to create more empathy, more tolerance in a global era? Of course. But I’m not convinced that sudden exposure to the world at large gets people there, and I genuinely fear the backlash that can emerge. I’m not convinced that this won’t enhance a type of extremism that is manifesting around the globe as we speak.

Screaming about the end of Facebook is futile. And I think that folks are wasting a lot of energy telling others to quit or boycott to send a message. Doing so will do no such thing. It’ll just make us technophiles look like we’re living on a different planet. Which we are. Instead, I think that we should all be working to help people understand what’s going on. I love using Reclaim Privacy to walk through privacy settings with people. While you’re helping your family and friends understand their settings, talk to them and record their stories. I want to hear average people’s stories, their fears, their passions. I want to hear what privacy means to them and why they care about it. I want to hear about the upside and downside of visibility and the challenges introduced by exposure. And I want folks inside Facebook to listen. Not because this is another user rebellion, but because Facebook’s decisions shape the dynamics of so many people’s lives. And we need to help make those voices heard.

I also want us techno-elites to think hard and deep about the role that regulation may play and what the consequences may be for all of us. In thinking about regulation, always keep Larry Lessig’s arguments in “Code” in mind. Larry argued that there are four points of regulation for all change: the market, the law, social norms, and architecture (or code). Facebook’s argument is that social norms have changed so dramatically that what they’re doing with code aligns with the people (and conveniently the market). I would argue that they’re misreading social norms but there’s no doubt that the market and code work in their favor. This is precisely why I think that law will get involved and I believe that legal regulators don’t share Facebook’s attitudes about social norms. This is not a question of if but a question of when, in what form, and at what cost. And I think that all of us who are living and breathing this space should speak up about how we think this should play out because if we just pretend like it won’t happen, not only are we fooling ourselves, but we’re missing an opportunity to shape the future.

I realize that Elliot Schrage attempted to communicate with the public through his NYTimes responses. And I believe that he failed. But I’m still confused about why Zuckerberg isn’t engaging publicly about these issues. (A letter to Robert Scoble doesn’t count.) In each major shitstorm, we eventually got a blog post from Zuckerberg outlining his views. Why haven’t we received one of those? Why is the company so silent on these matters? In inviting the users to vote on the changes to the Terms of Service, Facebook mapped out the possibility of networked engagement, of inviting passionate users to speak back and actively listening. This was a huge success for Facebook. Why aren’t they doing this now? I find the silence to be quite eerie. I cannot imagine that Facebook isn’t listening. So, Facebook, if you are listening, please start a dialogue with the public. Please be transparent if you’re asking us to be. And please start now, not when you’ve got a new set of features ready.

Regardless of how the digerati feel about Facebook, millions of average people are deeply wedded to the site. They won’t leave because the cost/benefit ratio is still in their favor. But that doesn’t mean that they aren’t suffering because of decisions being made about them and for them. What’s at stake now is not whether or not Facebook will become passe, but whether or not Facebook will become evil. I think that we owe it to the users to challenge Facebook to live up to a higher standard, regardless of what we as individuals may gain or lose from their choices. And we owe it to ourselves to make sure that everyone is informed and actively engaged in a discussion about the future of privacy. Zuckerberg is right: “Given that the world is moving towards more sharing of information, making sure that it happens in a bottom-up way, with people inputting their information themselves and having control over how their information interacts with the system, as opposed to a centralized way, through it being tracked in some surveillance system. I think it’s critical for the world.” Now, let’s hold him to it.

Update: Let me be clear… Anyone who wants to leave Facebook is more than welcome to do so. Participation is about choice. But to assume that there will be a mass departure is naive. And to assume that a personal boycott will have a huge impact is also naive. But if it’s not working for you personally, leave. And if you don’t think it’s healthy for your friends to participate, encourage them to leave too. Just don’t expect a mass exodus to fix the problems that we’re facing.

Update: Mark Zuckerberg wrote an op-ed in the Washington Post reiterating the company’s goals and saying that changes will be coming. I wish he had apologized for December or acknowledged that people were exposed or that they simply can’t turn off all that is now public. It’s not just about simplifying the available controls.


Public by Default, Private when Necessary

This post was originally written for the DML Central Blog. If you’re interested in Digital Media and Learning, you definitely want to check this blog out.

With Facebook systematically dismantling its revered privacy infrastructure, I think it’s important to drill down on the issue of privacy as it relates to teens. There’s an assumption that teens don’t care about privacy but this is completely inaccurate. Teens care deeply about privacy, but their conceptualization of what this means may not make sense in a setting where privacy settings are binary. What teens care about is the ability to control information as it flows and to have the information necessary to adjust when information flows too far or in unexpected ways. When teens argue that they produce content that is “public by default, private when necessary,” they aren’t arguing that privacy is disappearing. Instead, they are highlighting that both privacy AND publicity have value. Privacy is important in certain situations – to not offend, to share something intimate, or to exclude certain people. Yet publicity can also be super useful. It’s about being present in social situations, about chance encounters, about obtaining social status.

Once upon a time on Facebook, participants had to be vetted members of a community to even have an account. Privacy was a deeply held value and many turned to Facebook because of the ways in which it protected them from making public mistakes. This was especially core to youth participation. Parents respected Facebook’s attitudes towards privacy and, in a shocking moment of agreement, teens did too.

Slowly, things have changed. Most recently, Facebook made it possible for users to make Facebook content public (presumably to compete with Twitter). When participants signed in, they were asked whether or not they wanted to change their privacy settings. Many were confused and just clicked through, not realizing that this made their content more public than it was before. This upset some legal types and Facebook was forced to retreat, making the status quo the default instead of tricking folks into being public.

Recently, Facebook’s CEO Mark Zuckerberg made comments that amount to “the age of privacy is over” as justification for why the company has decided to get with the times and make things more public. This prompted me to rant about Facebook’s decision.

Social media has enabled new forms of publicity, structures that allow people to connect as widely as they can build an audience. Teens are embracing this to do all sorts of powerful things. But they aren’t doing so to eschew privacy. They are still keeping intimate things close to their hearts or trying to share content with narrow groups of people. It’s just that, in many situations, there is more to be gained by accepting the public default than by going out of one’s way to keep things private. And here’s where we see the shift. It used to take effort to be public. Today, it often takes effort to be private.

While Facebook has justified its decisions by citing shifts in societal expectations, they are doing a disservice to those who value Facebook precisely because of its culture of keeping things more contained. It’s not so much that posting things on Facebook was ever private; no teen sees the Wall as a private space. It’s that the default was not persistent, searchable, and scaled to a mass degree. Just because teens choose to share some content widely does not mean that they wish all content could be universally accessible. What they want is a sense of control. And what Facebook is doing is destabilizing the system in a way that complicates control, especially for teens who are most vulnerable to having content go down on their permanent record.

Twitter is for friends; Facebook is everybody

I was talking with a friend of mine today who is a senior at a technology-centered high school in California. Dylan Field and his friends are by no means representative of US teens but I always love his perspective on tech practices (in part cuz Dylan works for O’Reilly and really thinks deeply about these things). Noodling around, I asked him if many of his friends from his school used Twitter and his response is priceless:

Dylan: “as for twitter, we are totally not representative, but ya a lot of people use twitter. it’s funny because the way they are using it is not the way most do… they make private accounts and little sub-communities form. like cliques, basically. so they can post stuff they don’t want people on fb to see, since fb is everybody. it’s odd, because the way i see it get used with my friends is totally contradictory to what everyone is saying. people seem to think teens hate twitter because it’s totally public, but the converse is actually true. but it’s not everyone… probably 10-15% at most.”

As someone who has argued about the challenge of Twitter being public (to all who hold power over teens), I find this push-back to be extremely valuable. What Dylan is pointing out is that Facebook is public (to everyone who matters), while Twitter can be private because of the combination of tools AND the fact that it’s not broadly popular.

My guess is that if Twitter does take off among teens and Dylan’s friends feel pressured to let peers and parents and everyone else follow them, the same problem will arise and Twitter will become public in the same sense as Facebook. This of course raises a critical question: will teens continue to be passionate about systems that become “public” (to all that matter) simply because there’s social pressure to connect to “everyone”?