First, I commend Facebook for taking child safety seriously. When I was working with them as part of the Internet Safety Technical Task Force, I was always impressed by their concern. I think that there’s a mistaken belief that Facebook doesn’t care about child safety. This was the message that many propagated when Facebook balked at implementing the “Panic Button” in the UK. As many news articles recently reported, Facebook finally conceded last week to implementing it after enormous pressure by safety advocates. Their slowness in agreeing to do so was attributed to their lack of caring, but this is simply not true. There are actually very good reasons to be wary of the “Panic Button.” My fear is that the lack of critical conversation about the “Panic Button” will result in people thinking it’s a panacea, rather than acknowledging its limitations and failings. Furthermore, touting it as a solution obscures the actual dangers that youth face.
The “Panic Button” is actually an app called “ClickCEOP.” Users must add the app, which gives them a tab with a button they can press whenever they need to talk to the police’s Child Exploitation and Online Protection Centre (CEOP). They’re encouraged to share the badge as a way of protecting their friends.
Pressure to create the “Panic Button” came after the horrific murder of 17-year-old Ashleigh Hall by a 33-year-old serial sex offender named Peter Chapman who approached the teen on Facebook. Reports suggest that he told her he was a teenage boy, although she also knew that none of her friends knew him. She lied to her parents to leave the house to meet up with him at which point he abducted, raped, and murdered her. Why she started conversing with him is unknown. Why – after being convicted of other sex crimes against minors – he was out on the streets and not being monitored is also unclear. All that is known is that this is the kind of tragedy that constitutes every parent’s worst nightmare.
Safety advocates used Hall’s terrible death to rally for a Panic Button. But what would this have done? She was clearly willing to converse with him and had no reservations about meeting up with him. None of her friends knew she was conversing with him. Nor did her parents. The heartbreaking reality of most rape and murder cases of this type is that the teen knowingly meets up with these men. When it involves teens, it’s usually because they believe that they’re in love, value the attention, and are not even thinking about the risks. Online Panic Buttons do absolutely nothing to combat the fundamental challenge of helping youth understand why such encounters are risky.
CEOP invites people to implement the ClickCEOP tab with the following questions:
Do you sometimes struggle to find answers to things that worry you online?
Had bad wall posts from people you don’t know?
Had a chat conversation that went sour?
Seen something written about you that isn’t true, or worse?
Has your account ever been hacked, even just as a joke?
These are serious questions about serious issues, the heart of bullying. They aren’t really about predation, but that doesn’t make them any less important. That said, how can the police help every teen who is struggling with the wide range of bullying implied here, from teasing to harassment? Even if every teen in the UK were to add this app and take it seriously, the UK police don’t have a fraction of the resources needed to help teens manage challenging social dynamics. As a result, what false promises are getting made?
Many of the teens that I encounter truly need help. They need supportive people in their lives to turn to. I desperately want to see social services engage with these youth. But what I find over and over again is that social services do not have the resources to help even a fraction of the youth that come to them. So when we create a system where we tell youth that they have an outlet, and then they use it and we don’t live up to our side of the bargain, then what? Many of the teens that I interviewed told me of their efforts to report problems to teachers, guidance counselors, parents, and others, to no avail. That left them feeling more helpless and alone. What’s the risk of CEOP doing this to youth?
Finally, what’s the likelihood that kids (or adults) will click on this as a joke or just to get attention? How is CEOP going to handle the joke clicks vs. the real ones? How will they discern? One thing you learn from working with helplines is that kids often call in to talk about their friends when they’re really looking for help for themselves. It’s easier to externalize first to test the waters. CEOP may get prank messages that are actually real cries for help. What happens when those go unanswered?
The press are all reporting this as being a solution to predation, but the teens who are at-risk for dangerous sexual encounters with adults are not going to click a Panic Button because they think that they know what they’re doing. CEOP is advertising this as a tool for bullying, but it’s not clear to me that they have the resources (or, for that matter, skillset) to handle the mental health issues they’re bound to receive on that end. And users may use this for a whole host of things for which it was never designed. The result, if anyone implements it at all, could be a complete disaster.
So why do I care that another well-intentioned technology is out there and will likely result in no change? I care because we need change. We need to help at-risk youth. And a tremendous amount of effort and energy is being expended to implement something that is unlikely to work but makes everyone feel as though the problem is solved. And I fear that there will be calls in the US to implement this without anyone ever evaluating the effectiveness of such an effort in the UK. So I respect Facebook’s resistance because I do think that they fully understand that this won’t help the most needy members of their community. And I think that the hype around it lets people forget about those most at-risk.
To my friends across the pond… Please help evaluate this “solution.” Please tell us what is and is not working, what kinds of cases the CEOP receives and how they handle them. Any data you can get your hands on would be greatly appreciated. I realize that it’s too late to stop this effort, but I really hope that people are willing to treat it as an experiment that must be evaluated. Cuz goddess knows the kids need us.
Another thoughtful analysis by Danah B. Bravo. There is no technological solution to predation. The closest thing I can think of is parallel monitored accounts for minors, accessed by parents, if the parents are willing and able, which is another matter. So these would be mirrored accounts. But now we have children at very early ages who far outpace their parents in technical knowledge of online media. What to do? There are heuristic methods to detect certain inappropriate exchanges and language markers, but… oy! An overriding sentinel application could be designed to monitor all of a minor’s activity and step in only when it resolves the identity of the incoming contact, its intent, and which markers in the conversation indicate risk. I believe companies like Attensity have configurable clientware like this, or it could be server-based.
I still think that although there will be much tech fired at the target, it will remain an elusive solution as long as children at risk are so tech savvy, or just savvy enough to make an errant connection to a dangerous monster.
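The “language markers” idea above can be sketched as a simple rule-based filter. To be clear, this is only an illustrative sketch: the marker categories, regular expressions, and function names below are invented for the example, and any real sentinel application would rely on trained classifiers and human review, not a hand-picked word list.

```python
import re

# Hypothetical risk-marker categories with example patterns.
# These are invented for illustration, not a vetted lexicon.
RISK_MARKERS = {
    "asks_for_secrecy": [r"\bdon'?t tell (your|any)one\b", r"\bour (little )?secret\b"],
    "requests_meeting": [r"\bmeet (me|up)\b", r"\bcome over\b"],
    "asks_age":         [r"\bhow old are you\b", r"\basl\b"],
}

def flag_message(text):
    """Return the list of marker categories matched by one message."""
    lowered = text.lower()
    return [
        category
        for category, patterns in RISK_MARKERS.items()
        if any(re.search(p, lowered) for p in patterns)
    ]

def should_alert(conversation, threshold=2):
    """Alert a parent or carer once distinct marker categories
    accumulated across a whole conversation reach a threshold."""
    seen = set()
    for message in conversation:
        seen.update(flag_message(message))
    return len(seen) >= threshold
```

Counting distinct categories across a conversation, rather than alerting on any single match, is one crude way to cut down on false alarms; it also shows why such heuristics stay brittle, since a savvy child or predator can simply avoid the listed phrasings.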
Thank you, danah. Your work and voice in this area are so crucial. Congrats, also, on Fortune naming you Smartest (Tech) Academic http://bit.ly/d7xlbR It’s a well-deserved and earned title.
REALLY interesting blog, as usual. I believe that CEOP is run by the police alongside the NSPCC. As far as their capacity to handle some of the arising issues goes, they are, in theory at least, well equipped.
Of course there are many other things raised here about an overwhelming issue and many things that, as you say, are worth evaluation. Thanks again for your work.
I applaud your comments Danah.
My view has always been that the CEOP button is useful for those who receive a solicitation of some kind and are frightened by it. That is its primary function, with a secondary one being deterrence merely by existing. Unfortunately, as you have quite correctly pointed out, only those who choose to use it (i.e. have the awareness to be frightened) can be protected by it. When it is used correctly it is a very powerful tool, and CEOP do point to many successes resulting from interventions they have made after being alerted.
The real answer to this problem lies in a blend of knowledge and tools. Parents and carers need to take the primary role in safeguarding their children. They can only do this by knowing about the dangers – what they are, how to recognise them, how to mitigate them and as a last recourse how to deal with situations where the danger has become real. This knowledge is not readily available to them so they start with one hand tied.
The reason parents and carers have the primary role is that while their children are very often technically savvy they are normally not socially aware enough to recognise dangers without guidance. This social awareness only comes with age and cannot be taught at a pace beyond the mental development of the child or teen.
The CEOP button cannot replace awareness, but it can be of immense value once education and an ongoing open dialogue between parent/carer and child have made the child aware enough to feel uncomfortable with whatever is being said or suggested.
So is the CEOP button a global cure? Absolutely not, but it does have a role in the overall scheme of things, as long as it is seen as part of that blended solution and not as the solution.
Thank you for saying (among other things) “HE abducted, raped, and murdered her.”
There are no good solutions that socnets are offering to this issue, just reluctance or active opposition. Instead of actively looking at solutions and even trying some pilots, the largest websites in the world fail to do anything to protect at-risk, or even not-at-risk, kids. Therefore CEOP has to introduce a button app. Yes, it should address unwarranted sexual interaction, and actually help educate teens that just because people friend you, it does not mean that they are your friends. Second, sites can establish some type of parental or adult authorization for teen accounts, plus parental controls to enable oversight of children’s activities. There is a site called Togetherville that does this fairly well: it requires that an “adult” on Facebook authorize an account. Facebook could do this themselves. Of course, Facebook should actually check that the authorizing account belongs to an adult who is not on a sex offender list, but that is another issue.
So the CEOP button exists, and while it is not perfect and is hard to manage, it is too soon to tell whether it is valuable. But if it deters unwarranted sexual interaction between adults and teens, because a teen or parent reports the activity or the predator decides to take his activity elsewhere, it has positive value.
Great post indeed: very deep but sensitive, like all the analysis you give every time.
Can I just ask a technical question? When you say:
“Why she started conversing with him is unknown.” I wonder if the police, as far as you know, can get access after such a crime to the logs of the events that occurred between murderer and victim (chat logs, etc.), and whether such logs could help shed light on the matter. What is your experience with this?
I ask because it may sound obvious, but it’s something I simply don’t know.
The thing to make clear is that it’s not a “panic button”.
The CEOP Report button leads to a signposting page for children looking for advice about issues facing them online. Much of what is behind the button is advisory and directs children to resources dealing with viruses, bullying, mobile phones and other issues.
If you have been the subject of inappropriate sexual activity online then this is where CEOP steps in. In terms of dealing with the workflow, CEOP grades each case that is logged and follows up every one, including hoax messages. As a member of the Virtual Global Taskforce (VGT) CEOP can call upon the members, which include Interpol and others to share the load and disseminate worldwide very quickly.
If we could dispose of the media handle of “panic” and realise that it’s a helpful resource with a reporting tool that can be used if necessary then this might make it easier for people to understand why it’s there. You may well have a serious issue that you want to report, but it’s just as much about giving advice to young people.
Finally, CEOP is about educating children, not saying “don’t do it”, but just saying “we know you are going to do it whatever we say, so do it safely, and here’s how”.
Take a look at the Report Button’s destination for yourself at http://www.ceop.gov.uk/reportabuse
From my perspective, we need an army of social workers, therapists, and volunteer 20-somethings dedicated to walking the digital streets. In most cases, we don’t need law enforcement but we do need social services. We need infrastructure that is set up to support that. But this isn’t the kind of thing that a tech company should implement for a whole host of reasons, starting with the fact that engineers should not operate as therapists. I’ve been begging for social services to put in place a reasonable program that would allow them to do meaningful outreach if given access. I’m confident that the social network sites would be more than willing to help social services do their jobs better, just as they work with law enforcement in missing children’s cases and with other youth-oriented legal cases. But whenever I approach social services, I’m told that there aren’t enough resources and there’s no interest in doing things online because there are enough problems in the “real” world. The problems that we’re facing online are an extension of the problems that we face in everyday life. It’s just that online it’s a whole lot more visible. If we focus on the technology, we’ll miss the bigger picture. And we do that over and over and over and over again.
It is typical of law enforcement and well-meaning safety officers to advocate measures like the ClickCEOP app. We will never know whether this app is effective; there is no independent oversight or audit.
Why did Ashleigh Hall not use the App on MSN? Would the FB app have made a difference?
Familial abuse and road accidents figure prominently in child fatalities, yet I cannot see much being done here.
Blaming FB is like:
Mr Gamble: should we not ask parents of children below the FB TOS age limit to take some responsibility?
I think the way the button is framed to users could make a big difference to its success. Currently, the way it’s framed is not usable and is not going to entice youth to participate. In the end, it’s just a button. As Mark says, what’s important is to teach people skills to be safe. So the goal is to make youth wary of the fact that people misrepresent themselves all the time; the sooner we can teach youth to be critical, the better. I don’t see why education, in this case a “help” button, cannot help youth in this situation, especially if the button is framed in a way that is usable and accessible enough to help youth figure out whether they are in danger. Ideally, every person under the age of 18 would automatically get a help button, and it could connect to a live person for any privacy or safety concern they might have. Earning youth’s trust is necessary, and framing the button as “police” is not very inviting. Currently, the button is framed to appeal to adults’ views of the questions youth should be asking, not the actual concerns youth might have. Craigslist now shows warnings about all types of scams in relevant sections. Of course that’s an adult audience, but I think these “Surgeon General warnings,” or whatever you want to call them, which alert consumers in short factual sentences, can be effective.
Hi Danah, interesting and insightful write-up! Here is one I wrote a few days back.
Keep up the great work 🙂 ~Laura
ps. For me it’s not that Facebook as a corporate entity does or does not “care”; it’s about the release of relevant data.
Have just read Larry Magid’s thoughtful take but I am still inclined to follow much of Danah’s thinking. The best way to articulate my sense of why I think the App is not about empowerment/education is this:
What does the app teach users about online safety? Nothing. What it teaches us is still fear and mistrust. Given the public dispute between CEOP and FB, a victim approach to online safety tends to inform much of the debate.
The pressure to add this button to social networking sites is fuelled by the culture of fear created by the media around online predators. This is counterproductive. First, it’s not the most effective way to deal with the issues of grooming and predators. Second, it deflects from other key problems such as bullying. As chief of safety at weeworld.com, I have been reviewing the CEOP button and its pros and cons for some time. WeeWorld’s biggest audience is in the US, closely followed by the UK. I’m based in the UK and have been trying to work with CEOP. We need to marry its expertise with the knowledge and experiences of those of us running sites and working within this environment. Only then can we start to address the issues in an effective way. I have a meeting with CEOP next month and am looking forward to making some progress.
Even though I don’t want to belittle the tragic cases of Ashleigh Hall and others, I fear that this topic receives far more media attention than is warranted (how many millions of social networking users are there, and how many incidents like this have happened?).
This panic button is, in my opinion, just another PR stunt pulled by FB to counter some of the negative publicity. I really don’t think it will help prevent such tragedies in the future.
I think the real debate (or should I say, the real intelligent debate, one that we have yet to have) is this: what risks should kids be permitted to take as part of their personal development and well-being? Note children’s playgrounds, and the idea that playgrounds should be supervised and monitored spaces (i.e. safe play).
We are doing to the digital playground what we have done in analogue spaces.
Strangers are scary but abuse comes mainly from relatives and friends.