Over at Wired, Annalee Newitz’s post entitled MySpace + SecondLife / Ponies!1 = BarbieGirls describes one of the scariest side effects of all of the predator panic. A new site called BarbieGirls has launched for young girls to socialize with other young girls. To handle parental concerns, the site informs parents:
We also monitor chat to help ensure it stays safe and appropriate. Barbie Girls administrators frequently review reports of chatting in the environment and adjust the word filters as needed to block or allow new words or phrases. This monitoring is strictly for the purpose of maintaining a safe chat environment – chat reports are not used in any other way, and we do not save or store any private information.
What does it mean that an entire generation is growing up to believe that the only way to be safe is to be constantly surveilled? ::shudder:: I’m rather concerned about the long-term implications of all of this monitoring and control. Aren’t we supposed to be raising a generation of creatives? Le sigh.
Hey, Danah! Yahoooooooo, I love this blog.
Wouldn’t it be great to believe that kids under 13 could control themselves and self-discriminate? Oh, uber-sigh.
I actually work with kids online, aged 9-13, from a brand perspective. Aside from the need to keep a clean, energetic community (for biz reasons), there are certain social interactions that kids don’t quite understand. Sure, word filters can help keep the language appropriate. My job is to help ensure that the community stays healthy and positive and safe. You wouldn’t necessarily leave a classroom of 4th graders alone in a room with nothing but a video camera, ya know?
Also, there is a big difference between the social capabilities of a 9 year old and a 12 year old… literal versus sarcasm, etc… these are things that can make or break a young community. It’s the whole “take my marbles and go” mentality that we want to avoid– especially for the child’s benefit.
Kids are wonderfully impulsive and reactive and constantly exploring limitations… and they definitely don’t realize that their web-actions can have negative reactions. Sure, they hear it in class, but it hasn’t been actualized yet.
Not to mention, nearly every day I am removing personal information posted by kids eager to join friend-cliques with other kids– and the older kids (12, 13) are sneeeeaky. And despite the fact I have to be on guard for their sneakiness… MAN, does it crack me up when they try. And ooooh how they try. 😉
I talk with parents and teachers all the time, and I’m always trying to explain the importance of good netizenship and safeguarding personal information. Parents seem to go directly to their own comp. filters to “save the day,” and not really talk directly to the source of the issue– their kids. In your opinion, what are some things I can share with people about web-education for the younger tater tots?
Barbie Girls caters to the COPPA-required 12-and-under demographic. It’s no surprise that they want to keep it clean; COPPA mandates that 12-and-unders get parental permission to sign up for these services, and no parent in their right mind would allow their 8-year-old to sign up if the site allowed bad language and the like (that is, if the parent even knew their child was signing up for such a site).
But I think you’re asking the wrong question. I would say the right question is, “Why did the creators of this site choose this moderation policy?” Every site should have a moderation policy that’s appropriate (and wanted) by their users. For example, Habbo Hotel turns every bad word into “bobba” and if you type in too many, you’ll get a nastygram from their bots warning you. World of Warcraft lets the user choose whether or not they want the filtering. This has nothing to do with teaching kids to evade filters, stunting their creativity, or babysitting them online; this has everything to do with making a site that kicks ass for them.
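The substitution approach described above, where every flagged word becomes “bobba” and too many hits trigger a warning, can be sketched as a simple blacklist filter. This is a hypothetical illustration, not Habbo Hotel’s actual implementation; the word list and the `limit` threshold are made-up stand-ins:

```python
import re

# Hypothetical blacklist -- a real site's list would be far longer and private.
BLOCKED = {"darn", "heck", "stupid"}

def bobba_filter(message: str, limit: int = 3) -> tuple[str, bool]:
    """Replace each blocked word with 'bobba'; flag the user if too many appear."""
    hits = 0

    def replace(match: re.Match) -> str:
        nonlocal hits
        if match.group(0).lower() in BLOCKED:
            hits += 1
            return "bobba"
        return match.group(0)

    filtered = re.sub(r"[a-zA-Z]+", replace, message)
    return filtered, hits >= limit  # True -> time to send the warning "nastygram"

print(bobba_filter("you are stupid, darn it"))
# → ('you are bobba, bobba it', False)
```

The design choice worth noting is the second return value: substitution and enforcement are separate concerns, so the site can tune how chatty the warning bots are without touching the word list.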
Speaking for myself and the site I work on, I’m not thinking about the impact on a generation when considering moderation and filtering policies. I am listening to my users. Some of them are not happy with (what they perceive as) bad words or offensive content, and it’s my responsibility to make sure they’re comfortable on my site. If that means creating a piece of software to act as their personal censor, I’ll consider it if it will make my users happy. Doing anything otherwise would be bad for my users and bad for my company.
I find it incredibly naive that, after 10 years of language filtering failures, we still have companies believing that we can control darker emotional outbursts with regular expressions.
If you walk into any online game lobby it doesn’t take long to see the creativity of foul mouthed players. Spellings are changed. Spaces are inserted. Language ITSELF changes in order to avoid censorship and convey the intended meaning.
Attempting to filter these tendencies superficially reassures parents and clears them of any legal ramifications. But it is foolish to think that an authoritarian approach to communication will insulate children from that which they weren’t meant to hear.
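The evasion tactics described above (changed spellings, inserted spaces) are trivial to demonstrate: a naive regex blacklist catches only the exact spelling it was given. The blocked term here is a hypothetical stand-in:

```python
import re

# Hypothetical blocked term; real filters carry long lists of patterns like this.
pattern = re.compile(r"\bdarn\b", re.IGNORECASE)

messages = [
    "darn you",      # exact spelling -- caught by the filter
    "d a r n you",   # spaces inserted -- slips through
    "daaarn you",    # spelling changed -- slips through
    "d4rn you",      # digit substitution -- slips through
]

for msg in messages:
    caught = bool(pattern.search(msg))
    print(f"{msg!r}: {'blocked' if caught else 'passes'}")
```

Only the first message is blocked; the other three sail through, which is exactly the arms race the commenter is pointing at.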
> For example, Habbo Hotel turns every bad word into “bobba” and if you type in too many, you’ll get a nastygram from their bots warning you.
there’s also ‘Hobbas’ wandering around which are volunteer kids who agree to look out for others. I much prefer this system to the Barbie one – it shows that kids can look after kids rather than having to rely on adults…
When will we learn that the responsibility for protecting children does not belong to the government? It does not belong in the hands of private companies, either. It rests squarely on the shoulders of the parents. Put the damned computer in a place where you will see your child using it, watch and ask questions about where they go and who they talk to. Taking freedoms away from children and teaching them that it is OK to have your privacy invaded for the sake of a fleeting sense of safety is only going to lead to disaster.
There is no magic bullet replacement for responsible parenting…
I second that sigh.
~Jason
This is an interesting debate. My first thought when reading the post was “well, thank goodness they are monitoring it, but who cares about swear words?” My worry is more about the kinds of people who would be attracted to a site full of innocent little girls chatting.
When you talk about a generation of children growing up to think that it’s normal to be surveilled then that’s another, scary, proposition.
I absolutely agree that the responsibility lies with parents, but what do we do about the fact that a lot of parents simply will not monitor their children on the internet?
Here’s the thing… yes, it would be fantastic if parents really took the bull by the horns and helped raise their children to be uber-computer literate.
From a Brand perspective (bear with me now 🙂 ) it gets a little sticky. Wouldn’t you want to trust that the sites your child constantly surfs are doing their best to keep a safe environment? Media responsibility and all that jazz.
My dad is in Insurance and I grew up with dinner-table stories about Mr. “Man” getting sued because a neighborhood child fell on his driveway and broke his arm, or Ms. “Lady” having a child on a bike ride into her PARKED & empty car and having to pay for that child’s hospital bills. We keep talking about how the web is this new world. Its consequences can be just as dire as those IRL. As a brand, wouldn’t you rather just do your best to ensure that little mishaps won’t happen?
Aside from COPPA and bad press, I think it is commendable for companies to help keep their young community safe. Might as well monitor & stop the problems before they happen and not wait for problems to happen and scramble to fix them, right?
Besides– isn’t that part of being a good community? Trying to look out for the less experienced?
Bottom line is this: Yes. Parents need to take a LOT more responsibility for the web-wanderings of their kids. But at least others are taking responsibility for their own sites and ensuring safe, fun environments.
Anyway… just food for thought from someone who is in the trenches every day, and seeing the things tweens write every day 🙂
Just a small reminder that we are discussing the Barbiegirls site. A brand-centric site aimed at 6-9 year old girls. Not only is this demographic more comfortable (both themselves and their parents) in protected environments online, but the brand has certain corporate liabilities to protect.
If we were talking about a fan created site for even older tweens, I would be open to discussing the watch-from-the-sidelines approach. But this scenario, specifically, calls for a more hands-on approach to community management.
We need to think critically about the idea of corporate responsibility (and not just the big brother parts of it). We also need to consider that these companies are creating public spaces, with many of the same sorts of issues that offline public spaces have. Hopefully the companies putting up these sorts of areas have thought through this as part of the community strategy (no idea if Mattel/Barbie has).
Mentored (not just policed/monitored) environments for young children are what people seek offline for their kids. Why doesn’t it make sense that they can exist online too? If the corporations are leading the way (and their motivations are true and good), then this is a thing that should be applauded, not frowned upon. Hopefully, this is what Barbiegirls meant in their disclaimer.
And if the companies need help making sure they have the right intentions, then it’s up to us, the web/community professionals, to help them.
Thanks, Danah, for inspiring this dialogue. 🙂 Joi
(And thank you Joi – i’m enjoying the dialogue myself.)
Perhaps then we should not talk about it as a “private” conversation with “best friends only”? I’m not saying that a free-for-all is the answer… i’m saying that the way that this is constructed is surveillance of purportedly private spaces. And that has long term consequences.
And here’s how they do it over at Nicktropolis. The default chat mode uses pre-written messages only. (This is an innovation even beyond the scope of George Orwell’s “Newspeak” – or at least the logical endpoint of it.) Parents can request an activation code to allow their child access to the “sanitized dictionary” mode in which they can type their own messages subject to a filtered vocabulary.
Note, however, that there appears to be no validation to ensure that you really are the parent. The following is an excerpt from the email response accompanying the activation code.
**************************************************
One of the most exciting features of Nicktropolis is that kids can actually communicate safely with each other in real time. Right now, your child is able to choose from a list of messages pre-written by Nickelodeon to securely chat with other kids who have NickNames registered in Nicktropolis.
By using the activation code below, you can enable your child to communicate in Nicktropolis using Nick.com’s sanitized dictionary specifically screened by Nick.com. To do this, just visit Nicktropolis with your child. Once he or she logs in to Nicktropolis, you can create a parent account attached to your child’s Nicktropolis account. There YOU can choose your child’s level of safe, dictionary chat security.
**************************************************
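The Nicktropolis scheme described in the excerpt is a whitelist rather than a blacklist: a message is allowed only if every word appears in the approved dictionary. A minimal sketch, using a hypothetical dictionary (the real “sanitized dictionary” is Nick.com’s and is not public):

```python
# Hypothetical approved dictionary -- a tiny stand-in for the real one.
APPROVED = {"hi", "hello", "want", "to", "play", "a", "game", "yes", "no"}

def dictionary_chat(message: str) -> bool:
    """Allow the message only if every word is in the approved dictionary."""
    words = message.lower().split()
    return bool(words) and all(w in APPROVED for w in words)

print(dictionary_chat("want to play a game"))   # True -- every word is approved
print(dictionary_chat("want to meet offline"))  # False -- 'meet', 'offline' not approved
```

The trade-off is the mirror image of a blacklist: nothing unanticipated can get through, but expressiveness suffers, which is exactly the creativity concern raised at the top of this thread.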
why when im shopping i cant buy other items or pets like wigs adoppting i cant do it pls help me figure out how we do these!!! i want to buy!!!
Congressman Sam Albert: [On TV] We knew that we had to monitor our enemies. We’ve also come to realise that we need to monitor the people who are monitoring them…
Carla Dean: Well who’s gonna monitor the monitors of the monitors.
Robert Clayton Dean: I wouldn’t mind doing a little monitoring myself.
Carla Dean: Yes, and you’ve got lots and lots of monitoring to do.
Eric Dean: Are you guys talking about sex?
http://www.imdb.com/title/tt0120660/quotes#qt0171269
sorry, I just couldn’t resist quoting my fav movie 😉