One thing I’ve always admired about Yahoo is that they offer users a wide variety of tools to create online communities but push the management and ownership responsibilities out to those who use those tools. Not only does this reduce the management costs and, potentially, the legal responsibilities of Yahoo, it also gives users a sense of ownership over the communities they build using Yahoo’s tools.
Given the apparent "give users tools and they’ll use them" ethos of Yahoo, it comes as a bit of a surprise to find Reuters reporting that Yahoo will no longer be offering user created chat rooms "amid concerns that adults were using the sites to try to have sex with minors."
"The user-created chat rooms in question, where Internet users converse in real time, had names including “Girls 13 And Under For Older Guys” and “Girls 13 And Up For Much Older Men” and were all listed under “education chat rooms,” Houston television station KPRC reported…KPRC reported last month that major advertisers including PepsiCo Inc., Georgia-Pacific Corp. and State Farm Mutual Automobile Insurance Co. removed their ads after the station found the ads were appearing on Yahoo user-created chat rooms that were aimed at sex with children. “As soon as we found out we pulled our ads,” said Pepsi spokesman Dave DeCecco. “We were totally unaware our ads were associated with those chat rooms — and that was back in April.” [whole article]
The move by Yahoo comes after a $10 million lawsuit was filed against the company last month on behalf of a 12-year-old molestation victim, and following a long campaign by watchdog groups to persuade Yahoo and other large Internet portals to purge their sites of child porn. [whole article]
Yahoo’s announcement follows a similar, and much criticised, move by MSN to close its chat room service several months ago. Although MSN announced the closure on child safety grounds, many critics felt the closure was more likely to have been driven by financial considerations.
I’ve run the 1800+ member Cybersociology list for years on the Yahoo Groups platform with no problems at all, and have appreciated being able to manage the list myself: the way it’s presented to users, its format, and so on. However, as an online community professional, I also understand the need, both from a legal and brand protection standpoint and for the protection of users, for Yahoo and other providers of community tools to do something. So what’s the answer?
When a child becomes the victim of a paedophile in a park, we don’t hear people campaigning for all parks to be closed. Similarly, we often hear politicians and the media complaining that the internet is making it easier for terrorist and criminal organisations to organise, but we never hear them calling for the end of all postal mail, or for shutting down the transportation or telephone networks, for "security reasons".
Likewise, closing all chat rooms isn’t the answer. Chat rooms can, and do, provide a social space for people, young and old, to make friends, build communities, work on projects together, and have fun. I do believe that moderation should be a legal and moral necessity for websites targeted at, or likely to attract, children, but it is resource intensive and expensive. The only practical way to ensure that children stay safe when they’re online is for parents to realise that the internet is a public place, and that they have to take responsibility themselves for monitoring their children’s online activity and for educating their children about the potential dangers of meeting people online. Schools, governments, non-profit organisations, and anyone offering unmoderated online spaces should also be working to educate users, particularly parents and children. The BBC’s Chat Guide, a website backed up by materials including a video for teachers to use in their classrooms, is a good example of a chat safety education effort. Other useful advice can be found at the following websites:
Related Cybersoc Entries: