
Who takes responsibility for web content?

A 19-year-old living in Florida who went by the screen name CandyJunkie overdosed on pills while broadcasting himself in real time. The webcast went on for hours on Justin.tv before viewers became alarmed and informed the police; it ended with officers barging into the room.

This story is deeply disturbing, not only because of its content, but also because it brings us back to the big question of whether anyone should be held responsible for content posted on websites.

With increased bandwidth and higher broadband penetration, more amateur videos and live video streams are appearing on the Internet. One wonders, however, how the industry will deal with content depicting crime, violence, sex, and other material potentially harmful to certain audiences, such as children.

Justin.tv relies on a user-regulatory system in which viewers can flag questionable content. It also lists rules in its terms of use, including a ban on content that a “reasonable person could deem to be objectionable, offensive, indecent, pornographic, invasive of another’s privacy, harassing, threatening, embarrassing, distressing, vulgar, hateful, racially or ethnically offensive, or otherwise inappropriate.”
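To make the idea concrete, here is a minimal sketch of how such a user-flagging system could work: each flag is recorded, and once enough distinct viewers flag the same stream it is escalated to a human moderator. This is purely illustrative; the threshold, identifiers, and review hand-off are all hypothetical, not Justin.tv’s actual (non-public) implementation.

from collections import defaultdict

# Hypothetical sketch of a user-flagging system. The threshold, IDs, and
# review hand-off are invented for illustration; Justin.tv's actual
# implementation is not public.
REVIEW_THRESHOLD = 5  # distinct viewers who must flag before human review

flags = defaultdict(dict)  # stream_id -> {user_id: reason}
review_queue = []          # streams awaiting a human moderator

def flag_content(stream_id: str, user_id: str, reason: str) -> None:
    """Record one viewer's flag; escalate once enough distinct viewers agree."""
    flags[stream_id][user_id] = reason
    if len(flags[stream_id]) >= REVIEW_THRESHOLD and stream_id not in review_queue:
        review_queue.append(stream_id)  # a moderator makes the final call

# Example: five different viewers flag the same live stream.
for viewer in ("u1", "u2", "u3", "u4", "u5"):
    flag_content("stream-42", viewer, "distressing content")

print(review_queue)  # ['stream-42']

Counting distinct viewers rather than raw flags limits the damage one person can do by flagging repeatedly; the tradeoff, as the CandyJunkie case shows, is that escalation happens only as fast as viewers choose to act.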

Not all websites, however, have their own rules for dealing with content generated by outside users. In Zeran v. America Online, for example, the court found that AOL had no liability, under Section 230 of the Communications Decency Act, for false, defamatory content posted on its service by a third party. In fact, AOL didn’t even bother to remove the postings for a long time (even though an innocent man was receiving death threats because of them) until the media exposed the story.

Certainly, different countries take different approaches to these issues; some place stronger responsibility on websites. In Italy, for example, Google employees may face charges for “failing to stop the publishing” of a video showing a disabled teenager being bullied. In South Korea, the government orders services to take down content that could “threaten national security.”

But how can anyone determine how detrimental a piece of content is, and should they? Visual content has so far been regulated mostly by the industry itself, such as through the movie rating system adopted by the Motion Picture Association. While legal experts work out how these rules apply in cyberspace, we could at least encourage self-regulation in the industry. The law may not require Internet services and service providers to take responsibility, but the public can take on a stronger role. In the case of the 19-year-old, for instance, if viewers had disapproved of the video and alerted authorities a couple of hours earlier, he might still be alive.
