Sunday, April 17, 2016

Sunday Long Read: Everything In Moderation

This week's Sunday Long Read is this piece from The Verge on the history of moderating content on the Internet's public forums in the Wild West days of the web, that ancient period of history known as "The Oughts" — places like YouTube and its SQUAD (Safety, Quality, and User Advocacy Department).

Launched in 2005, YouTube was the brainchild of Chad Hurley, Steve Chen, and Jawed Karim—three men in their 20s who were frustrated because technically there was no easy way for them to share two particularly compelling videos: clips of the 2004 tsunami that had devastated southeast Asia, and Janet Jackson’s Superbowl "wardrobe malfunction." In April of 2005, they tested their first upload. By October, they had posted their first one million-view hit: Brazilian soccer phenom Ronaldinho trying out a pair of gold cleats. A year later, Google paid an unprecedented $1.65 billion to buy the site. Mora-Blanco got a title: content policy strategist, or in her words, "middle man." Sitting between the front lines and content policy, she handled all escalations from the front-line moderators, coordinating with YouTube’s policy analyst. By mid-2006, YouTube viewers were watching more than 100 million videos a day.

In its earliest days, YouTube attracted a small group of people who mostly shared videos of family and friends. But as volume on the site exploded, so did the range of content: clips of commercial films and music videos were being uploaded, as well as huge volumes of amateur and professional pornography. (Even today, the latter eclipses every other category of violating content.) Videos of child abuse, beatings, and animal cruelty followed. By late 2007, YouTube had codified its commitment to respecting copyright law through the creation of a Content Verification Program. But screening malicious content would prove to be far more complex, and required intensive human labor.

Sometimes exhausted, sometimes elated, and always under intense pressure, the SQUAD reviewed all of YouTube’s flagged content, developing standards as they went. They followed a guiding-light question: "Can I share this video with my family?" For the most part, they worked independently, debating and arguing among themselves; on particularly controversial issues, strategists like Mora-Blanco conferred with YouTube’s founders. In the process, they drew up some of the earliest outlines for what was fast becoming a new field of work, an industry that had never before been systematized or scaled: professional moderation.

By fall 2006, working with data and video illustrations from the SQUAD, YouTube’s lawyer, head of policy, and head of support created the company’s first booklet of rules for the team, which, Mora-Blanco recalls, was only about six pages long. Like the one-pager that preceded it, copies of the booklet sat on the table and were constantly marked up, then updated with new bullet points every few weeks or so. No booklet could ever be complete, no policy definitive. This small team of improvisers had yet to grasp that they were helping to develop new global standards for free speech.

In 2007, the SQUAD helped create YouTube’s first clearly articulated rules for users. They barred depictions of pornography, criminal acts, gratuitous violence, threats, spam, and hate speech. But significant gaps in the guidelines remained — gaps that would challenge users as well as the moderators. The Google press office, which now handles YouTube communications, did not agree to an interview after multiple requests.

As YouTube grew up, so did the videos uploaded to it: the platform became an increasingly important host for newsworthy video. For members of the SQUAD, none of whom had significant journalism experience, this sparked a series of new decisions.

In the summer of 2009, Iranian protesters poured into the streets, disputing the presidential victory of Mahmoud Ahmadinejad. Dubbed the Green Movement, it was one of the most significant political events in the country’s post-Revolutionary history. Mora-Blanco, soon to become a senior content specialist, and her team — now dubbed Policy and more than two-dozen strong — monitored the many protest clips being uploaded to YouTube.

On June 20th, the team was confronted with a video depicting the death of a young woman named Neda Agha-Soltan. The 26-year-old had been struck by a single bullet to the chest during demonstrations against pro-government forces and a shaky cell-phone video captured her horrific last moments: in it, blood pours from her eyes, pooling beneath her.

Within hours of the video’s upload, it became a focal point for Mora-Blanco and her team. As she recalls, the guidelines they’d developed offered no clear directives regarding what constituted newsworthiness or what, in essence, constituted ethical journalism involving graphic content and the depiction of death. But she knew the video had political significance and was aware that their decision would contribute to its relevance.

Mora-Blanco and her colleagues ultimately agreed to keep the video up. It was fueling important conversations about free speech and human rights on a global scale and was quickly turning into a viral symbol of the movement. It had tremendous political power. And the clip was already available elsewhere, driving massive traffic to competing platforms.

The Policy team worked quickly with the legal department to relax its gratuitous violence policy, on the fly creating a newsworthiness exemption. An engineer swiftly designed a button warning that the content contained graphic violence — a content violation under normal circumstances — and her team made the video available behind it, where it still sits today. Hundreds of thousands of individuals, in Iran and around the world, could witness the brutal death of a pro-democracy protester at the hands of government. The maneuvers that allowed the content to stand took less than a day.

YouTube's SQUAD set the ground rules for content on the website, and when Google bought YouTube, those rules became the ground rules for the Internet itself. The people involved in SQUAD had to react daily to new challenges and new content. The people who moderate that content today now have immense power over global information that could potentially be shared with billions...or censored for billions, sometimes regardless of national laws.

These folks have a lot of power, far more than we've ever seen in the Information Age.

So who moderates them?
