The April 3 shooting at YouTube shines a macabre spotlight on the debate that's ripped through the online video giant for the past year.
Nasim Aghdam, a 39-year-old woman from San Diego who made YouTube videos, arrived at the company's headquarters in San Bruno, California, around lunchtime with a 9-millimeter Smith and Wesson handgun. She opened fire, injuring at least three, before killing herself. Police said they believed she was “upset with the policies and practices of YouTube.”
Aghdam was not a mainstream YouTube creator. In her videos, which YouTube removed on Tuesday, she danced bizarrely, touted vegan activism and occasionally posted graphic scenes of animal cruelty. But Aghdam's frustration with YouTube was more conventional. “YouTube filtered my channels to keep them from getting views!” a website linked to her reads. “There is no equal growth opportunity on YOUTUBE or any other video sharing site.”
Dozens of prominent creators have made similar complaints.
This is part of a broader debate over the power of large Internet companies and their role in spreading fake news, conspiracy theories and hate. YouTube, part of Alphabet Inc's Google, has been wrestling with how to make the site safer for users and advertisers without damaging the careers of its creators or censoring free speech.
YouTube executives have struggled with this balancing act for years, but the challenge has become particularly acute in the past year. YouTube began a steady overhaul of its content policies in 2017 after vocal criticism from advertisers and the public. Changes included stripping ads from some videos.
In response, several video creators complained that the company's poorly communicated updates had limited their audiences, slashed their advertising revenue and threatened their livelihoods.
Part of the problem is that YouTube has never clearly articulated its policy changes, according to Jonathan Albright, a researcher at the Tow Center for Digital Journalism at Columbia University. “It's always a response. It's never really proactive,” he said. “This is an example that shows a direct link between opaque policies and data transparency, with these giant platforms, into people dying.”
A YouTube spokeswoman did not respond to multiple requests for comment. YouTube chief executive officer Susan Wojcicki expressed horror after the shooting. “Our hearts go out to all those injured & impacted today. We will come together to heal as a family,” she wrote on Twitter Tuesday.
Beginning early last year, marketers paused spending on the video site after ads ran in front of racist and extremist videos. YouTube first responded to advertisers' alerts by stripping ads from hundreds of thousands of channels, costing some creators as much as 80% of their monthly earnings. This wave of “demonetisation,” as it became known, angered and confused many in the community. Popular YouTubers Phil DeFranco and Casey Neistat sought to quell that anxiety and urged creators to explore alternative sources of revenue.
Then a steady drumbeat of YouTube scandals – juvenile antics from prominent creators; grotesque videos aimed at children; conspiracy theories surfacing around news – prompted YouTube to clamp down further. In January, YouTube rolled out its most restrictive policies to date. To make money off ads, creators must now have at least 1,000 subscribers and 4,000 hours of watch time over the previous year. At the time, the company said that 99% of the channels affected earned less than US$100 a year from YouTube ads.
YouTube has tried to explain that the changes are intended to protect the majority of creators from the misbehaviour of a few. That did not quiet the dissenters. Small creators organised a “demonetisation day” protest when many of the changes were rolled out.
“We had dreams of being something on YouTube, and it was definitely easier to do when we were monetized,” said a creator who posts under the name “PeachLoveHappiness.” Her channel has 427 subscribers, below the new threshold of 1,000.
The only way she could monetise again would be to generate more than 4,000 watch hours. Her 15-minute clip has fewer than 2,000 views so far. – Bloomberg