In December 2017, Logan Paul, a well-known vlogger, uploaded a video to YouTube that showed an apparent suicide victim’s body. Ten days later, YouTube addressed the incident and its policies. As for Paul, his channel was removed from a premium advertising program, he was placed on a 90-day probation, and his partnership projects were put on hold — but he is still making money through YouTube’s general ad program.
"We expect more of the creators who build their community on @YouTube, as we’re sure you do too. The channel violated our community guidelines, we acted accordingly, and we are looking at further consequences," YouTube said in a statement posted to Twitter in January 2018. "We know that the actions of one creator can affect the entire community, so we’ll have more to share soon on steps we’re taking to ensure a video like this is never circulated again."
(Paul apologized for his video after facing backlash, saying on Twitter, "I intended to raise awareness for suicide and suicide prevention and while I thought 'if this video saves just ONE life, it'll be worth it,' I was misguided by shock and awe, as portrayed in the video.")
The Paul video scandal points to a larger issue confronting YouTube as a whole: One distasteful video from one large channel can make all of YouTube look bad. The content creators, the team behind the platform, advertisers, viewers… It affects everyone, and it has for some time. YouTube has made policy changes to address the issues, but it’s a hard line to walk between the platform and creators — especially when not all creators are acting in good faith.
That's because Paul’s poorly thought-out videos are not the only posts giving YouTube a bad name among fans. After the mass school shooting in Parkland, Florida, in February, YouTube promoted videos from conspiracy theorists claiming that survivors of the shooting were merely crisis actors. (YouTube said in a statement that a video had been "misclassified" by the system and that once the error was realized, it was removed "from Trending and from YouTube for violating our policies.")
And recently, numerous videos that exploit children have been popping up on YouTube. These videos seem harmless at first, featuring well-loved characters like Elsa and Spiderman, but then they take a turn for the worse, as adult vloggers dressed as the characters engage in bizarre, often violent, and even sexual acts. (In August 2017, YouTube said videos that "made inappropriate use of family friendly characters" would not be allowed to be monetized, and in November 2017, YouTube's director of policy, Juniper Downs, said the company was "in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged.")
And YouTube has continued to work on ways to address these issues. The video-sharing site launched a new campaign and announced in December that it would hire 10,000 new content moderators to take down videos by extremists, conspiracy theorists, and others who violate its policies. (However, some channels were reportedly taken down entirely and without warning, despite not having violated any guidelines — a mistake the company attributed to “newer members” of its team, according to Bloomberg.)
Additionally, on Jan. 16, the site introduced new policies to its YouTube Partner Program (YPP), changing its eligibility requirements for monetization to 4,000 hours of watch time within the past 12 months and 1,000 subscribers. Essentially, this means that only larger channels that attract lots of views are eligible for the YPP, through which creators can make money on YouTube. The intention is that extremist and bad-faith channels will be weeded out.
“Our recent changes to the YouTube Partner Program (YPP) are designed to curb bad actors, stabilize creator revenue and provide greater assurances to advertisers around where their ads are placed,” a YouTube spokesperson says in a statement given to Elite Daily. “By making these updates to YPP, we aim to help creators of all sizes find more success.”
The problem of harmful videos goes beyond giving YouTube (and creators on YouTube) a bad image. A parent may put on a fun or educational video for a child, and then one of the Elsa and Spiderman videos — not suitable for children — may pop up next. Or advertisers' ads may unexpectedly appear next to inappropriate videos, placed there automatically by YouTube's algorithms. So another step YouTube is taking involves Google Preferred: under the program, companies’ ads will only run next to videos that an actual person has verified as compliant with the new guidelines.
While the new numbers for creators (4,000 hours of watch time over the past 12 months and 1,000 subscribers) have been publicized heavily, the Google Preferred change may matter even more in this larger conversation, since YouTube relies on ad revenue to pay creators and employees. In fact, more creators than ever are earning a living on YouTube, with the number of channels making over six figures up more than 40 percent year-over-year, according to YouTube’s Creator Blog.
Vloggers like Matt Murley understand the need for the updates.
“At the very least, I think that they will clear out the clown car and help to better establish a meritocracy on YouTube,” Murley, who joined YouTube in 2012, tells me in an interview for this article. He claims that “there are a lot of channels out there that are racking up millions of views by uploading stolen content, and they are able to monetize because they hit their 10,000-view benchmark a long time ago. The requirement to meet a minimum subscriber count should eliminate a lot of these channels' ability to monetize since they often have very low subscriber numbers, due to their lack of ongoing value.” And YouTube's new policies were introduced in part to limit content stealing.
Murley started regularly uploading content to YouTube in August 2017 and has gained 2,500 views across his channel. He knows the new guidelines don’t favor small channels like his, but he believes they are what’s best for the community overall.
“I think that from here on out, the only people making money from YouTube will be the ones who deserve to,” he says. “They may be tough numbers for beginners to overcome, but the ones who are good enough will float to the top. It's not YouTube telling us who can and can't monetize, it's the community. If a creator's content is good enough, the community will reward them by watching and subscribing.”
Not everyone sees it like that, though. Willow Biggs uploads unboxing videos and product reviews, which usually run only around five minutes each. With videos that short, she feels that reaching 4,000 watch hours is like chasing an imaginary finish line.
“I did a giveaway and some collabs with other YouTubers and pulled in over 250 new subscribers within two weeks,” she tells me in an interview. “Since then, my numbers have slowed down again, although I am still uploading videos every night. I think it’s really about communication and committing to supporting aspiring YouTubers and videos that are accessible to everyone.”
Biggs noted that many of her friends have quit YouTube altogether since the new guidelines were announced, and even big players like H3H3 have commented on how things are different now.
In the end, though, this site is about sharing videos, whether money is made or not… and I like Murley’s view on it: “Success shouldn't be determined by money, it should be determined by how much time you spend doing what you love. I love creating content for YouTube, even though it's never made me a penny.”