YouTube plans to modify profanity rules that prompted creator backlash
YouTube's gaming community pushed back against the company this week after some creators saw their old videos demonetized out of the blue.
The culprit is a policy the company introduced in November to make certain kinds of content more advertiser friendly. That change, made to YouTube's advertiser-friendly content guidelines, overhauled the platform's approach to profanity and violence.
The good news is that while we don't quite know what the company will do yet, YouTube is apparently listening to creators' concerns.
"In recent weeks we've heard from many creators regarding this update," YouTube spokesperson Michael Aciman told TechCrunch. "That feedback is important to us and we are in the process of making some adjustments to this policy to address their concerns. We will follow up shortly with our creator community as soon as we have more to share."
In November, YouTube expanded its definition of violence beyond real-world depictions to include in-game violent content "directed at a real named person or acts that are manufactured to create shocking experiences (such as brutal mass killing)." The company said that gore in "standard game play" was fine, but only after the first 8 seconds of a video. The whole section left plenty of room for interpretation, for better or worse.
The changes to its profanity policy were more drastic. YouTube announced that it would no longer count "hell" and "damn" as profane words, but all other profanity would be lumped together instead of differentiated based on severity (e.g. words like "shit" and "fuck" would now be treated the same way). Further, "profanity used in the title, thumbnails, or in the video's first 7 seconds or used consistently throughout the video may not receive ad revenue," according to the new policy.
If the swearing kicks in after the first 8 seconds of a video, it's still eligible, but some of the changes stood to affect a massive swath of videos -- many of which were made well before the changes were announced. Creators started noticing the new policies in effect around the end of December, watching some videos be slapped with new restrictions that limit their reach and ad eligibility.
YouTube creator Daniel Condren, who runs RTGame, explored the impact of the policy change on his own channel in a video that racked up more than a million views this week. Condren has been grappling with the enforcement changes in recent weeks after seeing roughly a dozen videos demonetized and his appeals rejected.
I am so sorry to have to keep tweeting this - but overnight, 6 more of my videos have now become limited suddenly, including my Best of 2020. No notification from YouTube at all on any of these. This is genuinely awful @TeamYouTube
-- RTGame Daniel (@RTGameCrowd) December 29, 2022
"I genuinely feel like my entire livelihood is at risk if this continues," Condren wrote on Twitter. "I'm so upset this is even happening and that there seems to be nothing I can do to resolve it."
YouTube didn't respond to our follow-up questions about how it plans to tweak the policy, but we're certainly curious if the platform will roll back enforcement for old, previously published videos that creators might rely on for income.
In the face of emerging regulation targeting social media's relationship with underage users, the company is clearly trying to make its massive trove of videos more age-appropriate (and advertiser friendly). But retrofitting age restrictions and new monetization rules onto a platform like YouTube is a delicate balance -- and in this case the changes had a swift, sweeping impact that gave creators little time to adapt.