
YouTube is preparing to revise its policies to limit creators' ability to monetise mass-produced and repetitive content, much of which has surged due to advances in AI technology.
The company will update its YouTube Partner Program (YPP) Monetisation policies on July 15, clarifying what types of content can be used to earn money on the platform.
While YouTube has not yet released the exact policy changes, it has stated that creators must continue to upload "original" and "authentic" content.
The revised guidelines will specifically address what constitutes "inauthentic" content, which is becoming increasingly prevalent due to the growing influence of AI tools in content creation.
Some creators have raised concerns that the new rules could affect formats such as reaction videos or videos using clips from other media, fearing these may be deemed inauthentic.
However, YouTube's Head of Editorial and Creator Liaison, Rene Ritchie, clarified that the changes are simply a "minor update" to existing policies.
The primary goal, Ritchie explained, is to better identify content that is mass-produced or repetitive, which has long been ineligible for monetisation due to its tendency to be viewed as spam.
The surge in AI-generated content, often referred to as "AI slop," has raised alarm across the platform.
These low-quality videos, produced using generative AI tools, frequently feature AI voices or repurposed images and video clips.
Channels built on AI-generated music or news content have amassed millions of subscribers, while fake videos have garnered significant attention.
404 Media reported on one viral true-crime series that turned out to be entirely AI-generated, highlighting how far the trend has spread.
YouTube is eager to implement clearer guidelines to protect the platform's integrity and prevent low-quality content creators from profiting through the YPP.