After initially placing itself at the forefront of media companies vying to capitalize on AI, YouTube has walked back its stance ever so slightly. The streaming giant will now allow labels and other music rights holders to request that AI-generated music be taken down.
YouTube product management VPs Jennifer Flannery O'Connor and Emily Moxley announced in a blog post today that, in the coming months, the platform will introduce "the ability for our music partners to request the removal of AI-generated music content that mimics an artist's unique singing or rapping voice."
Initially, the option to request removal will be available only to "labels or distributors who represent artists participating in YouTube's early AI music experiments," but YouTube says access will expand over the coming months.
O'Connor and Moxley also revealed that creators will have to disclose whether a video contains AI-generated content before posting so that the platform can apply a warning label to it.
"When creators upload content, we will have new options for them to select to indicate that it contains realistic altered or synthetic material," they wrote. "Creators who consistently choose not to disclose this information may be subject to content removal, suspension from the YouTube Partner Program, or other penalties."
The policies come in response to user concerns about YouTube's announcement last month of an AI tool that will let content creators clone artists' voices.