YouTube has warned creators over the past few months that changes are coming to the platform to comply with the Children’s Online Privacy Protection Act (COPPA). Those changes now enter full effect: targeted ads are restricted from running on kids’ videos, and kids’ videos lose access to comments and some other community features.
YouTube has said that kid-focused channels will see “a significant business impact” due to reduced ad revenue. On all kids’ videos, YouTube will also start running promotions for YouTube Kids, a separate app that filters the type of content users can see. YouTube also told creators back in September that they’ll soon be required to “designate their content as made for kids or not made for kids.”
These changes come as part of a $170 million settlement with the Federal Trade Commission over alleged COPPA violations. Anyone watching a video that’s been designated as made for children will now be treated as a viewer under the age of 13, regardless of the user’s actual age. Targeted advertising won’t run on videos designated as children’s content, and certain features, including push notifications, will be disabled.
A new blog post from YouTube reads, “Many creators around the world have created quality kids’ content for their audiences, and these changes will have a significant impact. We’re committed to helping creators navigate this new landscape and to supporting our ecosystem of family content.”
YouTube still can’t say definitively which content is “made for kids” and which isn’t, because ultimately, it’s up to the FTC to enforce the rules. The FTC defines the category as content intended for kids, taking into account a video’s subject matter, including whether it emphasizes kids’ characters, themes, toys, games, and more. Whether gaming content falls into this category is not yet clear.
YouTube has recommended that concerned creators consult their own legal counsel outside of YouTube. The blog post further reads, “We also use machine learning to help us identify this content, and creators can update a designation made by our systems if they believe it is incorrect. We will only override a creator designation if abuse or error is detected.”