YouTube says it will be relying more heavily on artificial intelligence to find videos that may require age restrictions.
The Google-owned company has faced criticism over the way it handles content geared toward children. YouTube has maintained that its platform isn’t intended for anyone under the age of 13 due to federal privacy laws. However, young children have continued to use the site, and content creators have continued to create videos aimed at children.
Previously, YouTube’s Trust & Safety team applied age restrictions when, during its reviews, it found a video it didn’t deem appropriate for viewers under 18. But that manual process let some videos slip through the cracks.
Now, YouTube says it will use AI to flag videos that warrant an age restriction. This means more viewers will be asked to sign in to their accounts to verify their age before watching.
“Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions,” YouTube said in a blog post.
May be problems to start
YouTube said it is preparing for some labeling errors as the AI moderation program gets started.
“Because our use of technology will result in more videos being age-restricted, our policy team took this opportunity to revisit where we draw the line for age-restricted content,” the video platform stated. “After consulting with experts and comparing ourselves against other global content rating frameworks, only minor adjustments were necessary.”
The company added that content creators can appeal an age restriction decision if they think it was incorrectly applied.