YouTube is rolling out more AI-driven technology to find videos that may require an age limit, which means more viewers will be required to log into their accounts to verify their age before watching, The Verge reported.
Similar to how YouTube began using machine learning in 2017 to better catch violent extremism and its most serious content violations, as well as videos that later turned out to include hateful conduct, the same approach will now be used to automatically flag videos YouTube deems inappropriate for younger viewers. As a result, YouTube expects more videos to carry age restrictions.
The company expects some labeling mistakes, as with any AI moderation rollout. And as part of the change, people who watch embedded YouTube videos on third-party sites will be redirected to YouTube to log in and verify their age.
One of the biggest questions facing creators in YouTube's Partner Program -- those who can monetize videos -- is whether the restrictions will affect their earning potential. YouTube's team doesn't see it that way: it expects that most videos receiving automatic age limits are also likely to violate the company's advertiser-friendly guidelines. In other words, according to YouTube, these videos will already have limited or no advertising.
That doesn't mean errors won't happen; they do, as countless past incidents of misapplied labels and disputed copyright strikes have shown. But YouTube says it is beefing up its appeals team to handle the complaints it receives. Another concern for creators is that age-restricted videos will not appear on the home page. According to YouTube, an age limit does not automatically prevent a video from appearing there, although age-restricted videos are unlikely to surface on the home page.
YouTube has recently been trying to address criticism from parent groups and advocacy organizations around the world that the site is unsafe for children. YouTube has often said the main platform is not suitable for people under 13 because of federal privacy protections, and the company has pointed to YouTube Kids as what it calls a safer option. That has not stopped young children from using the main app at home or elsewhere, and some of the most popular channels are built around content made specifically for children. Currently, YouTube's trust and safety teams restrict videos when they encounter them during the review process, imposing an age limit on any video deemed unsuitable for people under 18.
"As our use of technology will lead to more age-restricted videos, our policy team is taking this opportunity to review our boundaries for age-restricted content." "Says a new blog post on YouTube. "After consulting with experts and comparing it with other content rating frameworks around the world, we only need to make small adjustments."
The YouTube post also notes that the new rules may require additional steps for people in EU countries. Under regulations such as the forthcoming EU Audiovisual Media Services Directive (AVMSD), some European users may be required to provide additional proof of age. Specifically, if the system cannot verify that someone is over 18, they may be asked to "provide a valid ID or credit card to verify their age," according to the post. This is a one-time process, and YouTube says it deletes the submitted information after verification. YouTube says the process was set up to comply with Google's privacy and security principles.