YouTube is to restrict the availability of videos showing children’s characters in violent or sexual scenes if they are reported by viewers.
Last week, a blog post by writer James Bridle highlighted how YouTube was still being swamped by bizarre and indecent videos aimed at children.
The site says it already stops such videos earning advertising revenue.
YouTube said its team was “made up of parents who are committed to improving our apps and getting this right”.
But critics say YouTube is not taking enough action by waiting for viewers to report inappropriate videos.
The problem of video-makers using popular characters such as Peppa Pig in violent or sexual videos to frighten children has been widely reported.
However, Mr Bridle’s blog post went deeper into what he called the rabbit hole of children’s content on YouTube.
He gave examples of videos aimed at children that were not necessarily violent or sexual but were sinister, “disturbing” or otherwise inappropriate.
Often it appeared that the videos had been algorithmically generated to capitalise on popular trends.
“Stock animations, audio tracks, and lists of keywords being assembled in their thousands to produce an endless stream of videos,” he said.
Many used popular family entertainment characters such as Spiderman, and Elsa from Frozen, and had been viewed millions of times.
“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale,” he wrote.
YouTube says it has already barred such videos from earning advertising money when they are reported by viewers, to try to remove the incentive to produce them.
However, many of the videos do not get reported by viewers and continue to carry advertisements.
YouTube has now said it will give such videos an age restriction if they are reported by viewers, so they cannot be viewed by people under 18.
Age-restricted videos are blocked from appearing in the YouTube Kids app, which is primarily curated by algorithms.
They also cannot be viewed on the YouTube website unless people are logged in with an adult’s account.
However, a report in the New York Times found that inappropriate videos have previously slipped through the net.
YouTube says it uses human reviewers to evaluate whether flagged videos are appropriate for a family audience.
In his blog post, Mr Bridle said he did not know how YouTube could stamp out the problem.
“We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay,” he said.
Source: BBC