YouTube steps up takedowns as concerns about kids' videos grow
SAN FRANCISCO - YouTube, a unit of Alphabet
Inc’s Google, has stepped up enforcement of its guidelines for videos aimed at
children, the company said on Wednesday, responding to criticism that it has
failed to protect children from adult content.
The streaming video service removed more than
50 user channels in the last week and stopped running ads on over 3.5 million
videos since June, YouTube vice president Johanna Wright wrote in a blog post.
“Across the board we have scaled up resources
to ensure that thousands of people are working around the clock to monitor,
review and make the right decisions across our ads and content policies,”
Wright said. “These latest enforcement changes will take shape over the weeks
and months ahead as we work to tackle this evolving challenge.”
YouTube has become one of Google’s
fastest-growing operations in terms of sales by simplifying the process of
distributing video online while placing few limits on content.
Parents, regulators, advertisers and law
enforcement have become increasingly concerned about the open nature of the
service. They have contended that Google must do more to banish and restrict
access to inappropriate videos, whether it be propaganda from religious
extremists and Russia or comedy skits that appear to show children being
forcibly drowned.
Concerns about children’s videos gained new
force in the last two weeks after reports in BuzzFeed and the New York Times and an
online essay by British writer James Bridle pointed out questionable clips.
A forum on the Reddit internet platform
dubbed ElsaGate, based on the Walt Disney Co princess, also became a repository
of problematic videos.
Several forum posts on Wednesday expressed
support for YouTube’s actions while noting that vetting must expand even further.
Common Sense Media, an organisation that
monitors children’s content online, did not immediately respond to a request to
comment about YouTube’s announcement.
YouTube’s Wright cited “a growing trend
around content on YouTube that attempts to pass as family-friendly, but is
clearly not” for the new efforts “to remove them from YouTube.”
The company relies on review requests from
users, a panel of experts and an automated computer program to help its
moderators identify material possibly worth removing.
Moderators now are instructed to delete
videos “featuring minors that may be endangering a child, even if that was not
the uploader’s intent,” Wright said. Videos with popular characters “but
containing mature themes or adult humour” will be restricted to adults, she
said.
In addition, comments will be disabled on any
video where they refer to children in a “sexual or predatory” manner.