YouTube is accelerating its efforts to combat extremist content online.
Using a combination of expanded technological measures and human analysis, YouTube and Google, the video site’s parent company, say they plan to do a better job of tackling violent extremist videos and content online.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” said Kent Walker, senior vice-president and general counsel of Google, in a blog post on Google.org.
In March, Google and YouTube faced irate advertisers, many of whom pulled their business after discovering their ads had run alongside videos promoting terrorism and extremist content on the video service. In response, the companies established a 10,000-view requirement for entry into the YouTube Partner Program, which lets creators earn revenue from ads running on their videos.
The tech giant also improved its use of machine learning technology to prevent ads from being automatically run with extremist or other violent content. That work has continued, Walker said in the post. “We are increasing our use of technology to help identify extremist and terrorism-related videos. We will now devote more engineering resources to apply our most advanced machine learning research to train new ‘content classifiers’ to help us more quickly identify and remove such content,” he said.
YouTube will also get more “Trusted Flaggers,” human experts who help spot problem videos, Walker said. The non-governmental organizations that help Google and YouTube find troublesome content will nearly double in number, with 50 more NGOs joining the 63 current participants. Google will support them with operational grants, he said.
Videos that contain inflammatory religious or supremacist content but do not clearly violate YouTube’s policies will appear with a warning and will not be able to earn revenue from ads. Nor will viewers be able to endorse or comment on them, Walker said. “That means these videos will have less engagement and be harder to find,” he said. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”
In addition to its Creators for Change program, launched last year to promote voices against hate and radicalization, YouTube will work with Jigsaw, an incubator company within Google’s parent company Alphabet, to redirect potential ISIS recruits to anti-terrorist videos.
Along with Facebook, Microsoft and Twitter, Google and YouTube are working to establish an international online terrorism forum. “Together, we can build lasting solutions that address the threats to our security and our freedoms,” Walker said. “It is a sweeping and complex challenge. We are committed to playing our part.”
Follow USA TODAY reporter Mike Snider on Twitter: @MikeSnider.