Alphabet’s Google is stepping up its fight against extremism by introducing new measures to identify and remove extremist content from its video-sharing platform YouTube.

Technology and more human eyeballs are helping YouTube more quickly find and remove terrorist content from the video-sharing site.

Last month, YouTube said it was initiating a multipronged strategy to combat how extremist groups including ISIS use video on the site to recruit and radicalize prospective terrorists.

Machine learning has helped YouTube detect and remove controversial content more quickly, the company said Tuesday in a blog post. Over the past month, three-fourths of the violent or extremist videos removed were taken down before any person on YouTube flagged them as inappropriate. “Systems have proven more accurate than humans at flagging videos that need to be removed,” the YouTube team says in the post.

As YouTube deployed machine learning technology over the past month, the number of videos removed has more than doubled, as has the rate of removal. “With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge,” YouTube says.

The site’s program of “Trusted Flaggers,” human experts who help spot problem videos, has been bolstered by cooperation from 15 more non-governmental organizations including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. 

New standards will be applied to videos that are not illegal but that users have flagged “as potential violations of our policies on hate speech and violent extremism,” YouTube says. Those videos may remain on the site, but they will not be recommended, monetized with ads, or open to comments. “We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter,” the company says.

Less than two weeks ago, YouTube began redirecting searches for extremist and terrorist terms to a playlist of antiterrorist content. YouTube’s Creators for Change program last week hosted a two-day workshop in the U.K. for teens to “help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet,” YouTube says. It also plans to expand that program to reach 20,000 more U.K. teens.

Online extremism has been a problem…