
YouTube is cracking down on white supremacist content and conspiracy theories, an aggressive attempt to stop the spread of hate speech that will block thousands of channels in violation of its policies.

The Google-owned video platform, which sees 500 hours of content uploaded every single minute, has been working in a range of ways over the last few years to scrub the site of hate-filled content and conspiracy theory videos. A rising chorus of critics and lawmakers has slammed the tech giant for becoming a breeding ground for white supremacy and other forms of discrimination.

The California company told Fox News on Wednesday that it will now be "specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status."

That would include, for example, any videos promoting or glorifying Nazi ideology.


Over the past few years, YouTube has consulted dozens of experts in subjects like civil rights, free speech and extremism as it seeks to evaluate and improve its existing hate speech policy. In 2017, the company introduced a harder stance against hate speech content that limited its reach on the platform, reducing views on those videos by an average of 80 percent.

However, a range of civil rights groups and tech critics have said the company is not doing enough. In April, the livestream for a House hearing on white nationalism had to be shut down because it was flooded with hateful and racist comments. A ProPublica investigation earlier this year found that white supremacists and neo-Nazis are using YouTube to spread their vile messages and recruit new members.


YouTube, which has taken heat for not doing enough to stop the spread of misinformation around a range of issues, such as vaccination, or major events, like the Moon landing, will now remove content that denies well-documented violent events that actually happened. That would include taking down Holocaust denial videos or videos falsely claiming that the Sandy Hook school massacre was a hoax.


Last year, YouTube kicked conspiracy theorist and radio host Alex Jones off the platform.

The powerful video platform, which has 1.8 billion monthly logged-in users, is also strengthening the enforcement of its existing YouTube Partner Program policies. Channels that "brush up against" the hate speech rules will be suspended from the Partner Program, meaning they won't be able to run ads on their channel or use other features to make money on the site.

All of these policies are effective immediately, but the company said it will take a few months for its systems to fully ramp up.

"The openness of YouTube’s platform has helped creativity and access to information thrive. It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence. We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come," the company said in a statement.

Organizations that work to combat hate speech cheered YouTube's announcement but said more must be done.

“Online hate and extremism pose a significant threat — weaponizing bigotry against marginalized communities, silencing voices through intimidation and acting as recruiting tools for hateful, fringe groups,” Jonathan Greenblatt, CEO and National Director of the Anti-Defamation League, said in a statement via email to Fox News. “While this is an important step forward, this move alone is insufficient and must be followed by many more changes from YouTube and other tech companies to adequately counter the scourge of online hate and extremism.”

Color of Change, a racial justice organization that has worked with YouTube and other tech giants to advocate for policies to stop hate speech, said that Wednesday's announcement is only a first step, emphasizing that YouTube must make the protection of civil rights an "operational priority" and not simply treat it as a PR crisis.

The racial justice group is calling on YouTube to provide full transparency on who is being consulted to implement the new policies, precisely how they define white nationalism and white supremacy, and what training data is being used to build the digital tools meant to identify and remove hateful content.

“Generating YouTube traffic is the business model of the white nationalist movement today. With no fact checks or moderation, YouTube monetizes the spread of racism, misogyny, and all forms of bigotry, sharing the profits with hate leaders. Different from other digital platforms where hateful actors must direct users to external outlets, YouTube’s revenue sharing literally pays white nationalists to propagate hate," Color of Change President Rashad Robinson told Fox News in a statement via email.
