Deep in the trenches of Facebook's sprawling Menlo Park, Calif., campus, security experts, threat intelligence investigators, data scientists and engineers are manning a war room to root out misinformation and bad actors ahead of next month's midterm elections.

"This is actually the culmination of two years of massive investments that we've made, both in people and technology, to make sure that our platforms are safe and secure for elections," said Samidh Chakrabarti, Facebook's elections and civic engagement chief.

A group of 20 team leaders represents 20,000 workers from across the company. In a converted meeting room, desks are arranged in a horseshoe, flanked by monitors, live TV screens and displays of bar graphs whose spikes signal possible suspicious activity on the platform. The team works 24/7, monitoring reports on posts, pages and links in real time to detect "coordinated inauthentic behavior."

"If I am representing to you that I'm independent but actually I'm running 50 different pages together and they all look independent but they're not, they're being coordinated,” said Nathaniel Gleicher, head of Facebook's cybersecurity. “That's an example of inauthentic behavior.”

Facebook works with third-party fact-checkers to follow up on reports of bad actors trying to confuse or manipulate public debate, and it has brought greater transparency to political advertising to combat the spread of fake news.

If Facebook's AI software detects a problem that could be spreading on the platform, the issue shows up on a dashboard, an alarm is triggered, and the humans go to work.

"Our data scientists then analyze it, and if there is a problem, they pass it on to our operations specialists, who are then able to evaluate whether it violates our community standards, and then respond appropriately," explains Chakrabarti.

The war room recently flagged a blatant effort to manipulate voters in Brazil, after a flurry of posts claimed the Oct. 7 presidential election was being postponed for a day due to protests.

"This was not true, this is not a true story," Chakrabarti said. "And so they were able to then to pass that onto our operations team to remove it from our platform, and prevent it from going viral in the first place."

When bogus content is flagged, the company said, it will either take it down or reduce its distribution, and then let users know what the true story is so they're better informed.

Facebook says fast detection and fast response will help safeguard elections going forward, a lesson learned the hard way after CEO Mark Zuckerberg acknowledged that Russian agents abused the social media platform in 2016.

"We didn't take a broad enough view of our responsibility, and that was a big mistake," Zuckerberg said when he testified before Congress in April.

In fact, hundreds of bogus stories were posted every few hours, and thousands of fake accounts have since been removed. As Facebook works to regain the public's trust, analysts said the new war room is just a step in the right direction.

"I think they’re going to have to be stricter around the way they edit and manage content, the way media companies do in order to get the trust of their customers back," said analyst Tim Bajarin, president of tech consulting firm Creative Strategies.

Despite a Pew study showing nearly half of all U.S. adults get their news from Facebook, the company has always denied it's in the traditional news business.

"At our heart, we're a tech company," COO Sheryl Sandberg said in 2017. "We hire engineers. We don't hire reporters, no one's a journalist. We don't cover the news."

While officials at Facebook acknowledge that no security system is 100 percent foolproof, they say having a war room with security experts sitting side by side will improve the odds of catching malicious pages or posts before any real harm is done.

Still, the war room's future is uncertain. On Nov. 7, the team will assess how well the strategy worked, identify any new threats that may have emerged, and decide whether the war room will become a permanent line of defense to help safeguard elections.