Business & Tech

Facebook's Election 'War Room' Brings Out Big Guns

The 'War Room' team brings together two dozen experts who work with more than 20,000 employees across the company to guard against election interference.

MENLO PARK, CA — There's a war going on in Silicon Valley, and it doesn't involve artillery. It is being fought over the devices and feeds that connect people to Facebook.

"War Room" was a term coined by the landmark election campaign of former U.S. President Bill Clinton in 1992 as highlighted in Joe Klein's bestselling book "Primary Colors." The catch-phrase labeled news department conference rooms on 9/11 in 2001.

Now the social media giant that dominates business news headlines is going full throttle with its own version, and for good reason. As the voting public heads toward a predicted record turnout in what is considered one of the most crucial elections of our time, many eyes will be on social media, and not just on Election Day. Federal law enforcement agencies have determined that hackers and foreign operatives interfered in the 2016 general election that put President Donald Trump in office. Deceptive advertisements and false reports spread across the Menlo Park company's platform, prompting founder and chief executive Mark Zuckerberg to testify before a congressional panel.


Enough is enough, Facebook pledged.

Is there any doubt how closely watched this election will be?


In the last two years, Facebook developed a War Room that looks like something out of a modern-day Tom Clancy novel. It comprises two dozen specialists drawn from the company's threat intelligence, data science, software engineering, research, community operations and legal teams. They work in unison with more than 20,000 employees on a common mission: to eliminate or reduce harmful threats spread on the Internet.

War Room specialists manage dashboards that monitor key election issues in real time, such as voter suppression, floods of spam, foreign infiltration and other behavior that violates Facebook's policies.

"Last month, we extended this policy further and are expressly banning misrepresentations about how to vote, such as claims that you can vote using an online app, and statements about whether a vote will be counted (e.g. 'If you voted in the primary, your vote in the general election won’t count.)," Public Policy Manager Jessica Leinwand said, further pointing out: "We already prohibit offers to buy or sell votes as well as misrepresentations about the dates, locations, times and qualifications for casting a ballot. We have been removing this type of content since 2016."

Facebook provided an example image of one such removed post.

The team also monitors news coverage. So when a White House staffer spreads a falsehood or a former cabinet member lands in legal trouble related to the election, the Facebook team is aware of it and on the scene.

Case in point: the War Room detected a false claim that Brazil's presidential election had been moved by a day because of national protests, and removed the post within an hour as it was going viral.

"The work we are doing in the war room builds on almost two years of hard work and significant investments in both people and technology to improve the security on Facebook, including during elections," said Samidh Chakrabarti, director of product management in the Civic Engagement department. "We've increased transparency and accountability in our advertising. That said, security remains an arms race and staying ahead of these adversaries will take continued improvement over time."

Meanwhile, Facebook announced this week it is tweaking the News Feed "to prioritize high-quality content from friends and family" while working to "reduce the distribution of low quality content like false news and clickbait," spokeswoman Annie Demarest indicated.

To rank posts in a user's News Feed, Facebook's algorithms weigh thousands of different data points, or signals, about the posts shared by that user's community.

Facebook's top 10 tips for spotting "false news":

  • Be skeptical of headlines.
  • Look closely at the link.
  • Investigate the source.
  • Watch for unusual formatting.
  • Consider the photos.
  • Inspect the dates.
  • Check the evidence.
  • Look at other reports.
  • Is the story a joke?
  • Some stories are intentionally false.

--Images courtesy of Facebook

