Zuckerberg Rejects Allegations That News Feed Swayed US Voters

In the wake of Tuesday’s results that saw Donald Trump elected the 45th US President, many voters are grasping for an explanation as to how this could have happened.

Among those theories is a widespread notion that Facebook’s troubled News Feed algorithm played a role in swaying and cementing voters with fake news, presumably toward the Trump camp. Facebook Founder and CEO Mark Zuckerberg is publicly denying the notion.

Headlines are splashing across the tech world: Forbes’ “How Facebook Helped Donald Trump Become President”, Vox’s “Mark Zuckerberg is in denial about how Facebook is harming our politics”, and Mashable’s “Zuckerberg claims Facebook hoaxes didn’t influence the election. He’s wrong.”

Zuckerberg took the stage at Techonomy16 to address these growing accusations, agreeing that more can always be done to improve the quality of the News Feed’s content, but insisting that Facebook could not have influenced the outcome of the election.

“Personally, I think the idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea,” Zuckerberg said.

He continued, pointing back to the results themselves as evidence.

“Part of what I think is going on here is people are trying to understand the result of the election, but I do think that there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news. If you believe that, then I don’t think you have internalized the message that Trump supporters are trying to send in this election.”

He also pointed to content engagement as part of the problem. Zuckerberg noted that Trump’s posts got more engagement than Clinton’s, meaning Facebook’s algorithm promoted Trump’s posts more heavily simply because more people were organically sharing, liking, and commenting on them.
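As a rough illustration of how engagement-driven ranking compounds on itself, consider the minimal sketch below. The scoring weights, field names, and `Post` structure are assumptions for illustration only, not Facebook’s actual News Feed code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    shares: int
    likes: int
    comments: int

def engagement_score(post: Post) -> float:
    """Hypothetical engagement score; the weights are illustrative
    guesses, not Facebook's real ranking formula."""
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.likes

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts with more organic engagement float to the top, which in
    # turn exposes them to more users -- the snowball effect the
    # article describes.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("candidate_a", shares=900, likes=5000, comments=1200),
    Post("candidate_b", shares=400, likes=3000, comments=600),
])
print([p.author for p in feed])  # higher-engagement posts rank first
```

Under a scheme like this, no editorial choice is needed to amplify one side: whichever posts already attract more sharing, liking, and commenting are shown to more people, which generates still more engagement.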

Sadly, Zuckerberg is not taking responsibility for his flawed algorithm, instead defending this snowball strategy. Facebook’s own research shows that almost everyone on the site is connected with someone who holds opposing ideologies. Facebook is clearly interested in keeping people on the site longer, and therefore hides posts it believes contribute to a negative experience. That, too, is inherently flawed in the context of a society that values transparency and objectivity. How, then, should Facebook intervene when its users see a post they disagree with? Hiding posts, it seems, isn’t the answer.

Facebook is making efforts to involve more humans in the creation of its content-ranking algorithms, and now uses a human quality panel to hone rankings.

“We’re always working to better understand what is interesting and informative to you personally, so those stories appear higher up in your feed,” explained Adam Mosseri, VP, Product Management, News Feed. “And we work hard to try to understand and predict what posts on Facebook you find entertaining to make sure you don’t miss out on those.”

He also says Facebook is as inclusive as possible, not favouring one idea or viewpoint over another.

“We are not in the business of picking which issues the world should read about,” Mosseri writes. “We don’t favour specific kinds of sources — or ideas.”

Unfortunately, we know that is not entirely true:

Zuckerberg Taking Heat For Censoring Historic Photo