These Facebook posts may look legitimate, but they were part of a disinformation campaign coordinated by Iranians and recently uncovered by the company. As the U.S. prepares for its midterm elections, the social network giant is working to reduce misinformation, while also keeping Facebook a place for people to share.
A lot of the work of content moderation for us begins with our company mission, which is to build community and bring the world closer together. As part of its stepped-up efforts, Facebook enlists fact-checkers and takes down misinformation that contributes to violence. But popular content, often dubbed viral, is frequently the most extreme. Facebook says it manages to reduce future views of posts it deems incorrect by 80 percent, but it’s not easy to draw the line. I mean, if you think about it, they get what, millions, billions of new posts a day, most of them some factual claim or some sentiment that nobody’s ever posted before. And so to go through those and figure out which of these are actually misinformation by any definition, you know, which of these are false, which of these are intended to manipulate an electoral outcome, that’s a huge challenge. There isn’t a human team that can do that in the world. There isn’t a machine that can do that in the world.
Keller works at Stanford and used to be on Google’s legal team. She says getting each decision right is impossible, and that Facebook is ahead of others in terms of making its process transparent. But it’s still struggling. VICE News recently posed as all 100 senators and bought fake political ads. Facebook, which approved them all, said it made a mistake. Politicians in Britain and Canada have asked Mark Zuckerberg, Facebook’s CEO, to testify on Facebook’s role in spreading disinformation.
We need to understand that it is built into the system that there will be a fair amount of failure, and we need things like appeal processes and transparency to address that. Since the 2016 U.S. presidential campaign, Facebook has also been working on showing users more of what happens behind the scenes, as it beefs up moderation of content on its site.
Having a system that people view as legitimate and basically fair, even when they don’t agree with any individual decision that we’ve made, is extremely important. We talk to many prominent groups around the world. We also try to go local as much as we can to get voices that are close to the ground, because we realize what matters most is how the rules are going to impact users. And we want to hear those voices. While the U.S. election is about issues such as immigration and the economy, it will also be a test of how well Facebook’s new systems are working, and what the social networking giant needs to do going forward.
Michele Quinn, VOA News, San Francisco, California.