Facebook employees have formed an unofficial task force to question the role their company played in promoting fake news in the lead-up to Donald Trump's victory in last week's US election, amid a larger national debate over the rise of fake and misleading news articles on a platform used by more than 150 million Americans.
The task force, which sources tell BuzzFeed News includes employees from across the company, has already pushed back against a statement Facebook CEO Mark Zuckerberg made at a conference last week, in which he dismissed the notion that fake news on Facebook affected the election as "a pretty crazy idea."
"It's not a crazy idea. What's crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season," said one Facebook employee, who works in the social network's engineering division. He, like the four other Facebook employees who spoke to BuzzFeed News for this story, would only speak on condition of anonymity. All five employees said they had been warned by their superiors against speaking to press, and feared they would lose their jobs if named.
The employees declined to provide many details on the task force. One said "more than dozens" of employees were involved and that they had met twice in the last six days. For now, the group is meeting in secret so that members can speak freely, without fear of condemnation from senior management. It plans to formalize its meetings and eventually present a list of recommendations to Facebook's senior management. Another Facebook employee said that while the task force remained small, "hundreds" of Facebook employees had expressed dissatisfaction with the company's stance on fake news in private online chats and wanted to support efforts to challenge that position.
Facebook did not respond to a request for comment from BuzzFeed News, but in a statement to media last week, a Facebook spokesperson said, "While Facebook played a part in this election, it was just one of many ways people received their information – and was one of the many ways people connected with their leaders, engaged in the political process and shared their views."
In the wake of Trump's victory, Facebook has faced questions over its responsibility for spreading misinformation and its failure to clamp down on the sharing of fake news. Almost half of adult Americans rely on Facebook as a source of news, a recent Pew Research Center study found. A recent BuzzFeed News report found that three large left-wing Facebook pages published false or misleading information in nearly 20% of posts, while three large right-wing Facebook pages did so 38% of the time. The report concluded: "The best way to attract and grow an audience for political content on the world's biggest social network is to eschew factual reporting and instead play to partisan biases using false or misleading information that simply tells people what they want to hear."
Or, as former Facebook designer Bobby Goodlatte wrote on his own Facebook wall on November 8, "Sadly, News Feed optimizes for engagement. As we've learned in this election, bullshit is highly engaging. A bias towards truth isn't an impossible goal. Wikipedia, for instance, still bends towards the truth despite a massive audience. But it's now clear that democracy suffers if our news environment incentivizes bullshit."
Those involved in the task force who spoke to BuzzFeed News said they were reviewing whether Facebook had devoted enough resources to responding to user reports of fake news, or whether it had effectively used features that automatically scan the platform for certain kinds of offensive content.
"There is a lot more we could be doing using tools already built and in use across Facebook to stop other offensive or harmful content," said a second Facebook employee who has been a longtime engineer there. "We do a lot to stop people from posting nudity or violence, from automatically flagging certain sites to warning people who post content that doesn't meet the community guidelines," the employee said. He added that while Facebook users were encouraged to flag fake news, the guidelines for removing that sort of content were not clear. "If someone posts a fake news article, which claims that the Clintons are employing illegal immigrants, and that incites people to violence against illegal immigrants, isn't that dangerous, doesn't that also violate our community standards?"
In a January 2015 update, Facebook promised fewer fake news stories by giving users a tool to report fake stories in their feeds. In the nearly two years since, Facebook has refused to give reporters access to metrics showing how often fake news is reported or removed from its platform. A BuzzFeed News report found that even when fake news was flagged on one Facebook page, it simply migrated to another.
Facebook came under fire in May amid accounts from former employees that personal bias was pushing the platform's Trending Topics section towards more liberal news sites. In the wake of the report, Facebook fired its entire Trending Topics team and said it would instead rely on an algorithm, though, as BuzzFeed News reported, experts warned that algorithms were likely to increase the spread of fake news.
The months that followed saw Facebook court controversy by removing the Pulitzer Prize-winning photo of a 9-year-old girl fleeing a napalm attack during the Vietnam War for violating nudity standards, as well as removing a breast cancer awareness video for similar reasons. As BuzzFeed News reported, the removals were repeatedly blamed on a system error, and the posts were eventually reinstated.
"Facebook, by design, by algorithm, and by policy, has created a platform that amplifies misinformation," said Zeynep Tufekci, an associate professor at the University of North Carolina at Chapel Hill, who has been vocal in the need for social media to consider their role in spreading fake news.
Two Facebook employees, both of whom work as managers, told BuzzFeed News that reports of fake news were not taken as seriously as reports of other content that violates community standards. Both said that when they had raised the issue in the past, they were told that the "top executives" at Facebook were discussing the problem and taking it seriously. One said she had made it clear that even though users were flagging certain items as fake or hoaxes, the company was not taking a strict enough line on removing that content or on defining what constituted fake news. She said she had passed that concern on to more senior managers but had not received a response.
"People know there are concerned employees who are seeing something here which they consider a big problem," she said. "And it doesn't feel like the people making decisions are taking the concerns seriously."
The New York Times reported last week that top executives at Facebook have discussed the issue, and that in a private Facebook post shared last week with a small group of friends and colleagues, Zuckerberg disputed the idea that Facebook had influenced the election.
Citing statistics about low voter turnout, Zuckerberg wrote, "So rather than focusing on strengths or weaknesses in specific demographics, or other factors that may have pushed this race in one direction or another, these stats clearly suggest what many people have said all along. Both candidates were very unpopular."
The Facebook employees who spoke to BuzzFeed News stressed that they were not trying to "take sides" over who won the election, but rather to argue that Facebook should not be bolstering any political candidate through fake or misleading news items.
"If someone is right-wing, and all their friends are right-wing, and that is the news they share on Facebook, then that is the bubble they have created for themselves and that is their right," said the longtime Facebook engineer. "But to highlight fake news articles in the [news] feed, to promote them so they get millions of shares by people who think they are real, that's not something we should allow to happen. Facebook is getting played by people using us to spread their bullshit."