Facebook released the interim results of an audit into its alleged anti-conservative bias Tuesday, the first step in attempting to address criticism by the right that the social media giant has stifled its voice.
The report, conducted by a Republican former senator and a law firm, did not present any evidence of bias, although it did conclude that Facebook has the potential to restrict free speech.
It also detailed, at length, concerns from conservatives, including over the rejection of ads and a lack of transparency in how Facebook removes content.
The report also outlined some changes Facebook has made, including loosening its policies around shocking and sensational content that would allow for some anti-abortion ads showing infants born before full term.
“Facebook’s policies and their application have the potential to restrict free expression,” concluded the report, authored by former U.S. Sen. Jon Kyl, R-Ariz., and the law firm Covington & Burling. “Given the platform’s popularity and ubiquity, this is a danger that must be taken very seriously.”
Conservatives have long claimed that major social media sites exhibit political bias, pointing to Silicon Valley’s liberal leanings and tech companies’ regular campaign contributions to Democrats.
Tech executives frequently have pledged to treat all political content equally.
But those promises have failed to sway the country’s most prominent Republicans, including President Donald Trump, who repeatedly has claimed that Facebook, Google and Twitter are biased against the party.
The report says Kyl and his team met with 133 conservative politicians and organizations to produce the initial review. Kyl said he was approached by Facebook shortly after the company’s CEO, Mark Zuckerberg, testified before the Senate Judiciary and Commerce committees in April 2018.
The report said Facebook agreed to make changes in response to several issues raised during the audit process.
Notably, the company changed an advertising policy that resulted in the prohibition of photographs showing medical tubes connected to the body, after anti-abortion groups complained that the rule prevented them from publishing advertisements that focused on stories of infants born prematurely.
The new policy allows for such depictions unless the human connected to the tubes is in visible pain or distress, or when blood or bruising is apparent.
Among the other changes the report noted were the creation of an oversight board for content decisions, the hiring of staffers to work with right-of-center organizations and leaders, and the introduction of an appeals process for content removed for violating the company’s community standards.
It also included the implementation of the “Why am I seeing this post?” feature, meant to help users understand why certain content appears in their news feeds.
Nick Clegg, Facebook’s vice president of global affairs and communications, said in a company blog post that Facebook is committed to providing a forum for all users, regardless of their political views.
“We take accusations of political bias made against us extremely seriously,” Clegg said. “Our policies, and how we apply them, can have a huge impact, so we have a responsibility to apply them evenly, without favoring one side or another and without devaluing the principle of free expression.”
Judging by responses Tuesday afternoon, neither side of the political spectrum appeared satisfied with the report’s initial conclusions.
“Facebook’s impulse to appease right-wing cries of bias, despite all evidence to the contrary, is yet again putting Facebook in a position where it’ll be amplifying lies and enabling extremists, white supremacists and Proud Boys at the expense of American democracy and with great risk to our safety,” Angelo Carusone, president of the liberal media watchdog group Media Matters for America, said in a statement.
On the Republican side, Sen. Josh Hawley, R-Mo., called the report “a smokescreen disguised as a solution.”
“Facebook should conduct an actual audit by giving a trusted third party access to its algorithm, its key documents and its content moderation protocols,” Hawley said in a statement. “Then Facebook should release the results to the public.”