The social media giant’s former product manager launched a fresh attack on CEO Mark Zuckerberg over his alleged complacency on online hate, following a series of bombshell media interviews and testimony before a Senate committee. Zuckerberg has dismissed recent criticism as “deeply illogical.”
“I came forward now because now is the time to act,” Haugen said. “The failures of Facebook are making it harder to act.”
Haugen said Facebook is “unquestionably” making hate speech online worse as groups push people towards extreme content. She said the company was “negligent” in its inability to accept the consequences of its actions.
The parliamentary committee is charged with examining the British government’s “Online Harms Bill,” which aims to shift more responsibility onto social media firms for harmful content on their platforms, and to punish them if they fail to tackle it.
The Real Facebook Oversight Board, an activist watchdog group critical of the company’s official Oversight Board, released a statement calling for a “full, independent, outside investigation into Facebook” following the publication of internal Facebook documents and Haugen’s testimony.
“No criminal should appoint its own judge and jury, as Facebook has done with its Oversight Board,” the statement said.
The live updates for this blog have ended.
Documents in Haugen’s files exposed how hate speech and terrorist content are perpetuated in the Middle East because the company does not have enough moderators who speak local languages and understand cultural nuance.
According to one document, Facebook over-relies on artificial-intelligence filters that make mistakes, leading to “a lot of false positives and a media backlash.”
During her testimony before a British Parliamentary committee Monday, Haugen said 76 percent of counterterrorism speech in one at-risk country was flagged as terrorist speech and removed.
According to the Associated Press, the company has tried to hire staff who speak local dialects; however, Arabic content moderation “still has a long way to go,” the company said.
FULL STORY: Facebook Knew How to Combat Hate Speech in Middle East, Did Nothing: Whistleblower
In a statement, the board says it “rejects the premise that the Facebook Oversight Board can ever be considered independent” and calls for an investigation into the allegations raised in the Facebook Files, the Facebook Papers and recent SEC filings.
The Board demands policymakers in the U.S., U.K. and European Union “fast-track legislation to ensure permanent, independent oversight of Facebook.”
The statement says the released internal documents and the testimony from whistleblower Frances Haugen have “laid bare the extreme harm and devastating impacts of Facebook.”
It calls Facebook “an international criminal enterprise” for ignoring internal warnings and lying to regulators and its own oversight board. It also said the company is “in the thrall of right-wing extremists” and welcomes “insurrectionists, racists and disinformation artists onto their platforms under the guise of free speech.”
She said that engagement-based ranking allowed hate speech to spread on the platform and incited real-world violence in places like Ethiopia and Myanmar.
“They allow the temperature in these countries to get hotter and hotter and hotter when the pot starts boiling over, they’re like, ‘Oh no, we need to break the glass, we need to slow the platform down,’” she said.
According to internal documents, Facebook only removes about 5 percent of hate speech and its AI ignored warnings of the spread of hate speech due to language gaps.
Haugen said AI systems will fail, and that Facebook needs to slow its platform down “to a human scale” and should be forced to disclose what its integrity systems are and how well its safety systems work in non-English languages.
“Ethiopia has 100 million people and speaks six languages. Facebook only supports two,” she said. “If we believe in linguistic diversity, the current design of the platform is dangerous.”
She said there is a conflict of interest between different departments within the company, for example between the safety and integrity team and the growth team.
She said there has to be better organizational oversight because teams may be “unknowingly” promoting harmful content.
Right now, she said, there are no “incentives forcing Facebook away from shareholder interests.”
“They will choose growth over safety,” she said.
Haugen added that “countless employees said we have lots of solutions that don’t involve picking good and bad ideas.”
“It’s not about censorship,” she said. “We could have a safer platform but it will cost little bits of growth.”
A British MP said there is evidence that Facebook products are changing kids’ brains.
Haugen also noted how social media has an addictive quality that makes kids unhappy while using it but makes them feel that they can’t leave. It also “allows bullying to follow children home.”
“It’s super scary to me that we are not taking a safety-by-design approach,” Haugen said.
She added that unlike pharmaceutical companies, which face strict regulation, Facebook has “never proved their product is safe for kids.”
Rather, she said Facebook is “negligent” and “ignorant.”
She said Facebook is full of “overwhelmingly” good people.
“Good people embedded in a system with bad incentives are led into bad actions,” she said.
She added that people “willing to look the other way” are promoted over people who are “willing to raise alarm.”
Haugen said Facebook is “unwilling to acknowledge its own power” and “won’t accept the consequences of its actions.”
“The Oversight Board should ask the question ‘why was Facebook able to lie to them in this way?’” she said.
“If Facebook can come in there and actively mislead the Oversight Board, which is what they did, I don’t know what the purpose of the Oversight Board is.”
She added that “hate and anger is the easiest way to grow on Facebook.”
FULL STORY: Whistleblower Says Facebook Is ‘Unquestionably Making Hate Worse’
She said she “had no idea how to escalate” national security concerns she saw when she worked on counterespionage.
“Because I didn’t have faith in my chain of command because they had dissolved [the Civic Integrity Team],” she said.
She said employees were told to “accept” that the company was under-resourced.
“There are no incentives, internally, to rally for help because everyone is under water,” Haugen said. “Facebook’s most important teams are understaffed and under-resourced.”
Haugen said this current system is “biased towards bad actors and biased towards people who push people to the extremes.”
“Currently there’s no accountability,” she added.
Haugen recommended that Facebook provide its own moderators for any group that reaches a certain size, suggesting that requiring the company to regulate the content and impact of large groups would increase accountability.
She also praised Google and Twitter for being more “transparent” in how they disclose “experiments” and data about their users’ habits.
“Facebook has been going through a bit of an authoritarian narrative spiral, where it becomes less responsive to employee criticism, to internal dissent and in some cases cracks down upon it,” Sophie Zhang, a former Facebook data scientist, told the Associated Press. “And this leads to more internal dissent.”
Zhang was fired from Facebook in the fall of 2020. Last year, she accused the company of ignoring fake accounts used to undermine foreign elections.
Another employee, whose name was redacted from internal documents sent to Congress, said employees feel “powerless.”
“I have seen many colleagues that are extremely frustrated and angry, while at the same time, feeling powerless and [disheartened] about the current situation,” the employee wrote on an internal message board after Facebook chose to leave up posts from former President Donald Trump that suggested Minneapolis protesters could be shot. “My view is, if you want to fix Facebook, do it within.”
One example highlighted in the report is Instagram’s banning of #AlAqsa under its rules on terrorism content, despite the hashtag being used to exchange vital information about people attacked at the Al-Aqsa Mosque in the Old City of Jerusalem.
In a statement to the Associated Press, a Facebook spokesperson said that over the last two years the company “has invested in recruiting more staff with local dialect and topic expertise to bolster its review capacity around the world” but admitted it was not as strong as it could be on the issue.
He is shown riding a wave of dollar bills alongside a distressed-looking teenager, a reference to Facebook’s alleged strategy of prioritizing profits over the wellbeing of its users.
Campaigner Flora Rebello Arduini said that younger users “don’t stand a chance” against the company’s algorithms.
FULL STORY: Mark Zuckerberg Says Claim Facebook Pushes ‘Angry Content’ for Profit ‘Deeply Illogical’
The company has kept tight-lipped about the details of the report, but those following the whistleblower will be looking for any impact that recent months of criticism has had on earnings.
The report will be released later this morning.
A senior Facebook employee recently raised an issue about possible new laws in a meeting, divulging details that only a few officials knew about at the time.
It criticized Facebook, suggesting it is failing to act to tackle harmful posts.
It is expected to be highlighted at the hearing later today as evidence that Facebook is not adequately focusing on harmful posts outside the United States.
British investigative journalist Carole Cadwalladr expressed her outrage at the figures.
Follow Newsweek’s liveblog throughout Monday for all the latest.