A former Facebook employee who leaked documents detailing the company's business practices has revealed her identity in an interview with the US TV show '60 Minutes'.
Frances Haugen was a product manager with the social media giant and worked on Facebook's civic misinformation team, which tackled harmful, misleading and inappropriate content on the platform.
In documents she leaked earlier, Haugen alleged that Facebook overlooked misinformation and misleading content because tackling it could reduce the company's income, and that Facebook's algorithm pushed people towards negative content.
Haugen said in the interview with 60 Minutes:
"[I] became increasingly alarmed by the choices the company makes prioritizing their own profits over public safety — putting people's lives at risk. As a last resort and at great personal risk [revealed these documents]."
Haugen said the systems at Facebook were much worse than at other online companies she had worked for, and that after the 2020 US presidential election, Facebook reverted to its old way of doing things despite promising to change. She added:
"As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me."
Haugen also called for stricter controls to be placed over Facebook, saying:
"Facebook has demonstrated they cannot act independently. Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety. I'm hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That's my hope."
Facebook responded to the comments, saying:
"Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."
[h/t: MSN]