The Facebook Papers: all the major revelations in a handy list

Facebook has had many bad days and months, but the company is now facing yet another public disgrace. Everyone’s talking about the Facebook Papers, and we’re here to summarize them for you, so you don’t have to spend days of your life reading. Let’s dive right in.

What are the Facebook Papers?

The Facebook Papers are a set of documents that former Facebook employee, and now whistleblower, Frances Haugen obtained before leaving the company. She submitted them to the Securities and Exchange Commission (SEC) earlier this year, and they are now available to a consortium of news outlets.

Who is Frances Haugen and what does she think about Facebook?

Haugen, 37, was a product manager at Facebook and part of the Civic Integrity group, which worked on risks to elections, including misinformation and bot accounts. She left the company in May, but collected a trove of information that is now called the Facebook Papers.

In her interview with 60 Minutes, she said, “Facebook, over and over again, has shown it chooses profit over safety.” You can read more about it here.

The Facebook Papers have revealed a lot about how the company thinks about, and deals with, dwindling user numbers, misinformation, and its own image.

Below are some of the big problem areas for Facebook. We’ll keep updating the list as new details emerge.

Facebook, teens, and mental health

In the past decade, the number of teenage users on Facebook has steadily decreased. According to a report from Bloomberg, a recent internal study revealed that time spent by teens on the platform has declined 16% year-on-year. Facebook’s more popular among baby boomers these days.

Fewer teens are signing up for the service. That’s not surprising given the growth of other platforms such as TikTok.

A report from the Wall Street Journal, published in September, said that the company ignored Instagram’s harmful impact on teens, particularly in regard to self-esteem and body image.

After these reports, Nick Clegg, Facebook’s VP of global affairs, went on to defend the firm and said that it’s working on a slew of new features — including a ‘take a break’ warning — for teen safety.

Facebook has failed to moderate hateful content in different countries

While Facebook’s base lies in the US, its biggest audiences are in countries like India and Brazil. As Casey Newton noted in Platformer, these countries are placed in ‘tier zero’ — meaning they are a high priority for the Civic Integrity group, formed in 2019 to monitor election interference.

But that doesn’t guarantee success. A New York Times article showed that the company struggles with problems like hate speech, misinformation, and the celebration of violence in India.

The report noted that the infestation of bots and fake accounts had a massive impact on the country’s national elections held in 2019.

A lack of budget allocation where it’s needed

The NYT report said that shockingly, 87% of the company’s budget to combat misinformation is allocated to the US, while the rest of the world has to make do with 13%.

This creates a massive resource crunch for monitoring countries that communicate in languages other than English.

In countries like Myanmar, which held its national elections last November, Facebook deployed tools to fight fake news, but they were ineffective.

The situation in other countries, which are slotted into tier three according to Platformer, is more dire. In Ethiopia, despite knowing that Facebook was being used to incite violence, the company did little to stop it.

Facebook’s algorithm mishap

In her revelations, Haugen said that Facebook’s 2018 algorithm change was responsible for inciting hate among its users. When it was rolled out, Zuckerberg said it was meant to increase interactions between friends and family.

However, the change backfired: feeds became angrier, resulting in increased toxicity and misinformation.

Making positive stories visible

To combat its algorithm failures and improve its image, Facebook began increasing the visibility of positive stories about itself after a meeting in January. The social network also began to distance Zuckerberg from controversial topics such as vaccine misinformation.

In 2018, the company even ran an experiment turning off its News Feed algorithm for select people. That, too, led to worse experiences for many of them, with less engagement and more ads.

Employees are unhappy with the direction

Haugen is probably one of the prime examples of how Facebook employees are unhappy with the way the social network is operating. In her interview, she said that the situation at the firm was substantially worse than “anything I’d seen before.”

A report from the Wall Street Journal noted that the firm ignored employees’ warnings about pages and groups run by global drug cartels and human traffickers.

“We’re FB, not some naive startup. With the unprecedented resources we have, we should do better,” said an employee after the Capitol Riots in the US in January, as per a report by Politico.

The story has numerous quotes from former and current Facebookers, who were angry about the way the firm was handling misinformation across the board.

A report from Wired echoed what some former employees have said before: the company focuses heavily on engagement.

It also highlighted that the critical teams that deal with misinformation directly, such as the content policy team, lack the power that other teams, like the public policy team, have to take swift action.

Zuckerberg’s response

While the company agreed earlier this year to keep King Zuck out of sight on controversial issues, this one is too big to ignore.

On the company’s quarterly earnings call, Zuckerberg addressed the issue and said the media is misinformed about the firm:

Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company.

Want to find out more?

This is by no means an exhaustive list of the issues uncovered by the Facebook Papers. So here’s a reading list for you to dive deeper:

  • Where it all started: the Wall Street Journal’s reporting on the Facebook Files.
  • Gizmodo’s reporting on Facebook’s climate change denier problem.
  • How Facebook is facing huge problems with teen user retention.
  • US lawmakers are keeping an eye on the social network’s algorithm and its impact on the world.
  • Facebook’s sales growth has slowed down, thanks to Apple’s new privacy policy.
  • Politico’s report on how little the company did to reduce violent posts in numerous countries.
