The Associated Press and other news organizations obtained copies of redacted documents presented by a former Facebook data scientist. These were the same documents presented at a United States Senate subcommittee hearing.
Frances Haugen, the former data scientist, submitted the redacted documents to the Securities and Exchange Commission through her legal counsel. Haugen also presented the same documents during the Senate Commerce, Science, and Transportation Committee's subcommittee hearing on Consumer Protection, Product Safety, and Data Security on October 5, 2021.
The leaked documents broadly point to Facebook's lack of capacity, or unwillingness, to mitigate its platform's real-world consequences. Aside from showing Facebook's failure to curb hate speech and misinformation in the US, they also revealed severely ineffective moderation of the company's Indian market.
According to AP, Facebook has been "selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content," in India. Moderation and fact-checking were also severely lacking on WhatsApp and Instagram; both platforms are owned and operated by Facebook Inc.
Made up of internal company reports, research as recent as March 2021, and internal memos dating as far back as 2019, the leaked documents showed that Facebook has struggled to moderate its platform's content in India effectively. India, with close to 400 million users, is Facebook's largest growth market. It is also the biggest democracy outside the West and has become increasingly vulnerable to online misinformation and hate speech. The documents revealed that Facebook has known for a while that India is one of the "at-risk countries." "Yet, Facebook didn't have enough local language moderators or content-flagging in place," AP reported.
Aside from the lack of local moderators, the platform's recommended content feature and sharing algorithm also compounded misinformation and the circulation of hate speech, much as QAnon users once found each other on Facebook through the "recommended" feature.
In a statement to the AP, Facebook's spokesperson claimed that the company had invested significantly in technologies to combat hate speech in various languages. These investments translated to a "reduced amount of hate speech that people see by half," the spokesperson added.
However, a July 2020 report by Facebook's internal research team "showed the share of inflammatory content skyrocketed, starting in December 2019."
One report among the leaked redacted documents, entitled "An Indian Test User's Descent into a Sea of Polarizing, Nationalistic Messages," chronicled a test account created on Facebook India. During the three weeks the test account was kept active, India was shocked by a militant attack in the disputed Kashmir region that killed over 40 Indian soldiers. The report described the test account's newsfeed as "a near-constant barrage of polarizing nationalist content, misinformation, and violence and gore." According to the leaked documents, the employee behind the test account identified "blind spots in the local language content."
Aside from misinformation, the leaked documents also showed that Facebook is plagued by a growing volume of anti-Muslim propaganda perpetrated by extremist Hindu groups.
Beyond Facebook's supposed inability to address misinformation and other online problems that can translate into real-life violence, many critics point to a possible political agenda behind Facebook's inaction in India. Experts and critics believe that Facebook is deliberately holding back on strict moderation to avoid the ire of the ruling Bharatiya Janata Party.
Jeff Horwitz of The Wall Street Journal reported that despite Facebook's internal research team repeatedly uncovering issues on the platform and recommending appropriate content moderation guidelines, the company still "didn't fix them." "The documents offer perhaps the clearest picture… of how broadly Facebook's problems are known inside the company, up to the chief executive himself," Horwitz added.