From “Moscow Mitch” memes for liberals to QAnon for conservatives: Facebook researchers saw how its algorithms led to disinformation

Facebook researchers in 2019 created three fictitious accounts to study the platform’s technology for recommending content in the News Feed. The first was for a user in India, the company’s biggest market. They then created two more test accounts to represent a conservative US user and a liberal one.

All three accounts engaged exclusively with content recommended by Facebook’s algorithms. Within days, the liberal account, nicknamed “Karen Jones,” began seeing “Moscow Mitch” memes, a reference to the nickname critics gave Republican Senator Mitch McConnell after he blocked bills aimed at protecting US elections from foreign interference.

The conservative account, “Carol Smith,” was steered toward QAnon conspiracy theories. Meanwhile, the Indian test user’s News Feed was filled with inflammatory material containing violent imagery related to India’s border skirmishes with Pakistan.

The Facebook researcher who managed the Indian test user’s account wrote in a report that year: “I have seen more images of dead people in the past three weeks than I have seen in my entire life,” adding that “the graphic content was recommended by [Facebook] via recommended groups, pages, videos and posts.”

Internal Facebook memos analyzing the progress of these test accounts were among the thousands of pages of leaked documents provided to Congress by lawyers for Facebook whistleblower Frances Haugen. A consortium of 17 US news organizations, including CBS News, reviewed the redacted versions of the documents received by Congress.

The three projects illustrate how Facebook’s News Feed algorithms can steer users toward divisive content. And they reveal the company was aware that its algorithms, which predict which posts users want to see and how likely they are to engage with them, can lead users “down the path of conspiracy theories.”

In a statement to CBS News, a Facebook spokesperson said the project involving the conservative test user is “a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform.”

In 2018, Facebook changed the algorithms that populate users’ news feeds to focus on what it calls “meaningful social interactions” in an effort to increase engagement.

But internal research found that engagement with posts “doesn’t necessarily mean a user actually wants to see more of something.”

“A state[d] goal of the move toward meaningful social interactions was to increase well-being by connecting people. However, we know that many of the things that generate engagement on our platform leave users divided and depressed,” a Facebook researcher wrote in a December 2019 report.

The document, titled “We are responsible for viral content,” noted that users had indicated what type of content they wanted to see more of, but the company ignored those requests for “business reasons.”

According to the report, internal Facebook data showed that users are twice as likely to see content reshared by others as content from pages they choose to like and follow. Users who comment on posts to express their displeasure are unaware that the algorithm interprets this as meaningful engagement and serves them similar content in the future, according to the report.

The News Feed algorithm takes several metrics into account, according to internal Facebook documents. Each is weighted differently, and content goes viral depending on how users interact with a post.

When Facebook first switched to meaningful social interactions in 2018, a “Like” awarded a post one point, according to one document. Signaling engagement with one of the emoji reaction buttons (“Love,” “Care,” “Haha,” “Wow,” “Sad” or “Angry”) was worth five points. A reshare was also worth five points.

Comments on posts, messages in groups, and RSVPs to public events gave the content 15 points. Comments, posts and shares that included photos, videos or links received 30 points.
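
Taken together, those values describe a simple weighted sum over interaction counts. The sketch below, in Python, illustrates how such a score could be computed from the point values reported in the documents; the weight table, the msi_score function and the example numbers are hypothetical illustrations, not Facebook’s actual code.

    # Hypothetical illustration of the 2018 "meaningful social interactions"
    # weights reported in the documents; names and structure are assumptions.
    MSI_WEIGHTS_2018 = {
        "like": 1,
        "reaction": 5,             # Love, Care, Haha, Wow, Sad, Angry
        "reshare": 5,
        "comment": 15,             # also group messages and event RSVPs
        "comment_with_media": 30,  # comments, posts, shares with photos, videos or links
    }

    def msi_score(interactions):
        """Sum the weighted interaction counts for a single post."""
        return sum(MSI_WEIGHTS_2018.get(kind, 0) * count
                   for kind, count in interactions.items())

    # Under these weights, a post drawing 200 reactions and 50 comments
    # (1,750 points) outranks one with 1,000 likes (1,000 points).
    print(msi_score({"reaction": 200, "comment": 50}))  # 1750
    print(msi_score({"like": 1000}))                    # 1000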

Facebook researchers quickly discovered that bad actors were gaming the system. Users were “posting increasingly outrageous stuff to get comments and reactions that our algorithms interpret as signs that we should let things go viral,” according to a December 2019 note from a Facebook researcher.

In a November 2019 internal memo, a Facebook researcher noted that the “Angry,” “Haha” and “Wow” reactions are strongly linked to toxic and divisive content.

“We consistently find that shares, angrys and hahas are much more common on shoddy civic news, civic misinformation, civic toxicity, health misinformation and anti-vax health content,” the Facebook researcher wrote.

In April 2019, European political parties complained to Facebook that the newsfeed change was forcing them to post provocative content and take extreme political positions.

A Polish political party told Facebook that changes to the platform’s algorithm had forced its social media team to shift its mix of posts from half positive and half negative to 80% negative and 20% positive.

In a memo titled “Political Parties’ Response to the 2018 Algorithm Change,” a Facebook staff member wrote that “many parties, including those that have gone strongly negative, are concerned about the long-term effects on democracy.”

In a statement to CBS News, a Facebook spokesperson said, “The goal of the meaningful social interactions ranking change is in the name: to improve people’s experience by prioritizing posts that inspire interactions, especially conversations between family and friends.”

The Facebook spokesperson also argued that the ranking change is not “the source of divisions in the world,” adding that “research shows that some partisan divisions in our society have been growing for many decades, long before platforms like Facebook existed.”

Facebook said its researchers constantly run experiments to study and improve the algorithm’s rankings, adding that thousands of steps are involved before content is shown to users. Anna Stepanov, Facebook’s head of app integrity, told CBS News that the rankings behind the News Feed evolve based on new data from direct user surveys.

The documents say Facebook changed some of the weights behind the News Feed algorithm in response to researchers’ feedback. An internal memo from January of last year shows that Facebook reduced the weight of the “Angry” reaction from five points to 1.5. It was lowered again, to zero, in September 2020.

In February, Facebook announced that it was starting tests to reduce the distribution of political content in the News Feed for a small percentage of users in the United States and Canada. The program was expanded earlier this month to include other countries.

