After the January 6 riots at the Capitol, Meta decided to limit the political content shown to users. The move is slowly reshaping political discourse on the world’s largest social media platform, even though the company has abandoned what is arguably the most effective method: turning off all political content recommendations.
Meta’s effort over the past 18 months to limit political content and divisive topics on the platform was detailed in internal documents reviewed by The Wall Street Journal.
Initially, Facebook overhauled the way it promotes political and health-related content. Surveys repeatedly showed that users were tired of conflict, so CEO Mark Zuckerberg decided to prioritize posts that users deemed worthy of their time.
However, Meta’s leadership still found that insufficient. At the end of 2021, weary of persistent criticism that Facebook was biased toward political content, Zuckerberg and his board decided to demote posts on “sensitive” topics as much as possible in the news feed, an unprecedented step.
At the time, Facebook and YouTube were accused of being politically biased and commercially motivated to amplify hate and controversy. For years, advertisers and investors had pressed the company to “whitewash” its “messy” role in politics, according to the WSJ.
The plan to turn off political content, however, had clearly unpredictable consequences. The report found that views for “high-quality news publishers” such as Fox News and CNN dropped significantly, while content from less trusted sources increased. User complaints about misinformation rose, and charitable donations through Facebook fell sharply in the first half of 2022. Users certainly did not welcome these outcomes.
Internal analysis showed that Facebook could achieve some of its goals by limiting political, civic, and social content in the news feed, but doing so would be costly and inefficient.
By the end of June, Zuckerberg decided to withdraw the radical plan. Political controversy could not be prevented by “blunt force”; instead, the platform would slowly change the way the news feed spreads “sensitive” topics such as health and politics.
“Over the years, we experimented with different approaches, eventually deciding to make small changes slowly but surely to limit political content and give users the experiences they wanted,” said Dani Lever, a spokesperson for Meta.
The current approach has reduced the amount of political content users see. Meta estimates politics accounts for less than 3% of all content views in the news feed, down from 6% around the time of the 2020 election.
According to Ravi Iyer, a former Meta data science manager and executive director of the Psychology of Technology Institute at the University of Southern California, Meta’s current approach is very different from the engagement-focused model that Facebook and other major social media platforms apply. Iyer said platforms should focus more on how they allow certain content to go viral rather than making subjective decisions about what should be taken down.
“Letting employees judge good and bad often creates more problems than it solves. Our goal is to receive less criticism,” Iyer said.
According to experts, Meta’s recent move risks alienating some digital publishers. Courier Newsroom, an American media company, said it struggled to gain traction on Facebook during the 2022 midterm elections.
Despite increasing its article output by 14.5% in October, the company saw a 42.6% drop in unpaid impressions from the month before.
“Facebook remains one of the most powerful and far-reaching distribution platforms in the world. However, by further limiting the reach of trusted news publishers on the platform, Facebook will exacerbate the information crisis in the US that it has helped create,” said Tara McGowan, a representative of Courier Newsroom.
At Mother Jones, a news site focused on socio-political issues, total views on Facebook in 2022 were only about 35% of those in the same period a year earlier. Monika Bauerlein, CEO of Mother Jones, said: “It sucks to realize how much power a single tech company has over the news people can access.”
According to Dan Bongino, an American conservative commentator, Facebook is wrong to change its strategy on political content, in part because the majority of older users tend to care about political issues.
“They could have taken advantage of older users but didn’t. Instead, they announced, ‘I have this idea: gather older users who like conservative content and remove them from the platform,’” Bongino said.
According to the WSJ, Facebook has for many years maintained a conflicted relationship with politics. Few topics make users as angry, or as engaged, on the social network.
Speaking in 2019, Zuckerberg defended the social network’s role in politics and society. “I believe people should be able to use our services to discuss a number of issues of interest, from religion to foreign policy and crime,” the CEO said, adding that Facebook’s role in public discourse was healthy.
However, a 2021 report showed that Facebook’s algorithm had long encouraged the creation of “infuriating, unreliable content”. After the 2020 election and the January 6, 2021 riots, the company came under increasing pressure over accusations that it was biased toward political content. Zuckerberg grew tired of the problem.
“Politics is almost ingrained in everything. Users are fed up with politics, not only in the US but around the globe,” Zuckerberg said, adding that Facebook is looking to limit political content in the news feed.
In February of that year, Facebook announced that it would start showing less political content to “a small percentage of users” as an experimental measure. The goal was not to limit political discussion but to “respect each person’s preference for it”. Trials began in the US, Brazil, and Canada.
Six months later, the company said it had made some progress in identifying “posts that people find valuable”. The platform also said it would put less emphasis on shares and comments going forward.
According to experts, it is a positive sign that Facebook addressed misinformation by reducing its spread across the platform, because that approach does not involve censoring opinions or demoting content through a less accurate classification system.
Internal data shows that views of political content in the news feed dropped by nearly a quarter, as did comments on such posts. Negative user reactions also fell by nearly a quarter across the platform, while bullying, disinformation, and violent imagery were significantly curbed. However, while the majority of Facebook users felt their feeds had become “cleaner”, login rates fell by more than 0.18%.
Shortly after, users continued to complain about vulgar and misleading content. Many surveys show that they are increasingly recognizing Facebook as causing unhealthy social effects.
The board and Zuckerberg accordingly decided to broadly remove civic content from the news feed. However, turning off political content is not as easy as flipping a switch, and the changes did not directly address controversies arising in the many hostile forums on the platform.
Depending on the combination of measures, Facebook projected that its traffic to Fox News, MSNBC, The New York Times, Newsmax, The Atlantic, and The Wall Street Journal would drop between 40% and 60%. The impact could be diminished if publishers found ways to adapt.
However, even though Facebook has tried to limit sensitive content, research shows the platform has not been able to convince users that it is politically neutral. This has prompted Zuckerberg to once again rethink how to handle sensitive content.
“We’ve been thinking about the line between sensitive and non-sensitive content,” said Iyer, the former Facebook data science manager.