Facebook Dials Down the Politics for Users

Data released by Facebook last fall showed that during one week in October, seven of the ten most engaged-with pages on the platform were political in nature, including those of President Donald J. Trump, Fox News, Breitbart and Occupy Democrats.

Three years ago, Facebook announced that it would reduce the amount of content from news publishers and brands on the site, a change that, according to the company, refocused the platform on interactions between friends and family. At the time, Mark Zuckerberg, Facebook’s chief executive, said he wanted to make sure that Facebook’s products “are not only fun but good for people too.” He also said the company would take these measures even if they hurt the bottom line.

Still, Facebook users had no problem finding political content. Nongovernmental organizations and political action committees paid to deliver targeted political advertisements to millions of Americans in the months leading up to the November presidential election. Users created large numbers of private groups to discuss campaign issues, organize protests and support candidates. Until recently, Facebook’s own systems often recommended a variety of new political groups for users to join.

Facebook has pulled back on some of this activity over the past few months. After polls closed on Election Day, the company stopped accepting new political ads. And after the deadly January 6 riot at the Capitol, Mr. Zuckerberg said the company would stop recommending political groups to users in order to lower the temperature of global conversations.

Under the new test, a machine learning model predicts the likelihood that a post, whether it comes from a major news organization, a political pundit, or a friend or relative, is political. Posts classified as political will appear less often in users’ feeds.
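The mechanics of such a test can be illustrated with a short sketch. Facebook has not published details of its classifier, so everything below is hypothetical: the keyword heuristic stands in for a trained model, and the `demotion` parameter and scoring scheme are assumptions about how a predicted political probability could be used to down-rank posts.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    base_score: float  # the feed-ranking score the post would otherwise get

# Stand-in for a trained classifier. A real system would use a learned
# model; this keyword-overlap heuristic is purely illustrative.
POLITICAL_TERMS = {"election", "senate", "congress", "ballot", "protest", "campaign"}

def political_probability(post: Post) -> float:
    """Return a rough estimate of P(post is political)."""
    words = set(post.text.lower().split())
    return min(1.0, len(words & POLITICAL_TERMS) / 3)

def rank_feed(posts: list[Post], demotion: float = 0.5) -> list[Post]:
    """Order posts by score, demoting likely political ones.

    `demotion` controls how strongly a political classification
    suppresses a post; it is a hypothetical knob, not a known
    Facebook parameter.
    """
    def adjusted_score(post: Post) -> float:
        return post.base_score * (1.0 - demotion * political_probability(post))
    return sorted(posts, key=adjusted_score, reverse=True)

feed = rank_feed([
    Post("Senate election ballot results tonight", base_score=0.9),
    Post("Our dog learned a new trick", base_score=0.8),
])
print([p.text for p in feed])  # the non-political post now ranks first
```

In a production system, the probability would come from a model trained on labeled posts, and the demotion strength would presumably be tuned against engagement metrics during the trial period.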

It’s unclear how Facebook’s algorithm will define political content, or how much the changes will affect people’s feeds. Lauren Svensson, a Facebook spokeswoman, said the company would “continue to refine this model during the trial period to better identify political content, and we may or may not use this method over the longer term.”

It’s also unclear what will happen if Facebook’s tests reveal that reducing political content also decreases people’s use of the site. In the past, the company has postponed or watered down algorithm changes meant to reduce misleading and divisive content after finding that they caused people to open Facebook less often.
