
Facebook Launches a Series of Tests to Inform Future Changes to Its News Feed Algorithms

Facebook announced this morning it will be rolling out a series of News Feed ranking tests that will ask users to provide feedback about the posts they’re seeing, feedback that will later be incorporated into its News Feed ranking process. The announcement comes after Facebook was grilled by lawmakers about the role it played in the attack on the U.S. Capitol.

Specifically, Facebook will be looking to learn which content people find inspirational, what content they want to see less of, and what other topics they’re generally interested in, among other things. This will be done through a series of global tests, one of which will place a survey directly beneath a post asking, “How much were you inspired by this post?”

The tests are meant to help surface more inspirational posts closer to the top of the News Feed. Another test will focus on tailoring the News Feed experience to better reflect what people want to see.

Today, Facebook prioritizes content from friends, Groups, and Pages you’ve chosen to follow, but it algorithmically decides whose posts to show you, and when, based on a variety of signals. These include both implicit signals, such as how regularly you engage with a person’s content, and explicit ones, such as whether you’ve marked them as a “Close Friend” or “Favorite.”
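Facebook hasn’t published how these signals are actually weighted, but a minimal sketch can illustrate the general idea of blending implicit and explicit signals into a single ranking score. Everything here, the field names, the weights, and the scoring formula, is an assumption for illustration, not Facebook’s method.

```python
# Hypothetical illustration only -- Facebook has not published its ranking code.
# Sketch of combining implicit signals (engagement history, recency) with
# explicit signals ("Close Friend" / "Favorite") into one post score.
# All weights and field names below are assumptions.

from dataclasses import dataclass


@dataclass
class PostSignals:
    recent_engagement: float  # implicit: 0.0-1.0, how often the viewer engages with this author
    recency_hours: float      # implicit: hours since the post was published
    is_close_friend: bool     # explicit: viewer marked the author as a "Close Friend"
    is_favorite: bool         # explicit: viewer marked the author as a "Favorite"


def rank_score(signals: PostSignals) -> float:
    """Combine implicit and explicit signals into one score (higher = shown earlier)."""
    score = 0.0
    score += 2.0 * signals.recent_engagement      # assumed weight for engagement history
    score += 1.0 / (1.0 + signals.recency_hours)  # fresher posts score higher
    if signals.is_close_friend:
        score += 1.5                              # assumed boost for explicit designations
    if signals.is_favorite:
        score += 2.5
    return score


if __name__ == "__main__":
    posts = {
        "acquaintance": PostSignals(0.1, 2.0, False, False),
        "favorite": PostSignals(0.6, 8.0, True, True),
    }
    # Posts from explicitly favored authors rank above those ordered by recency alone.
    for name, s in sorted(posts.items(), key=lambda kv: rank_score(kv[1]), reverse=True):
        print(f"{name}: {rank_score(s):.2f}")
```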

Today, users can still scroll News Feeds that reinforce their views, no matter how problematic. And with the growing tide of misinformation, the News Feed has gone from just placing users into a filter bubble to presenting a full alternate reality for some, often populated by conspiracy theories.

This test from Facebook doesn’t necessarily tackle this problem head-on, but instead looks to gain feedback about what users want to see, as a whole. Facebook says that it will begin asking people whether they want to see more or fewer posts on certain topics, like Cooking, Sports, Politics, and more. Based on users’ collective feedback, Facebook will adjust its algorithms to show more content people say they’re interested in and fewer posts about topics they don’t want to see.
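Facebook hasn’t detailed how those survey answers feed back into ranking. One plausible reading, sketched below purely as an assumption, is that each “more” or “less” response nudges a per-user topic weight that later scales a post’s base score up or down.

```python
# Hypothetical illustration only -- the feedback mechanism, learning rate, and
# topic names are assumptions, not a description of Facebook's system.

from collections import defaultdict

LEARNING_RATE = 0.1  # assumed step size per survey response


def update_topic_weights(weights, topic, wants_more):
    """Nudge the weight for a topic up or down based on one survey answer."""
    delta = LEARNING_RATE if wants_more else -LEARNING_RATE
    weights[topic] = max(0.0, weights[topic] + delta)  # never drop below zero
    return weights


def adjusted_score(base_score, topic, weights):
    """Scale a post's base ranking score by the viewer's learned topic weight."""
    return base_score * weights[topic]


if __name__ == "__main__":
    # A default weight of 1.0 means "no adjustment yet" for topics never surveyed.
    weights = defaultdict(lambda: 1.0)
    update_topic_weights(weights, "Cooking", wants_more=True)
    update_topic_weights(weights, "Politics", wants_more=False)
    print(adjusted_score(10.0, "Cooking", weights))   # boosted above the base score
    print(adjusted_score(10.0, "Politics", weights))  # demoted below the base score
```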

The area of politics, specifically, has been an issue for Facebook. For years, the social network has been accused of inflaming political discourse, polarizing and radicalizing users through its algorithms, distributing misinformation at scale, and encouraging an ecosystem of divisive clickbait, as publishers chased engagement rather than fairness and balance in their reporting.

There are now entirely biased and subjective outlets posing as news sources that benefit from algorithms like Facebook’s. Now, the company says it will work to better understand what content is linked to negative News Feed experiences, including political content. In this case, Facebook may ask users, on posts that draw a lot of negative reactions, what sort of content they want to see less of.
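The article doesn’t say how Facebook would decide which posts get that prompt. As a rough sketch, and only as an assumption, one could imagine a simple threshold on the share of negative reactions a post receives before the feedback question is attached.

```python
# Hypothetical illustration only -- the reaction set and threshold are invented;
# Facebook has not described how it picks posts for this prompt.

NEGATIVE_REACTIONS = {"angry"}   # assumed set of reactions treated as negative
NEGATIVE_SHARE_THRESHOLD = 0.3   # assumed: prompt when 30%+ of reactions are negative


def should_show_feedback_prompt(reaction_counts: dict) -> bool:
    """Return True if the post's share of negative reactions crosses the threshold."""
    total = sum(reaction_counts.values())
    if total == 0:
        return False
    negative = sum(count for name, count in reaction_counts.items()
                   if name in NEGATIVE_REACTIONS)
    return negative / total >= NEGATIVE_SHARE_THRESHOLD


if __name__ == "__main__":
    print(should_show_feedback_prompt({"like": 50, "angry": 30}))  # True: 37% negative
    print(should_show_feedback_prompt({"like": 90, "angry": 5}))   # False: ~5% negative
```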
