Facebook appears to be testing a new feature that will establish the trustworthiness of its users. This was first reported by The Washington Post’s Elizabeth Dwoskin, who says the rating system has been developed over the past year to curb the fake news that has been plaguing Facebook. Under this metric, users who report suspicious content would be rated on a scale of zero to one.
According to the report, Facebook product manager Tessa Lyons acknowledged that the feature is being tested. She said it was introduced to combat fake news and will allow the social network to evaluate users through the resulting scores, although the score will not be the only parameter used to assess them.
According to Lyons, “A user’s trustworthiness score isn’t meant to be an absolute indicator of a person’s credibility, nor is there a single unified reputation score that users are assigned.” Instead, she explained, Facebook will also look at behavioural clues before it takes action against bad actors and publishers of fake news. Lyons was also reported as saying that users will be assigned multiple reputation scores depending on how they tend to flag content. Here, Facebook will consider both sides: the ratings of trustworthy publishers as well as those of malicious actors.
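Facebook has not disclosed how these scores are actually computed. Purely as a hypothetical illustration of the idea Lyons describes, the sketch below shows how flags on a piece of content might be weighted by each reporter’s assumed trust score between zero and one, so that reports from users who reliably flag false content carry more weight than reports from users who flag things they merely dislike. All names, values, and the averaging formula here are invented for illustration and do not reflect Facebook’s system.

```python
# Hypothetical illustration only: Facebook has not published its scoring
# method. This sketch weights content flags by each reporter's assumed
# trust score in [0, 1], so flags from reliable reporters count for more
# than flags from users who habitually flag content they disagree with.

from dataclasses import dataclass


@dataclass
class Reporter:
    name: str
    trust: float  # assumed reputation score between 0.0 and 1.0


def weighted_flag_signal(reporters: list[Reporter]) -> float:
    """Return a 0-1 signal estimating how seriously to treat the flags,
    based purely on the (hypothetical) trust scores of those who flagged it."""
    if not reporters:
        return 0.0
    # Average trust of the flaggers: many low-trust flags move the
    # signal far less than a few high-trust flags.
    return sum(r.trust for r in reporters) / len(reporters)


if __name__ == "__main__":
    flags = [
        Reporter("reliable_reporter", 0.9),
        Reporter("casual_user", 0.5),
        Reporter("habitual_false_flagger", 0.1),
    ]
    print(f"flag signal: {weighted_flag_signal(flags):.2f}")  # 0.50
```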
Facebook’s latest move towards countering fake news comes as the social media platform looks to filter publishers on the basis of credibility. Twitter has also been cracking down on bots and individual accounts that have been known to promote fake news and abusive content.