Social media giant Facebook announced some updates today aimed at reducing the reach of harmful content on its platform. Among the most notable changes are those made to Groups. Going forward, the reach of Facebook Groups that “repeatedly share misinformation” will be reduced, so that fewer people see content from them in their News Feed.
It is worth adding that this is an important change: during the 2016 US elections, it was group pages that were used to distribute propaganda and misinformation.
Besides this, another change has been introduced to cut low-quality publishers off from users’ News Feeds. Facebook will now measure whether a publisher is popular only on its platform or is widely read elsewhere too, and will take this into consideration when deciding whether the publisher deserves News Feed promotion. This will let Facebook determine whether a publisher is valued at a broad level or has simply figured out how to rank well in the News Feed.
Furthermore, Facebook is making changes around fact-checking stories. While the Associated Press will be fact-checking some videos in the US, Facebook will be adding ‘Trust Indicators’ for users. These indicators will come from The Trust Project, a group built by news organisations that makes those determinations.
Since the 2016 elections, Facebook has been working to remove misinformation and propaganda from its platform. It has launched fact-checking tools, limited the spread of problematic stories, and flagged fake news. Today, Facebook has added the following new changes to its platform:
- Expanding the News Feed Context Button to images.
- Adding Trust Indicators to the News Feed Context Button on English and Spanish content.
- Adding more information to the Facebook Page Quality tab.
- Allowing people to remove their posts and comments from a Facebook Group after they leave the group.
- Combatting impersonations by bringing the Verified Badge from Facebook into Messenger.
- Launching Messaging Settings and an Updated Block feature on Messenger for greater control.
- Launching a Forward Indicator and Context Button on Messenger to help prevent the spread of misinformation.
Commenting on the development, Facebook said in a blog post, “For the past two years, for example, we’ve been working on something called the Safe Communities Initiative, with the mission of protecting people from harmful groups and harm in groups. By using a combination of the latest technology, human review and user reports, we identify and remove harmful groups, whether they are public, closed or secret. We can now proactively detect many types of violating content posted in groups before anyone reports them and sometimes before few people, if any, even see them.”