Facebook on Wednesday ramped up its battle against misinformation, taking aim at groups spreading lies and adding “trust” indicators to news feeds.

Moves outlined by Facebook vice president of integrity Guy Rosen were described as part of a strategy launched three years ago to “remove, reduce and inform” when it comes to troublesome content posted across the leading social network’s family of services.

“This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share,” Rosen said.

An array of updates included cracking down on misbehaving groups and those who run them, as well as making it harder to impersonate others.

The leading social network indicated it will be tougher on inappropriate content in groups, which may not be seen by the public but which can circulate hoaxes and promote abusive or violent actions.


When reviewing groups to decide whether they should be taken down, Facebook will more closely scrutinize which posts administrators approve and which they reject, to determine whether the social network’s standards are being violated.

Facebook will also add a “group quality” feature that provides an overview of content that has been flagged, removed or found to be false information, according to Rosen.

Starting Wednesday, if people in a group repeatedly share content deemed to be false by independent fact-checkers, Facebook will reduce that group’s overall news feed distribution, Rosen said.

The internet titan also launched a collaboration with outside experts to find more ways to quickly fight misinformation.

An idea Facebook has been exploring since 2017 involves enlisting members of the social network to pinpoint journalistic sources that corroborate or contradict online content.

Facebook added a section to its Community Standards website where people can track updates made by the social network.

“Over the last two years, we’ve focused heavily on reducing misinformation on Facebook,” Rosen said.


The “trust” indicators to be added to news feeds were developed by a consortium of news organizations known as the Trust Project, and offer information on a news organization’s ethics and other standards for fairness and accuracy, according to Facebook.

Facebook also said it would seek to stop impersonations by bringing its “verified badge” to Messenger.

“This tool will help people avoid scammers who pretend to be high-profile people by providing a visible indicator of a verified account,” Rosen said.


© Agence France-Presse

By AFP