
Elections are scheduled in several Asian countries this year, including Indonesia, India and the Philippines, and possibly even Singapore. In view of this, Facebook has promised to take a tougher stance on “fake news” on its social media platform and on its highly popular messaging application, WhatsApp.

However, the question remains whether Facebook’s new measures will succeed in preventing disinformation from spreading unchecked and affecting the results of the upcoming electoral contests, the South China Morning Post (SCMP) reports.

India will hold its General Elections from April to May this year. WhatsApp this week set limits on forwarded messages, in part due to calls to clamp down on fake news spread via the app, which led to lynchings in the country in 2018. Facebook, for its part, said that it would enforce more stringent rules in the country on election interference as well as political advertising.

The social media giant is considering imposing the same restrictions in other Asian countries, depending on each nation’s laws and the threats it faces. For the last year and a half, the company has had teams working on the ground in every nation in the region going to the polls this year: Thailand, India, the Philippines, Indonesia and Australia, SCMP quotes Katie Harbath as saying. Harbath, Facebook’s director of global politics and government outreach, says that the polls are “very much a top priority” for the company and that stricter rules will be imposed this year.


“We have teams working to prevent election interference on our services. This includes detecting and removing fake accounts, limiting the spread of misinformation, tackling coordinated abuse, preventing foreign interference, and bringing more transparency and accountability to advertisers. We have made big investments to make sure we are prepared to handle whatever might happen.”

Critics seem unsure. SCMP also reports that Maria Ressa, award-winning journalist and founder of the Philippine news website Rappler, has taken a wait-and-see stance. “Will Facebook take on the fundamental responsibility of gatekeeper, now that it is the world’s biggest distributor of news? Think about it: when journalists were the creators and distributors of news, we took that role seriously. At least you had the facts, and that’s what protected democracy. When tech took over, they shied away from that responsibility. They let lies spread faster than truth. In the real world that’s a crime, in the virtual world that’s impunity.”

Furthermore, Ressa opined that yesterday’s journalists have been replaced by content moderators who “at best… have no cultural context of the material that they’re being given seconds to evaluate.”

One thing that Facebook has done is to employ more fact-checkers in Pakistan, India, Indonesia and the Philippines, and to launch campaigns against disinformation in these nations as well as in Singapore and Thailand.


Harbath said, “We now have teams that understand the nuances of the election and the context of the region…. We’ve also invested heavily in digital literacy to make sure that we are helping people understand what they’re seeing online.”

Facebook has come under fire in Asia due to the use of its platforms for promoting disinformation, propaganda and smear campaigns. The company has raised the number of its employees working in content security from 10,000 two years ago to 30,000 today. Harbath said that more personnel would be assigned to Asia in time for the polls.

Many believe that it is high time for Facebook to impose more restrictions to curb fake news. Malaysia last year passed legislation allowing for punitive action against anyone who spreads fake news, and Singapore is looking into a similar law. In Myanmar, the United Nations said that Facebook had been instrumental in spreading anti-Rohingya sentiment, and in Sri Lanka, the social media giant is believed to have failed to prevent the hate speech that fuelled anti-Muslim rioting.

The new policies of the company arose in part from events in Asia. Violence in Myanmar gave rise to the new policy of removing misinformation that could bring about “offline harm.” Events in Asia also prompted the company to update its “credible violence” policy concerning the risks faced by specific groups, which include journalists.


In India, meanwhile, changes to the company’s policy on hate speech were made based on abuse “on and off the platform.” According to the company, “We updated our policy to add ‘caste’ to our list of protected characteristics, and we now remove any attack against an individual or group based on their caste.”

However, Ressa maintains that this may not be enough. “The people who created Facebook, the people saying that they want to protect free speech have never been in parts of the world where you cannot speak because you’re afraid. In the global south, people can die for what they say.”

But she is partnering with the social media platform to address these issues. “I am really critical because I have felt the misuse personally. But I am seeing them fix it, and I am helping them fix it because what they have created is incredibly powerful. Facebook is a huge, growing organization that is still learning. We’re dealing with some really smart people – and they have been put on notice.”

Read related: Facebook takedown in Myanmar – cracking down on hate speech posts and pages linked to the military

https://theindependent.sg/facebook-takedown-in-myanmar-cracking-down-on-hate-speech-posts-and-pages-linked-to-the-military/