by Glenn CHAPMAN

A senior Facebook executive on Tuesday said the world’s biggest social network unintentionally helped put Donald Trump in the White House but warned against dramatic rule changes.

The Trump campaign used Facebook effectively to rally support for his presidential run, and the social network should be mindful of that without making moves that stifle free political discourse, Andrew Bosworth said in a lengthy post on his personal Facebook page, prompted by The New York Times publishing an internal memo he wrote.

“So was Facebook responsible for Donald Trump getting elected?” Bosworth asked.

“I think the answer is yes, but not for the reasons anyone thinks.”

Bosworth contended Trump was not elected because of Russia or misinformation or Cambridge Analytica, but rather because he ran “the single best digital ad campaign I’ve ever seen from any advertiser.”

Since Facebook has the same ad policies in place now, the outcome of the 2020 election could be the same as it was four years ago, he added.

Facebook has maintained a hands-off policy on political ads, in contrast with Google which in November placed restrictions on how advertisers can target specific groups of voters.


“As tempting as it is to use the tools available to us to change the outcome, I am confident we must never do that or we will become that which we fear,” Bosworth wrote.

That doesn’t mean Facebook should not draw a line when it comes to how it is used, he reasoned. Clearly inciting violence, thwarting voting, and other blatant transgressions should be banned, but voters should be trusted to decide what kind of leaders they want to elect, according to Bosworth.

“If we don’t want hate-mongering politicians then we must not elect them,” Bosworth wrote.

“If we change the outcomes without winning the minds of the people who will be ruled then we have a democracy in name only. If we limit what information people have access to and what they can say then we have no democracy at all.”

War rooms
Bosworth’s comments came with Facebook under pressure to better protect user data and prevent its services from being used to spread misinformation, exacerbate social divides and sway political opinions as was the case in 2016 in the US.


Keeping the social network secure while thwarting misinformation and fending off competition with new features were among priorities laid out by executives at the Consumer Electronics Show in Las Vegas Tuesday.

“The innovation piece is important to us while we keep people in the company focused on security,” said Facebook vice president of global marketing solutions Carolyn Everson.

Facebook gave visitors a look at a revamped "Privacy Checkup" tool for users, which is rolling out this week.

“What is top of mind for me is regulation and how the privacy landscape is developing,” Everson said.

“We would like help on the regulatory front for privacy and security.”

Facebook priorities this year include preventing the platform from being used by malevolent actors to influence the US election, according to Everson.

The social network operates in nearly 200 countries, where scores of elections take place annually, and will apply lessons learned from those contests to the US vote, Everson said.


Facebook will once again have a “war room” to coordinate responses to election or voter manipulation efforts by state actors or others in real time.

“The war room model has been working around the world,” Everson said.

“We have 70 to 90 elections each year, so we have been getting better. War rooms are part of our strategy.”

Facebook will ban hyper-realistic deepfake videos ahead of the US election but will still allow heavily edited clips so long as they are parody or satire.

Everson re-affirmed Facebook will stick with its controversial policy of allowing politicians to post information proven to be false.

“We do not believe we are in the position to be the arbiter of truth, but we have been clear that we are continuing to evaluate how we can do it better,” Everson said.

“We don’t want people to mislead on our platform.”

Facebook last month took down a network of accounts it said was using fake identities while spreading pro-Trump messages on the social network and its Instagram service.

© Agence France-Presse
