Facebook receives flak for failing to block 300,000 uploads of the New Zealand terrorist attack live stream video

The 300,000 videos that slipped through and were shared represent a 20% failure rate in Facebook’s ability to detect such content

Photo: Facebook screengrab

FBNewsroom – The statistics are out, showing just how powerful social media can be; yet with that kind of power comes an equal measure of responsibility.

On Sunday, March 17, Facebook New Zealand’s Mia Garlick gave an update via Facebook Newsroom’s Twitter account, stating that 1.5 million videos of the New Zealand attack live stream had been taken down within 24 hours.

The 28-year-old shooter, Brenton Tarrant, used a head-mounted camera to live stream his actions on Facebook, banking on the world’s most-used social media platform to spread footage of his terrorist attack.


According to Facebook Newsroom’s Twitter updates, the company closed the attacker’s account within an hour of the attack; however, that hour gave the online community time to mass-reproduce the content, not just on Facebook but on Twitter and YouTube as well.

This was their Twitter update: “In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…”


The 300,000 videos that slipped through and were shared (the gap between the 1.5 million videos removed and the 1.2 million blocked at upload) represent a 20% failure rate in Facebook’s ability to detect such content.

Tech industry outlet TechCrunch reported finding several videos still available on Facebook more than 12 hours after the massacre.

Netizens are alarmed at the statistics provided by Facebook.

People like Mark Rickerby are calling the numbers “vanity metrics,” arguing they do not show whether merely blocking the video was effective in truly addressing the issue.

He wrote: “These numbers are vanity metrics unless they’re shown alongside engagement and video views from the live stream. This is bordering on misinformation which gives us no way of understanding the true scale of distribution and amplification.”


Craig Silverman also asked for more than just the 300,000 number that was announced. He said, “Thanks for sharing stats. Can you say how many views, shares, reactions, and comments the 300k that made it onto your platform generated? And what was the average length of time that these copies of the video were online before being removed?”


Netizen Ciara Mitchell recounted how she tried to report the video when it surfaced in her newsfeed, only to receive a message saying the content didn’t violate any guidelines.



There is also the ease with which social media users can re-share content: anyone can record the live stream on a phone and pass the footage along on other platforms such as WhatsApp or WeChat, and a torrent of the video remains available for download on sites such as The Pirate Bay.

Facebook, where the footage originated, will have to make substantial changes to its systems to ensure the public’s safety.

The social media giant is also removing all edited versions of the video, according to Ms Garlick. “Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.”



Facebook’s “serious mistake” addressed by Singapore

In November last year, Facebook was also grilled by Singapore’s Senior Minister of State for Law and Health, Edwin Tong, for failing to remove a post that sparked racial hatred in Sri Lanka.

Facebook’s vice-president of policy solutions admitted that “We make mistakes … serious mistakes; our responsibility is to reduce the number of mistakes.”
