NZ shooting video was reuploaded to Facebook 300,000 times within 24 hours


According to Facebook, the livestreamed video of the Christchurch attack was first reported to moderators 29 minutes after the stream began, and 12 minutes after the live feed ended.

No one reported the livestream until after the broadcast had ended, Facebook has claimed.

Facebook’s revelations have raised questions about the company’s policy of community moderation. The delay in taking down the footage meant that it had already begun to circulate on other online platforms.


Before Facebook was alerted to the footage, the company said “a user on 8chan posted a link to a copy of the video on a file-sharing site”.

The company added that it had removed 1.5 million videos of the attack from its platform in the 24 hours after the deadly shootings – with 1.2 million of them blocked at the point of upload.

This means roughly 300,000 copies of the violent footage made it past Facebook’s upload filters before being taken down.

In a blog post by the company’s deputy general counsel, Chris Sonderby, the social media giant announced that it was attempting to use a range of technologies to detect when the video or similar videos were being uploaded.

However, on video platform YouTube, clips celebrating the New Zealand mosque shootings are easily evading the platform’s moderation efforts, despite a general clampdown across social media platforms.

Yesterday, Sky News identified videos made in support of the killings, including one which recreated the attack in the children’s game Minecraft alongside others that splice the attacker’s comments into other videos.

The technology used to automate the detection of particular files can struggle to identify video and image files that have been modified in even a minor fashion.
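To illustrate why exact-match detection is so brittle: the simplest form of automated file matching compares cryptographic hashes of uploads against hashes of known banned files, and altering even a single byte (re-encoding, cropping, adding a watermark) produces an entirely different hash. This is a minimal sketch of that limitation, not Facebook’s actual system; the byte string stands in for real file contents:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of a byte string as hex."""
    return hashlib.sha256(data).hexdigest()

# Placeholder standing in for a video file's raw bytes.
original = b"placeholder video bytes"

# Flip one bit in the final byte, mimicking a trivial modification.
modified = original[:-1] + bytes([original[-1] ^ 1])

# The two digests share nothing in common, so a hash blocklist
# built from `original` will not catch `modified`.
print(sha256_hex(original) == sha256_hex(modified))  # → False
```

This is why platforms have moved toward perceptual hashing, which fingerprints what a file looks or sounds like rather than its exact bytes, though such systems can still be defeated by heavier edits.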

The development of technology to address the spread of terrorist propaganda is something which online platforms have been investing in for years, but Home Secretary Sajid Javid has said social media companies “really need to do more to stop violent extremism being promoted on [their] platforms”.

Experts have warned that smaller and fringe platforms are being used to coordinate the spread of this material on the mainstream sites.

Jacob Davey, a researcher with the Institute for Strategic Dialogue, told Sky News that the copy of the video shared on 8chan played a significant role in its spread across the web.

He explained: “8chan is a platform which is intimately connected to the global alt-right movement, as well as being one of the engine rooms of internet culture.


“By broadcasting to 8chan the attacker was ensuring that he was reaching an audience with whom his messaging would resonate the most.

“This shows a deep and dark awareness of the powerful subculture that the extreme right has built, which revolves around platforms such as 8chan.”
