Here’s Why Facebook Deleted 1.5 Million Videos From the New Zealand Terror Attack

People gather in front of floral tributes at a makeshift memorial for victims of the March 15 mosque attacks, in Christchurch on March 17, 2019. AFP PHOTO

As New Zealand recovers from terrorist attacks against two mosques in Christchurch, Facebook has announced that it deleted 1.5 million videos of the shootings in the first 24 hours following the massacre.

The tech company said in a tweet on Sunday that it prevented 1.2 million videos from being uploaded to its platform, which has more than 2.2 billion global users.


“In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload,” Facebook said in a tweet.

The disturbing videos, some running nearly 17 minutes, purportedly showed the gunman walking into a mosque and opening fire in an attack that left at least 49 people dead and many more injured.

However, the figures imply that some 300,000 copies of the video were available to watch for at least short periods before Facebook removed them. They also reveal how quickly such provocative and graphic content circulates online, and the challenges facing social media companies such as Facebook as they try to stamp it out.


GRAPHIC CONTENT

The company said it is also removing all edited versions of the video, even those that do not show graphic content, out of respect for the people affected by the mosque shooting and the concerns of local authorities.

It is against media laws for online platforms and social media sites to run provocative and explicitly graphic videos and photos, as such content can cause distress amongst their audiences. Affected parties can also launch petitions against a site, which is a major reason Facebook acted swiftly.


The victims’ names have not yet been made public. While a preliminary list of the victims has been shared with families, New Zealand police said their bodies have not yet been released.

Video of the brutal attack was livestreamed on Facebook by the suspected gunman, Brenton Tarrant, an Australian national who appeared in court this weekend and has been charged with murder. Tarrant is likely to face more charges when he goes before the Christchurch High Court on April 5.


Video of the attack showed the gunman taking aim with assault-style rifles painted with symbols and quotes used widely by the white supremacist movement online.

An online manifesto spewed a message of hate, replete with references familiar to extremist chat rooms and internet trolls.

The number of wounded also increased to 50.

Of those, 34 remain hospitalized at Christchurch Hospital and 12 are in critical condition, said Greg Robertson, the hospital’s chief of surgery.
