The macabre video of a mass shooting at a mosque in New Zealand on Friday is still online on many tech platforms in India, as curiosity and anti-Muslim bigotry ensure the video is uploaded, despite social media companies pledging to remove it.
BOOM's WhatsApp helpline received several messages from readers asking us whether the video was real. We were also able to find the video on Facebook, YouTube and Twitter, 24 hours after the attack.
Tech platforms have come under fire for allowing live footage of the shooting spree at the Al Noor mosque in Christchurch, New Zealand on March 15, 2019, to go viral globally.
According to news reports, 17 minutes of the attack were live-streamed on Facebook before the platform pulled the video down and suspended the Facebook and Instagram accounts of the shooter, 28-year-old Brenton Harrison Tarrant.
Tarrant, an Australian citizen by birth, used LIVE4, a mobile streaming app, to broadcast directly to Facebook from a camera strapped to his body, giving viewers the vantage point of the gunman.
Many mistook the shocking visuals for a video game.
Copies of the video soon went viral on Facebook, Twitter, YouTube and messaging app WhatsApp.
Forty-nine people were killed and dozens wounded in mass shootings at two mosques in Christchurch; 41 of them were killed at the Al Noor mosque alone. The country's Prime Minister, Jacinda Ardern, called it a terrorist attack.
New Zealand police on Friday urged people to refrain from sharing the video and said they were working to have it removed.
Police are aware there is extremely distressing footage relating to the incident in Christchurch circulating online. We would strongly urge that the link not be shared. We are working to have any footage removed.— New Zealand Police (@nzpolice) March 15, 2019
In its initial response, Facebook said it was alerted by local police about the livestream after which it acted.
Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video. We're also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.— Facebook Newsroom (@fbnewsroom) March 15, 2019
Alphabet Inc's video hosting site YouTube said it was working vigilantly to remove any violent footage on its platform.
Our hearts are broken over today’s terrible tragedy in New Zealand. Please know we are working vigilantly to remove any violent footage.— YouTube (@YouTube) March 15, 2019
On Twitter, however, the video was easily searchable with phrases such as "mosque attack" and "new zealand mosque attack".
Removing the videos, which many fear could inspire copycat attacks or reprisals, is turning out to be a game of whack-a-mole for tech companies, as moderation requires human intervention. On most platforms, a video can be removed only after it has been flagged.
BOOM also found posts where the video was used to fuel anti-Muslim bigotry.

A forward we received on our helpline falsely claimed that the attacker was a Muslim, and could not distinguish between Islam and other faiths.
We also found a Facebook user who uploaded the video with a caption that said 'New Zealand wants freedom from Islam'. The post had 66 shares at the time of writing the article.
We also found the video on Twitter, accompanied by a similar anti-Islam tweet.