New Zealand mosque attack videos are still being hosted on Facebook

Facebook says it is still working to detect and remove edited versions of a live stream recorded by the alleged gunman.

Image: Fifty-one people died in the attacks in Christchurch

Videos of the New Zealand mosque attacks that left 51 people dead are still being hosted on Facebook six months after the tragedy.

Despite recent efforts to crack down on terrorist-related content, including new tech to help identify and remove it, more than a dozen clips from the mass shootings on 15 March have been seen by NBC News.

The footage was taken from a live stream recorded by accused gunman Brenton Tarrant, which was broadcast on the social network as the attacks on two mosques in Christchurch were carried out.

Image: Two mosques were targeted by a gunman

The clips show a first-person view of Tarrant, 28, using an assault weapon to shoot dozens of worshippers.

Many of the videos, which include edited sections and screen recordings of the original footage, had been on the platform since the week of the incident.

Some of them had been automatically covered by Facebook with a message warning users they feature "violent or graphic content" - but they had not been deleted.

Facebook has acknowledged that footage from the Christchurch attacks has remained online in the months since, but said it had removed and "hashed" two versions of the original video identified by NBC News.

Image: New Zealand held a period of national mourning in the wake of the deadly attacks

When a video is "hashed", a digital fingerprint of it is stored so that subsequent clips which are visually similar can be detected and automatically removed from both Facebook and Instagram.

However, there are instances when videos are so heavily edited that the tech can struggle to identify them as different versions of the same offending content.
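To illustrate how this kind of matching works in principle, the sketch below compares perceptual hashes of two video frames using the open-source Pillow and imagehash Python libraries. It is a simplified, hypothetical example rather than Facebook's actual system, and the match threshold is an assumed value, but it shows why a near-identical re-upload matches while a heavily cropped or re-filmed copy can fall outside the threshold.

```python
# Simplified sketch using the open-source Pillow and imagehash libraries.
# This is NOT Facebook's system; the threshold below is an assumed value
# chosen purely for illustration.
from PIL import Image
import imagehash

MATCH_THRESHOLD = 10  # assumed Hamming-distance cut-off


def is_near_duplicate(original_frame: str, candidate_frame: str) -> bool:
    """Return True if two video frames look like the same footage."""
    original_hash = imagehash.phash(Image.open(original_frame))
    candidate_hash = imagehash.phash(Image.open(candidate_frame))
    # A small Hamming distance means the frames are visually similar,
    # so the candidate is probably a re-upload of known footage.
    # Heavy cropping, overlays or filming a screen can push the distance
    # past the threshold, which is how edited copies evade detection.
    return (original_hash - candidate_hash) <= MATCH_THRESHOLD
```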

NBC News discovered the videos via internet security researcher Eric Feinberg, who said his software had detected hundreds of versions of the original live stream on Facebook and Instagram over the last six months.

Mr Feinberg, founder of cyber-intelligence company Gipec, questioned the effectiveness of the technology Facebook uses to identify terrorist content.

He said: "It's literally the same footage. The guy walks in with the music, the gun, the angle of the gun, the shots.

"What are they doing with their technology? If with all of their great AI tools they can't take down content like this that is consistent, what makes you think they can take down new violent content?"

Image: Facebook says it is still working to detect and remove versions of the original live stream

In a statement, a Facebook spokesman told Sky News: "We continue to automatically detect and prevent new uploads of this content on our platforms, using a database of more than 900 visually unique versions of this video.

"When we identify isolated instances of newly edited versions of the video being uploaded, we take it down and add it to our database to prevent future uploads of the same version being shared."

Facebook said it was improving its ability to combat "manipulated media" through a partnership with several US universities, allowing easier detection of such content.

Last week, the company announced it was also working with the Metropolitan Police to prevent the live-streaming of any future terror attacks.

Starting in October, the London force will provide Facebook with footage of its firearms training, taken from the body cameras worn by officers.

Image: New Zealand Prime Minister Jacinda Ardern will meet tech leaders to discuss combating extremist content online

It will be used to train artificial intelligence systems that the social media giant says will be able to detect and automatically remove live-streamed firearms attacks.

Facebook will also be among the companies present at the Christchurch Call To Action in New York on Monday, where tech leaders will meet with government representatives to discuss how to tackle extremist content online.

The event was initiated by New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron, with both in the city for the United Nations General Assembly.

It is being supported by the likes of Amazon, Google, Microsoft and Twitter.