Facebook Defends Live Stream Of Christchurch Shooting

Facebook has released new details about how it dealt with the live-streamed video of the New Zealand mosque shooting that left 50 people dead last Friday.

The social media giant said roughly 200 people watched the Christchurch footage live, but none of them reported it to moderators.

The gunman’s live broadcast ran for 17 minutes, but the first user report wasn’t made until 12 minutes after it ended, highlighting the challenge social media companies face in policing real-time footage.

After being notified by police, Facebook removed the video “within minutes”, said the company’s deputy general counsel, Chris Sonderby.

“No user reported the video during the live broadcast,” said Sonderby, before revealing that the footage was watched about 4,000 times in total before being taken down.

“We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people.”

Facebook previously said that in the 24 hours following the mass shooting, 1.5 million videos of the attack were removed and of these “over 1.2 million were blocked at upload”, meaning roughly 300,000 copies made it onto the site before being taken down.

Since the tragedy, many people have questioned why Facebook in particular wasn’t able to detect the video more quickly.

On Tuesday, New Zealand Prime Minister Jacinda Ardern expressed frustration that the footage could still be found online four days after the attack, and said she had spoken with Facebook’s Chief Operating Officer, Sheryl Sandberg, about the issue.

“It is horrendous and while they’ve given us those assurances, ultimately the responsibility does sit with them.”

Facebook uses artificial intelligence and machine learning to detect violent and disturbing content that violates its standards, but it also relies heavily on users to flag content.

During a live stream, users can report a video by answering a series of questions about the type of content and are also told to contact law enforcement if someone is in immediate danger.

Sonderby said that before Facebook became aware of the video, a user on 8chan had already posted a link to a copy of it on a file-sharing site, which is how it became so widespread.

Facebook’s former chief security officer, Alex Stamos, said in a series of tweets one day after the shooting that tech companies face many challenges when it comes to controlling content.

“Each time this happens, the companies have to spot it and create a new fingerprint,” he said.

“What you are seeing on the major platforms is the water leaking around thousands of fingers poked in a dam.”

Stamos estimated that the larger tech companies are blocking more than 99 percent of the videos at upload, but this “is not enough to make it impossible to find.”
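The “fingerprint” Stamos refers to is a hash-based signature of a known file. As a rough illustration only (the function names below are hypothetical, and real platforms use more robust perceptual-hashing systems rather than this exact-match approach), the following Python sketch shows why simple fingerprints are so easy to evade: an identical copy is caught, but any edited or re-encoded copy produces a brand-new hash that has to be spotted and fingerprinted all over again.

```python
# Illustrative sketch only: exact-match fingerprinting of known videos.
# Function names are hypothetical; real platforms use perceptual hashing,
# which tolerates some edits but can still be evaded.
import hashlib

known_fingerprints = set()  # fingerprints of videos already flagged for removal

def fingerprint(video_bytes: bytes) -> str:
    """Hash the raw file bytes; identical copies produce identical fingerprints."""
    return hashlib.sha256(video_bytes).hexdigest()

def should_block(video_bytes: bytes) -> bool:
    """Block an upload if it matches a fingerprint of known banned content."""
    return fingerprint(video_bytes) in known_fingerprints

# A flagged video is fingerprinted once...
original = b"...raw bytes of the offending video..."
known_fingerprints.add(fingerprint(original))
print(should_block(original))   # True: an exact copy is blocked at upload

# ...but even a one-byte change (a re-encode, a crop, a watermark) yields a
# completely different hash, so each altered copy slips through until it is
# spotted and fingerprinted again - the "fingers poked in a dam" problem.
altered = original + b"\x00"
print(should_block(altered))    # False: the altered copy evades the filter
```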
