Earlier today, the New Zealand Herald published a letter from Facebook COO Sheryl Sandberg on how the company will respond to the deadly terrorist attack in Christchurch two weeks ago. In the letter, Sandberg outlines three steps the company will take, including restrictions on live video.
She described the attack as an "act of pure evil," said the company was "determined to review what happened," and noted that it is cooperating with New Zealand authorities. Following the attack, Facebook says 1.5 million videos of the attack were removed worldwide, while another 1.2 million were blocked "during upload". Sandberg's letter states that while Facebook moved quickly to remove the perpetrator's video and account, the company could have done more, and it sets out three steps it will take going forward.
The first step is that Facebook is "exploring restrictions on who can upload live videos depending on factors such as previous violations of the rules," and that the company is putting more resources into systems that can detect violent videos even after users edit them. She noted that more than 900 variants of the attack video have been found.
The second step is to do more to "eliminate hatred on our platforms." Last week, Facebook announced that it was banning white nationalist and separatist content from the site and would redirect people searching for such content to resources that help them leave hate organizations. Sandberg says the company has since removed a number of hate groups.
The final step outlined by Sandberg is that the company is providing support to "four local welfare and mental health organizations" in New Zealand, and she reiterated that the company is ready to work with the committee that will consider what role social media platforms played in the attack.
Sandberg's letter does not detail what "exploring restrictions" on live video would mean in practice, although it appears to refer to certain signals, such as whether a person has previously violated the site's Community Standards. Earlier this month, the attacker filmed his attack on two mosques, and the footage was posted to Facebook, YouTube, Twitter and Instagram with the aim of going viral. While fewer than 200 people saw the original video, technology companies struggled to keep copies from spreading further. Facebook has had problems moderating violent video before, such as in 2017, when a Cleveland resident uploaded video of a murder to his profile, where it remained for several hours.