Facebook Tests Tools to Combat Child Sexual Abuse
Facebook is testing new tools aimed at curbing searches for photos and videos containing child sexual abuse and at preventing users from sharing such content.
Facebook is stepping up its effort to end child exploitation on its platform. Antigone Davis, Facebook's global head of safety, said, “Using our apps to harm children is unacceptable and abhorrent.”
The move comes as Facebook has been under scrutiny for allowing child sexual abuse material to spread unchecked on its platform.
The first tool being tested is a pop-up notice shown to users who search for content related to child sexual abuse. The notice asks users whether they want to continue, includes a link to offender diversion organizations, and warns that child sexual abuse is illegal and that viewing such images can lead to consequences including imprisonment.
Facebook found that some users shared such content to express outrage or in poor humor, rather than with the intent to harm a child. Accounts promoting this content will nonetheless be removed.
Another tool aims to stop the spread of such content with an alert warning users that attempting to share abusive images may get their account disabled. Facebook said it is using this tool to help identify behavioral signals from users who may be at greater risk of sharing this harmful content. Davis said, “This will allow Facebook to identify harmful content and encourage users not to share it.”
Facebook has also updated its child safety policies. The social media platform said it will remove Facebook profiles, Pages, groups, and Instagram accounts dedicated to sharing otherwise innocent images of children accompanied by captions, hashtags, or comments containing inappropriate commentary about the children depicted. Users who report content will also see an option to tell the social network that the photo or video involves a child, prompting the company to review it.
Facebook faces significant pressure to combat this problem amid its plans to enable end-to-end encryption by default on Facebook Messenger and Instagram. With end-to-end encryption, messages cannot be viewed by anyone other than the sender and recipient, including Facebook and law enforcement officials. Child safety advocates and law enforcement agencies have raised concerns that the encryption plans could make it far more difficult to crack down on child predators.
Presently, law enforcement is able to read messages shared on Facebook and Instagram, which has helped lead to the arrest of child offenders. Rolling out end-to-end encryption could allow criminals to hide their targeting of innocent children.
A Business Insider report said that child sexual abuse images increased online during the coronavirus pandemic. Between July and September, 13 million harmful images were identified across Facebook and Instagram. Copies of just six videos made up more than half the child abuse content reported from October to November 2020.
The National Center for Missing and Exploited Children (NCMEC) figures showed a 31 percent increase in the number of images of child sexual abuse in 2020.
NCMEC also said that Facebook, at 99 percent, was responsible for more reported child sexual abuse content than any other tech firm in 2019.
Davis further said, “Today, we are announcing new tools we are testing to keep people from sharing content that victimizes children and recent improvements we have made to our detection and reporting tools. We found that more than 90 percent of this content was the same as or visually similar to previously reported content.”