
In Cases of Doubt, Facebook Admins “Err on the Side of an Adult”

A major responsibility for tech companies is monitoring their platforms for child sexual abuse material (CSAM); if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to the NCMEC.

However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to “err on the side of an adult” when they don’t know the age of a person in a photo or video suspected to be CSAM, the report said.

The policy was made for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August:

Here is the company’s reasoning for the policy, from The New York Times:

When reached for comment, Facebook (now under the Meta corporate umbrella) pointed to Davis’ quotes in the NYT. Accenture didn’t immediately reply to our request for comment and declined to comment to The New York Times.