A training document used by Facebook content moderators raises questions about whether the social network is underreporting images of potential child sexual abuse.
According to The New York Times, moderators have challenged this practice, but company executives have defended it. At issue is how Facebook moderators should handle images in which the subject's age is not immediately apparent. The decision carries significant weight: images of suspected child abuse are reported to the National Center for Missing and Exploited Children (NCMEC), which forwards them to law enforcement. Images of adults, on the other hand, can be removed from Facebook if they break its rules, but they are not reported to outside authorities.
Facebook makes big mistakes
But, as the NYT points out, there is no reliable way to determine age from a photograph. Moderators are instructed to use a 50-year-old method for identifying the “progressive stages of puberty,” but that methodology “was not designed to determine one’s age.” And because Facebook’s rules instruct moderators to assume that subjects whose age they are unsure of are adults, moderators suspect that many images of children may be slipping through.
This is further complicated by the fact that Facebook’s contract moderators, who work for outside firms and do not receive the same benefits as full-time employees, may have only a few seconds to make a decision and may be penalized for mistakes.
Facebook reports more child sexual abuse material to NCMEC than any other company. Technology companies are legally required to report “apparent” child sexual abuse material, but “apparent” is not defined by the law. The Stored Communications Act, a privacy law, shields companies from liability when they file such reports, but it is unclear whether it would protect Meta if it misreported an image. Lawmakers in Washington need to set a “clear and consistent standard” for everyone to follow.