Facebook removed 8.7 million sexual photos of children in last three months

Facebook removed 8.7 million user images of child nudity with the aid of software that automatically flags such photographs, the company disclosed on Wednesday.

The company's machine learning tool can identify images that contain both nudity and a child, strengthening enforcement of the social network's ban on photos showing minors in a sexual context.

"We're using artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it's uploaded," Antigone Davis, Facebook's global head of safety, said in a blog post.

"We're using this and other technology to more quickly identify this content and report it to [the National Center for Missing and Exploited Children], and also to find accounts that engage in potentially inappropriate interactions with children on Facebook so that we can remove them and prevent additional harm," Davis added.