Facebook removes 8.7 million child nudity images with new software

SAN FRANCISCO: Facebook Inc said on Wednesday that its moderators removed 8.7 million user images of child nudity during the previous quarter, aided by previously undisclosed software that automatically flags such photographs.

The machine learning tool, rolled out over the past year, identifies images that contain both nudity and a child, allowing stronger enforcement of Facebook’s ban on photos that show minors in a sexualized context.

A similar system, also disclosed on Wednesday, catches users engaged in “grooming”, or befriending minors for sexual exploitation.

Antigone Davis, Facebook’s global head of safety, told Reuters in an interview that “the machine helps us prioritize” and “more efficiently queue” problematic content for the company’s trained team of reviewers.

The company is looking to implement the same technology in its Instagram app.

Under pressure from regulators and lawmakers, Facebook has vowed to speed up the removal of extremist and illicit material. Machine learning programs that sift through the billions of pieces of content users post every day are essential to its plans.

Machine learning is imperfect, however, and news agencies and advertisers are among those who have complained this year that Facebook’s automated systems wrongly blocked their posts.

Davis said the child safety systems will make mistakes, but users can appeal.

“We’d rather err on the side of caution with children,” she said.

For years, Facebook’s rules have also banned even family photographs of lightly clothed children uploaded with “good intentions”, out of concern over how others might abuse such images.

Prior to the new software, Facebook relied on users or its adult nudity filters to catch images of children. A separate system blocks child pornography that has previously been reported to the authorities.

Facebook had not previously disclosed data on its removals of child nudity, although some of those images would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.

Facebook shares fell 5 percent on Wednesday.

Facebook said the program, which learned from its collection of nude adult photographs and pictures of clothed children, has led to more removals. It makes exceptions for art and history, such as the Pulitzer Prize-winning photo of a naked girl fleeing a Vietnam War napalm attack.

Davis said that the child grooming system evaluates factors such as how many people have blocked a particular user and whether that user tries to contact many children.

Michelle DeLaune, chief operating officer of the National Center for Missing and Exploited Children (NCMEC), said the organization expects to receive 16 million child pornography tips this year from Facebook and other technology companies around the world, up from 10 million last year.

In light of that increase, NCMEC said it is working with Facebook to develop software to decide which tips to assess first.

Nevertheless, DeLaune acknowledged that a major blind spot remains encrypted chat apps and secretive “dark web” sites, where much new child pornography originates.

Encryption of messages on Facebook-owned WhatsApp, for example, prevents machine learning tools from analyzing them.

DeLaune said that NCMEC would educate tech companies and hopes they will use creativity to address the issue.
