Meta’s Oversight Board says deepfake policies need update and response to explicit image fell short

by admin

Meta’s Oversight Board says the company needs to update its rules on non-consensual deepfake images. After reviewing AI-generated explicit pictures of two famous women, the board found that Meta’s policy wording is not clear enough, and it is pushing for changes to better address the growing problem of deepfake content.

In recent years, deepfake technology has become easier to use, leading to a surge of explicit, AI-generated images of women, including celebrities such as Taylor Swift. These fake images, often nude, have spread across social media, raising concerns about privacy and exploitation. As a result, online platforms like Meta face growing pressure to tackle the problem.

The Oversight Board, which Meta established in 2020 to review content decisions on platforms like Facebook and Instagram, spent months examining two cases involving AI-generated images of well-known women, one Indian and one American. Meta did not name the women, referring to them only as “female public figures.”

In one case, an AI-manipulated image posted on Instagram showed a nude Indian woman, pictured from behind with her face visible, who closely resembled a well-known female celebrity. Meta responded by disabling the account that shared the picture and adding the image to a database used to detect and remove content that violates its rules.

The Oversight Board determined that both images violated Meta’s rule against “derogatory sexualized photoshop,” which falls under its bullying and harassment guidelines. However, the board found the policy’s current wording too unclear for users to understand. Specifically, it recommended replacing the word “derogatory” with “non-consensual” to better describe the harm these altered images cause. It also proposed clarifying that the policy covers a wide range of editing and media manipulation techniques, not just “photoshop.”

The board further recommended that deepfake nude images be addressed under Meta’s community standards on “adult sexual exploitation” rather than “bullying and harassment,” a change that would better reflect the severity of these images and their impact on victims.

Meta has expressed support for the board’s recommendations and is now reviewing them. The case underscores how social media platforms must continually update their policies to keep pace with the challenges posed by AI-generated content and to better protect people from harm.
