How does the Lensa AI app generate nudes without consent?

Last week Lensa became a trend thanks to its artificial intelligence, which can transform any user photo into a stylized portrait. Now the app is under fire for sexualizing women's images.

By Ground Report

Last week Lensa became a trend thanks to its artificial intelligence, which can transform any user photo into a stylized portrait fit for social networks. Now the app is in the spotlight again because that same AI tends to sexualize women's bodies.

The problem is that AI, like everything else, has its dark side, and this is not the first time it has stumbled over issues of race and gender. Journalist Melissa Heikkilä has experienced the problem firsthand.

You may be familiar with Lensa, the app of the moment for turning your face into an AI portrait. From a single photo, it uses artificial intelligence to create avatars in every style and art form you can imagine. According to an MIT Technology Review article, Heikkilä used the app expecting portraits as impressive as those of her colleagues. Instead, she got nudes and sexualized avatars.

Heikkilä is not only a woman but also has Asian features because of her ancestry. As a result, her avatars carried both a gender bias and a racist overtone. The journalist says she generated at least 100 avatars, of which 16 were outright topless and 14 showed her in sexualized poses and skimpy clothing.

The journalist has pointed out that, compared with her male colleagues, the application returned her clearly pornographic avatars.

Racism and sexualization

It all starts with Magic Avatars and the AI filters that turn your face into an anime portrait, which have become very popular on networks like TikTok. This has caused Lensa, an AI portrait app released in 2018, to skyrocket in popularity. When Heikkilä tried the app, the results were devastating.

She compared her results with those of her colleagues in the newsroom: for the men, the app almost always generated flattering avatars, casting them in high-profile roles such as astronauts, warriors, or the stars of record covers. Heikkilä got just the opposite: sexualized poses, skimpy clothing, and nothing that could be considered glorious at all.

Her Asian ancestry also influenced the portraits. A large share of the results drew on anime or video game characters, as well as on pornography and fetishes related to Asian women. In general, almost all of the avatars Heikkilä generated were either nude or showed a lot of skin. Even the least harmful depicted her in tragic situations, sometimes crying.

Heikkilä also tested the app with one of her colleagues, a white woman; while the results still included nudity and sexualized images, there were far fewer of them.

Pornography and rape-related material

Heikkilä cites researchers Abeba Birhane, Vinay Uday Prabhu and Emmanuel Kahembwe, who found that LAION-5B contained not only racist images but also a large amount of pornography and even illegal rape-related material. They were able to audit the dataset precisely because LAION-5B is open source; other models are far less open, yet they are trained on datasets that are likewise scraped from the Internet.

Back in September, the journalist had already reported that the first version of Stable Diffusion was trained on a dataset with these problems. Although that version supposedly shipped with a safety filter, Lensa has used a version that does not take advantage of those measures.
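
To illustrate what that safety filter is, here is a minimal sketch, assuming the open-source Stable Diffusion release loaded through the Hugging Face diffusers library and the public "runwayml/stable-diffusion-v1-5" checkpoint; how Lensa actually configures its pipeline is not public, so this is not a description of the app's real setup.

```python
# Minimal sketch (assumption: Hugging Face diffusers + the public SD 1.5
# checkpoint). The open-source pipeline ships with a safety checker that
# screens generated images; an integrator must actively opt out of it.
from diffusers import StableDiffusionPipeline

# Default load: includes a safety_checker component that blanks out images
# it flags as NSFW after generation.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
image = pipe("portrait of a person, digital art").images[0]

# Passing safety_checker=None removes the filter entirely, after which
# nothing in the pipeline screens the generated images.
unfiltered_pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    safety_checker=None,
)
```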

In November, a new version was released that removed images following certain repetitive patterns. The logic is simple: the more images perpetuating a stereotype appear in the training data, the more strongly the model reproduces it, as with the pairing of Asian women and sexual stereotypes. In short, the experts Heikkilä consulted say these models associate women with sexual content and men with positions of power and the like.
