It is the viral application of the moment. Lensa lets you create "Magic Avatars" with AI from your phone. It is not just a photo editor: you upload several pictures of a person and it generates dozens of portraits in different styles, a kind of "multiverse" of versions of yourself. Playing with the potential of AI is fun, but Lensa is also a striking and disturbing example of how biased these tools can be.
I want an avatar, not a half-naked woman. When men test Lensa, the avatars that come back are warriors, politicians, astronauts; a bit of everything. When a woman tests the app, however, there is a clear bias toward nude or semi-nude figures regardless of the style requested. The problem is frequent enough that many users have complained, making it clear that they wanted an illustrated avatar, not a near-pornographic version of themselves.
Melissa Heikkilä, a journalist covering artificial intelligence for MIT Technology Review, put numbers on the phenomenon. Of 100 avatars generated from her face, 16 were nude and another 14 wore extremely skimpy, suggestive clothing.
Heikkilä notes that she has Asian features, which amplifies the bias even further. Still, that adds up to 30 of the 100 images, 30% of the output, being overtly hypersexualized. It is a wildly disproportionate figure, even if thankfully not every woman who uses the app sees results this extreme.
"Disturbing and frightening." That is the feeling left after discovering that most of the avatars Lensa produces distort your image and turn it into a kind of sex object. Those were the two adjectives actress Megan Fox chose when she tried the app. Fox posted several Lensa avatars to her Instagram, pointing out that they were all sexualized and showed her nearly naked.
The importance of the right data set. We have long known that AI has biases, but they become more conspicuous as these tools reach a mass audience. Lensa is built on Stable Diffusion, an open model trained on the LAION-5B dataset of 5.85 billion image-text pairs.
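To make "the right data set" concrete, here is a minimal sketch of filtering a LAION-style metadata table by an NSFW probability score before training. The column names ("url", "punsafe") follow LAION's published metadata, but treat the exact schema, file names and the 0.1 threshold as illustrative assumptions, not Stability AI's actual pipeline.

```python
# Minimal sketch: drop rows whose NSFW probability ("punsafe") is too
# high before the image URLs are ever downloaded for training.
import pandas as pd

THRESHOLD = 0.1  # keep only images the detector considers very likely safe

meta = pd.read_parquet("laion_metadata.parquet")
safe = meta[meta["punsafe"] < THRESHOLD]

print(f"kept {len(safe)} of {len(meta)} rows "
      f"({len(safe) / len(meta):.1%}) for the training set")
safe.to_parquet("laion_metadata_filtered.parquet")
```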
Filters exist to reduce this bias, but Lensa does not apply them. Early on, the model was even more biased, to the point that almost all the images it generated of Asian women were pornographic. That has been mitigated by cleaning up the training data. We only know this because Stable Diffusion is open; nothing comparable can be verified in closed models such as OpenAI's DALL-E.
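This "available but optional" situation is visible in the open-source tooling itself. Hugging Face's diffusers pipeline for Stable Diffusion ships with a post-generation NSFW filter (the "safety checker"), and any app that embeds the model can simply switch it off. A minimal sketch; the model ID and prompt are illustrative:

```python
import torch
from diffusers import StableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"

# Default behaviour: the safety checker blanks out images it flags as NSFW.
pipe = StableDiffusionPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")

# What an integrator can do instead: load the same model with no filter.
unfiltered = StableDiffusionPipeline.from_pretrained(
    model_id,
    safety_checker=None,           # no NSFW check on the outputs
    requires_safety_checker=False,
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("portrait avatar of a person, digital art").images[0]
image.save("avatar.png")
```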
The new version of Stable Diffusion can also filter out near-duplicate images, which makes it less likely that the same obscene picture comes back whenever a given concept, such as "Asian woman", is requested. Unfortunately, Lensa does not implement this yet.
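For the curious, here is a minimal sketch of what deduplicating an image set can look like, using perceptual hashing. Stability AI's actual pipeline is not public in this form; the imagehash library, the folder layout and the Hamming-distance threshold are assumptions for illustration.

```python
# Minimal sketch: keep only one copy of each near-duplicate image.
from pathlib import Path
from PIL import Image
import imagehash

MAX_DISTANCE = 5  # hashes closer than this are treated as duplicates

seen_hashes = []
unique_files = []

for path in sorted(Path("dataset").glob("*.jpg")):
    h = imagehash.phash(Image.open(path))
    if any(h - prev <= MAX_DISTANCE for prev in seen_hashes):
        continue  # near-duplicate of an image we already kept
    seen_hashes.append(h)
    unique_files.append(path)

print(f"kept {len(unique_files)} unique images")
```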
Lensa produces stylized avatars, but photorealism is here too. This issue is not limited to one avatar app; AI is advancing at breakneck speed. Lexica Aperture can already generate realistic photographs of celebrities, and deepfakes, remember tools like DeepNude, can now be created in seconds. It is a long-standing problem that, years later, still has no solution.
"A lost cause." That is how Scarlett Johansson, one of the actresses most affected by this hypersexualization, described the situation. In 2019 she considered trying to stop deepfake pornography, ultimately concluding that the internet is a virtually lawless abyss.
Lensa does not go that far, but as the viral app of the moment it is a perfect example of why this issue needs attention. There will always be corners of the internet, and of artificial intelligence, that reflect our biases, but we must keep taking steps in the right direction. Making AI safe and usable for everyone depends on it.
Image | Lensa (Megan Fox)