24 Million People Engage With Websites Using AI To Digitally Undress Women In Photos: Study


As many as 24 million people are using websites and apps that let them digitally undress women in photos using AI tools.



Updated: December 9, 2023 2:16 PM IST


By Joy Pillai

24 Million People Engage With Websites Using AI To Digitally Undress Women In Photos. | Photo: (Representative image) Pixabay

Artificial Intelligence has been an instant hit since its public debut, assisting everyone from students learning new languages to large corporations using AI tools for complex research. Numerous AI tools are now available online to ease people's work. However, these same tools are also exploited by fraudsters to commit online crimes. A recent study has highlighted another kind of abuse: researchers and privacy advocates are alarmed by the growing use of AI-powered apps and websites to digitally undress women in photos, as reported by Bloomberg.

A report from social network analysis company Graphika indicates that as many as 24 million people visited these undressing websites in September alone, raising concerns over a surge in non-consensual pornography driven by advances in artificial intelligence.

These "nudify" services, which promote themselves on popular social media networks, have seen links advertising undressing apps surge by over 2,400 percent since the start of the year on platforms such as X (formerly Twitter) and Reddit. The services use AI tools to digitally undress individuals, primarily targeting women.

This proliferation poses serious legal and ethical concerns, as the images are sourced from social media without the subjects' consent or awareness.

The surge also extends to potential harassment: some ads explicitly suggest that users could create nude images and send them to the person depicted. In view of the growing threat, Google has updated its policy against sexually explicit content in ads and is actively removing violating material.

Privacy experts are worried about the rise of fake, manipulated videos known as deepfake pornography, enabled by improving AI technology. According to cybersecurity expert Eva Galperin, ordinary people are now using these tools on other ordinary people, including high school and college students. Many victims may not even know the fake images exist, and those who do often struggle to get help from police or pursue legal action.

Despite the widespread exploitation of people's photos, there is no federal law in the United States prohibiting the creation of deepfake pornography.

In a recent case in North Carolina, a child psychiatrist was sentenced to 40 years in prison for producing child sexual abuse material using patient photos, marking the first prosecution under a law that prohibits the creation of such deepfakes.

Taking action against the alarming trend, TikTok and Meta Platforms have blocked keywords associated with these undressing apps.

TikTok warns its users that the term "undress" may be linked to content violating its guidelines. Meta Platforms, on the other hand, declined to comment further on its actions.

As the technology evolves, the ethical and legal challenges posed by deepfake pornography underline the pressing need for comprehensive regulation to safeguard individuals from the non-consensual and harmful use of AI-generated content.


