Tech’s dark world: Apps and websites that misuse AI to undress women in photos gain popularity


The latest trend alarming researchers, and deepening worries about the growing dangers of artificial intelligence, is the use of websites and applications to "undress" women in photos.

Such applications and websites have been gaining popularity, according to a report published by Bloomberg. Data gathered by social network analysis company Graphika shows that undressing websites were visited by 24 million people in September.

According to Graphika, many of these undressing or “nudify” services market themselves on popular social networks. The researchers said that since the start of the year, the number of links advertising the undressing apps on social media, including Reddit and X, has risen by more than 2,400 per cent.

The services use AI to recreate an image so that the person appears nude. Many of these platforms work only on images of women.

AI’s deepfake pornography

Such applications are part of a worrying trend of non-consensual pornography being developed and distributed as artificial intelligence advances. This kind of fabricated media is also known as deepfake pornography.

These applications face serious ethical and legal hurdles, as the images are generally taken from social media and distributed without the subject’s control, consent or knowledge.

Graphika said such applications are gaining popularity at a time when several open-source diffusion models have been released, allowing artificial intelligence to create images of far higher quality than before.


Because these models are open source, app developers can use them for free. “You can create something that actually looks realistic,” Santiago Lakatos, an analyst at Graphika, told Bloomberg, adding that earlier deepfakes were often blurry.

Meanwhile, the report stated that one of the apps had paid for sponsored content on Google-owned YouTube so that it appears first when the word “nudify” is searched.

In response, a Google spokesperson said the company does not allow ads “that contain sexually explicit content”, adding: “We’ve reviewed the ads in question and are removing those that violate our policies.”

(With inputs from agencies)


