New Delhi: Deepfakes and AI misuse have long been a major concern for authorities and the public across the world, and now an investigation has revealed that millions of users are turning to AI-powered chatbots on Telegram to create nude and explicit images of real people.
Using these bots, people can morph photos in just a few clicks, creating deepfakes that remove clothing or fabricate sexual activity. A recent report by WIRED said that as many as 4 million users access these chatbots every month to create deepfakes. The bots pose serious risks, especially to young girls and women.
Henry Ajder, who discovered these explicit chatbots on Telegram four years ago, termed the issue "really concerning" given the ease of access and the large number of people actively using them.
"It is really concerning that these tools — which are really ruining lives and creating a very nightmarish scenario primarily for young girls and for women — are still so easy to access and to find on the surface web, on one of the biggest apps in the world," he said.
Several celebrities, including Taylor Swift, Jenna Ortega, Alia Bhatt and Rashmika Mandanna, have also fallen victim to such deepfakes, and the latest investigation is even more concerning because teenage girls are being targeted. These deepfakes are also being used for sextortion.
Millions Of People Using Bots To Create Deepfakes
“These types of fake images can harm a person’s health and well-being by causing psychological trauma and feelings of humiliation, fear, embarrassment, and shame,” says Emma Pickering, the head of technology-facilitated abuse and economic empowerment at Refuge, the UK’s largest domestic abuse organization. “While this form of abuse is common, perpetrators are rarely held to account, and we know this type of abuse is becoming increasingly common in intimate partner relationships.”
WIRED said its review of Telegram communities involved with the explicit nonconsensual content identified at least 50 bots that claim to create explicit photos or videos of people in just a couple of clicks. The bots vary in capability: many suggest they can “remove clothes” from photos, while others claim to create images depicting people in various sexual acts.
The 50 bots list more than 4 million “monthly users” combined, according to WIRED's review of the statistics presented by each bot. Two bots listed more than 400,000 monthly users each, while another 14 listed more than 100,000 members each.
"The Telegram bots identified by WIRED are supported by at least 25 associated Telegram channels—where people can subscribe to newsfeed-style updates—that have more than 3 million combined members. The Telegram channels alert people about new features provided by the bots and special offers on “tokens” that can be purchased to operate them, and often act as places where people using the bots can find links to new ones if they are removed by Telegram," the report said.
So many people are trying to use “nudify” websites that Russian cybercriminals, as reported by 404Media, have started creating fake versions of these sites to infect visitors with malware.