Photos underage girls share on their social media are being faked to appear nude by a deepfake bot on messaging app Telegram, a new report has found.
The disturbing images are created using a simple piece of AI that can virtually remove clothes, according to the report's authors at deepfake-research company Sensity.
More than 100,000 non-consensual sexual images of over 10,000 women and girls, created using the bot between July 2019 and 2020, have been shared online.
The majority of the victims were private individuals whose photos were taken from social media - all were women and some looked 'visibly underage', Sensity said.
This form of 'deepfake porn' isn't new; the technology behind this bot is suspected to be based on a tool released last year called DeepNude.
DEEPFAKES USE AI TO CREATE MANIPULATED MEDIA CONTENT
Deepfakes are so named because they are made using deep learning, a form of artificial intelligence, to create fake videos and images.
They are made by feeding a computer an algorithm, or set of instructions, as well as lots of images and audio of the target person.
The computer program then learns how to mimic the person's facial expressions, mannerisms, voice and inflections.
With enough video and audio of someone, you can combine a fake video of a person with fake audio and get them to say anything you want.
At a simpler level it can also be used to remove clothing from a photo of a person fully dressed or make someone appear to be in a place they shouldn't.
It has been described as 'photoshop on steroids' by experts.
The DeepNude artificial intelligence service was launched online and was relatively complicated to use, but allowed people to upload a photo of a woman and the AI would determine what that image would look like if the clothes were removed.
It was removed from the internet within 24 hours, but Sensity suspects this new bot is based on a cracked version of that technology.
Sensity says what makes this bot particularly scary is how easy it is to use: a user simply uploads an image of a girl and clicks a few buttons, and the bot uses its 'neural network' to determine what might be under the clothes and produce a fake nude.
'The innovation here is not necessarily the AI in any form,' Giorgio Patrini, CEO of deepfake-research company Sensity and co-author of the report, told CNET.
'It's just the fact that it can reach a lot of people, and very easily.'
Deepfakes are computer-generated and often very realistic images and videos produced from a real-world template. They have been used to manipulate elections, create pornography and spread misinformation.
Patrini says this use of the technology, taking photos of private individuals and 'faking' them to appear nude, is relatively new and puts anyone with a social media account at risk.
The bot, which hasn't been named, runs on Telegram, a private messaging platform that heavily promotes the idea of free speech.
The bot's administrator, known as 'P', told the BBC the service was purely for entertainment and that it 'does not carry violence'.
'No one will blackmail anyone with this, since the quality is unrealistic,' 'P' said, adding that any underage images are removed and the user blocked for good.
The bot network, where the images are produced and shared, has more than 100,000 members, mostly based in Russia and Eastern Europe, Sensity found.
About 70 per cent of the images used with the bot came from social media or private sources - such as pictures of friends or people the users know.
'As soon as you share images or videos of yourself and maybe you're not so conscious about the privacy of this content, who can see it, who can steal it, who can download it without you knowing, that actually opens the possibility of you being attacked,' Patrini told BuzzFeed News.
The bot was primarily advertised on the Russian social networking service VK. The platform said it does not tolerate such content and removes it when found.
Users upload a photo of a woman or girl from social media; the bot uses artificial intelligence to determine what she 'could' look like under the clothes and creates a fake nude
Users of the service are primarily based in Russia and Eastern Europe and have shared more than 100,000 images of over 10,000 women and girls since July 2019
'Many of these websites or apps do not hide or operate underground, because they are not strictly outlawed,' Patrini told the BBC.
'Until that happens, I am afraid it will only get worse.'
The authors also expressed concern that, as deepfake technology improves, bots like this could be used to extort women.
While most deepfakes until now have focused on celebrities or politicians, the users of this bot seem more interested in pictures of people they know.
A survey of bot users by Sensity found that 63 per cent were using it to get an idea of what women they know look like without clothes on.
It is more common for celebrities and politicians to be targeted with deepfakes. During the last UK general election, videos were produced that appeared to show Jeremy Corbyn (left) and Boris Johnson (right) endorsing one another
The report's authors have shared their findings with law enforcement agencies, VK and Telegram, but have not received any response to their concerns.
'Our legal systems are not fit for purpose on this issue,' Nina Schick, author of Deep Fakes and the Infocalypse, told the BBC.
'Society is changing quicker than we can imagine due to these exponential technological advances, and we as a society haven't decided how to regulate this.
'It's devastating for victims of fake porn. It can completely upend their life because they feel violated and humiliated.'
The full report is available from Sensity.