Sick sexualised 'face swap' apps being targeted at children as young as nine

July 2024 · 6 minute read

VIDEO apps that can instantly create sexualised fake likenesses are being targeted at children as young as nine, a Sun on Sunday investigation has found.

Youngsters simply input a photo of a boy or girl’s face and within seconds it is transposed on to a scantily clad body in a provocative pose.

One of the artificial intelligence-driven apps, Facemega, was removed from both the Apple App Store and Google Play this week following our probe.

Yet online monitoring group App Magic estimates the app has already been downloaded more than a million times since its launch last year.

Use of AI-driven fake videos and photos has increased by 900 per cent since 2019.

Carolyn Bunting, CEO of child cyber safety organisation Internet Matters, told us: “The creation of sexualised, non-consensual deepfakes across these apps is incredibly disturbing, as is the impact this kind of content can have on children.

“Using someone’s image in a sexual way without their consent will be extremely harmful for the child. It can lead to complex and lasting issues affecting their well-being.”

Before Facemega was taken down from the app stores it had climbed to 77th place in the entertainment chart — above Lego. It cost £7.49 a week and was rated as suitable for ages “nine and up”.

While putting young lives at risk, it has made millions of pounds for both the app stores and the developer, Ufoto Ltd, owned by Chinese parent company Wondershare.

Users are never asked to verify their age when accessing the image-altering tech, yet the videos on to which a face can be grafted include scantily clad women in bikinis and a section entitled Hot.

Within ten seconds of uploading your selected mug shot, AI wizardry matches it to a different body with often alarming results.

Following our probe, the developer of Facemega removed the Hot and For Women categories — which contained sexually provocative videos — from its app.

Similar apps to Facemega remain on mainstream platforms, weaving their worrying web of twisted reality.

Face Swap Video, by Deep Fake, created by US firm Deepfaker LLC, was being advertised on the App Store this week as suitable for kids aged four and up. The ad shows a young woman having her face swapped on to another person’s social media picture — though not in a sexual way.

A video then starts rolling, leaving it hard to tell the difference between what is real and what is fake.

A three-day free trial leads to a £7.99 a week subscription.

On Faceswap, also listed for ages nine-plus on the App Store, kids can access deepfakes for free before they are required to pay a £19.99 annual subscription.

Our revelations come just months after ministers announced that deepfake pornography will be targeted, with the unauthorised creation and sharing of such images made illegal under the Online Safety Bill, which has been passed but not yet come into force.

Children’s charity the NSPCC called for the Government to put a legal duty on the major app distributors to help protect the targets of these apps, especially women and girls.

Rani Govender of the NSPCC, added: “App stores have an important role to play in preventing the risks of deepfake technology at source. The Government can also act through its Online Safety Bill by adding a legal duty on companies to tackle violence against women and girls online.”

Communications regulator Ofcom last year ranked fake or deceptive images and videos among the top 20 potential online harms faced by UK internet users.

Teaching platform Safer Schools says the number of deepfakes online grew from roughly 14,000 to 145,000 between 2019 and 2021 — a 900 per cent increase. Of those, 96 per cent contained pornographic material while around 90 per cent involved indecent images of young women.

The NSPCC’s Ms Govender added: “Deepfake technology is already having an insidious impact on children as it becomes increasingly easy to produce and share this degrading and damaging material.

“This rapidly advancing technology is fast becoming a child abuse risk as it is rolled out without proper consideration of the way in which it fuels intimate image abuse.

“Girls and women suffer the most from apps like this, which exist within a toxic online culture of misogyny that is growing at a disturbing rate.”

Apple confirmed it had removed Facemega from the App Store but said it had no specific rules on deepfake apps, adding that it prohibits apps containing pornographic, defamatory or discriminatory content.

A Google Play spokeswoman confirmed it had taken down Facemega from its platform but did not comment on other apps.

Tory MP Siobhan Baillie called deepfake technology terrifying and added: “Clearly age verification and additional protections must be considered.

“I salute The Sun on Sunday for getting this app removed from the Apple App Store and Google Play. Our children need to be protected from the deep fake menace.”

Three victims tell their story

CHILDLINE has shared details of three teenagers who have been threatened with fake videos and photos, to show how traumatic the experience can be.

One 14-year-old told how she was threatened online with having a fake video of her made if she refused to send an abuser nude images.

She said: “I was being friendly, just small-talking to someone on Snapchat. They asked what I looked like so I sent a picture of my face, then they kept asking me for nudes.

“I told them no but they said if I don’t they will edit my face on to nudes and sell them. I know I should report them but it won’t change anything, as they will still have my photos on their camera roll. Please help, I’m really worried.”

A terrified 13-year-old said: “Someone I know is threatening to post a fake nude and claim it’s me if I don’t send her actual nudes.

“She says she will tag my friends and show them that ‘it’s me’. I haven’t ever sent nudes before and I worry my real friends will judge me if this happens.

“I met this person online and we used to be friends, but we haven’t spoken for a while. I don’t understand why she’s doing this to me. I don’t know what to do.”

A third teen told Childline they had got police investigators involved over fake porn pictures.

He said: “I feel so embarrassed and angry. Someone has created a fake account on Instagram that is under my name and they’ve put inappropriate pictures (porn) with my face on them.

“I’ve reported the account and the police are trying to track the person down. I feel a bit safer knowing that but I’m worried about my friends finding out and me getting bullied for it.”
