AI Deepfakes Fuel Child Sexual Abuse Content

• Unicef says children’s photos are being manipulated and sexualised through AI tools
• In some countries, as many as one in 25 children reported having their images turned into sexual deepfakes

ISLAMABAD: Unicef has said it is increasingly alarmed by reports of a rapid rise in the volume of AI-generated sexualised images circulating online, including cases in which photographs of children have been manipulated and sexualised. The organisation has urged governments and industry to prevent the creation and spread of AI-generated sexual content involving children.

“The harm from deepfake abuse is real and urgent. Children cannot wait for the law to catch up,” Unicef said in a statement released by the UN Information Centre in Islamabad on Thursday.

“Deepfakes – images, videos, or audio generated or manipulated using Artificial Intelligence (AI) and designed to look real – are increasingly being used to produce sexualised content involving children, including through ‘nudification’, where AI tools are used to strip or alter clothing in photos to create fabricated nude or sexualised images,” the statement said.

Unicef said this unprecedented situation poses new challenges for prevention and education, legal frameworks, and response and support services. Current prevention efforts, which often focus on teaching children about online safety and the risks of creating or sharing sexual images, remain important but are insufficient when sexual content can be artificially generated.

The statement said the growing prevalence of AI-powered image and video generation tools that produce child sexual abuse material marks a significant escalation in risks to children through digital technologies.

Recent large-scale research conducted by Unicef, ECPAT and Interpol under the Disrupting Harm project showed that across 11 countries, at least 1.2 million children reported having had their images manipulated into sexually explicit deepfakes through AI tools in the past year.

“Children themselves are deeply aware of this risk. In some of the study countries, up to two-thirds of children said they worry that AI could be used to create fake sexual images or videos. Levels of concern vary widely between countries, underscoring the urgent need for stronger awareness, prevention and protection measures,” the statement said.

“We must be clear. Sexualised images of children generated or manipulated using AI tools are child sexual abuse material (CSAM). Deepfake abuse is abuse, and there is nothing fake about the harm it causes.

“When a child’s image or identity is used, that child is directly victimised. Even without an identifiable victim, AI-generated child sexual abuse material normalises the sexual exploitation of children, fuels demand for abusive content and presents significant challenges for law enforcement in identifying and protecting children who need help.”

Unicef welcomed the efforts of AI developers who are implementing safety-by-design approaches and robust guardrails to prevent misuse of their systems.

Published in Dawn, February 6th, 2026.
