Police agencies and investigators worldwide face a growing challenge as artificial intelligence (AI)-generated child sexual abuse imagery becomes increasingly realistic. The Internet Watch Foundation (IWF), an advocacy group based in Cambridge, England, has warned that AI technology enables the creation of “unprecedented quantities” of lifelike child sexual abuse images and videos. These images, some depicting children as young as three years old, have the potential to deceive law enforcement and waste valuable investigative resources.
IWF Chief Executive Susie Hargreaves has urged UK Prime Minister Rishi Sunak to allocate more resources to combat the problem, emphasizing its potentially devastating impact on internet safety and on the safety of children online. The National Crime Agency (NCA) has likewise acknowledged the growing problem and stressed the need to take it “extremely seriously.” The NCA’s Director of Threat Leadership, Chris Farrimond, warned that law enforcement resources could come under serious strain if the volume of AI-generated material continues to rise.
While AI-generated child sexual abuse material is already illegal in many jurisdictions, including the UK, the rapid advancement of AI capabilities poses a significant challenge to policing efforts. The UK government is working to expand criminal laws and to place more responsibility on internet providers to shut down sources of such content. Meanwhile, abusers are using AI and other software to evade the detection systems developed by police and governments.
The misuse of AI tools by pedophiles has created what experts describe as a “predatory arms race” on dark-web forums. The ease of using these tools, coupled with the realism of their output, makes it significantly harder for law enforcement agencies to identify victims and prosecute offenders. Private companies developing AI technology struggle to prevent misuse of their products, since the tools can be copied and run without any safeguards in place.
The FBI has also expressed concern about the rise in AI-generated sexually explicit material. In June, the agency said it had received a growing number of reports involving children whose photos were altered into realistic-looking sexually explicit images. Advocates warn of the “horrible societal harm” these images cause: by combining features of real children into fake but highly realistic composites, they can be used to groom children for abuse.