"Predators are often early adopters of technology," says Sarah Smith, chief technology officer at the Internet Watch Foundation (IWF), a UK child abuse hotline. "It's an arms race; we have to be constantly horizon-scanning."
Smith and her team, based in an unassuming office in Cambridge, are a key link in a chain of experts around the world developing and finessing technology that tracks down paedophiles and removes child abuse images found online.
IWF analysts sit in front of screens for long hours each day, trawling through material flagged to their hotline by the public and police as potentially containing child abuse.
The volume of images reported to them is increasing all the time, driven partly by the trend for predators to befriend children online and coerce them into sharing sexual images from their own bedrooms.
"We only have 13 analysts and the internet is a huge place," Smith says, "so we need to triage results for them to take action on. We have a 'crawler' that moves around the web trying to find child abuse material."
The vast majority of what they find, both through reports to their hotline and their own investigations, is on the open internet, rather than the dark web.
Images are analysed and categorised according to the severity of abuse or the age of the children involved. Then the experts turn them into "hashes", which Smith describes as "a unique digital fingerprint".
"Each image becomes a string of letters and numbers unique to that image, but from that string of information you can't reverse-engineer the original image," she says. This means the image can't be recreated using the data attached to it. After the IWF analysts view the image and create a "hash", nobody else in the chain tracking and monitoring the images has to view them again.
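The article does not say which hash functions the IWF uses, so as an illustration only, the one-way property Smith describes can be sketched with a standard cryptographic hash (SHA-256 here as a stand-in): the fingerprint identifies the image exactly, but the image cannot be rebuilt from it.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # A cryptographic hash maps any input to a fixed-length string of
    # letters and numbers; the original bytes cannot be reconstructed
    # from the output.
    return hashlib.sha256(image_bytes).hexdigest()

# A hash list lets a service check uploads against known material
# without anyone having to store or view the images again.
known_hashes = {fingerprint(b"previously-categorised image bytes")}

def is_known(image_bytes: bytes) -> bool:
    return fingerprint(image_bytes) in known_hashes
```

Any service holding only the hash list can match exact copies of an image while learning nothing about images that are not on the list.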
The ever-growing hash list is given to internet operators, from Facebook to Yahoo, so they can scan their messaging services for matches. Such "photo DNA" is becoming increasingly sophisticated and can identify known images even when they have been altered.
"This works where predators may change one pixel to avoid detection, or with one image from a series; we can find all of them from the one image that we have," Smith says.
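Microsoft's PhotoDNA, the best-known of these tools, is proprietary, but the idea of surviving a one-pixel change can be illustrated with a simple public perceptual hash, the "average hash": downscale to an 8x8 greyscale grid, threshold each cell against the mean, and compare fingerprints by how many bits differ. This is a minimal sketch, not the IWF's actual algorithm.

```python
def average_hash(pixels):
    # pixels: an 8x8 grid of greyscale values, as if the image had
    # already been downscaled and desaturated.
    flat = [p for row in pixels for p in row]
    threshold = sum(flat) / len(flat)
    return tuple(int(p > threshold) for p in flat)

def hamming(h1, h2):
    # Number of differing bits; a small distance means "same picture".
    return sum(a != b for a, b in zip(h1, h2))

# A stark test image: top half black, bottom half white.
original = [[0] * 8 for _ in range(4)] + [[255] * 8 for _ in range(4)]
altered = [row[:] for row in original]
altered[7][7] = 250  # a single pixel tweaked to dodge exact matching

d = hamming(average_hash(original), average_hash(altered))
# d stays far below a typical match threshold (e.g. 5 bits), so the
# altered copy is still recognised as the same image.
```

A cryptographic hash of the altered file would change completely; the perceptual hash barely moves, which is why one-pixel evasion fails.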
The IWF also shares a constantly updated list of keywords employed by paedophiles. This can be used to filter results in search engines, analyse conversations in messaging services or moderate chat as people play games.
"Keywords can also identify sites where paedophiles are sharing newly created images because they will be having conversations using these words," Smith says. "If we can identify those words and then find these sites, that is a high-priority target. Unlike with historic images, here children may still be at risk and there are safeguarding opportunities."
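The real IWF keyword list is shared confidentially with industry, not published, but the filtering step it enables is straightforward: tokenise the text of a search query, chat message or page and intersect it with the watchlist. A minimal sketch, with placeholder codewords standing in for the confidential terms:

```python
import re

# Hypothetical placeholder terms; the actual IWF keyword list is
# confidential and these strings are purely illustrative.
WATCHLIST = {"codewordalpha", "codewordbeta"}

def flag_text(text: str) -> set:
    # Lowercase and tokenise, then intersect with the watchlist.
    # The same check works for search queries, messaging-service
    # conversations or in-game chat moderation.
    tokens = set(re.findall(r"[a-z0-9]+", text.lower()))
    return tokens & WATCHLIST
```

A hit does not prove wrongdoing on its own; in practice it prioritises a site or conversation for human review, which is how the "high-priority target" triage Smith describes would work.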
Once a child is safeguarded by police and social services, efforts go into finding any images of them that are circulating online.
"I've been doing this job for 11 years," Smith says, "and I still see material I was seeing when I started. We know how traumatic this is for victims. We have spoken to children who say that if they are in a shop and they think they are being recognised, they wonder: has that person seen an image of me online?"
The IWF's next project aims to reach men before they go down the path of offending, as the number of men looking at images of children being abused continues to grow. Between April and September last year, UK police arrested 4,700 people, almost all men, in connection with online child sexual abuse, more than 300 of whom were in the most serious category of offender, actively grooming more than one child.
The latest tool is a chatbot, designed in partnership with the Lucy Faithfull Foundation, a charity dedicated to preventing child sexual abuse that works directly with paedophiles in the UK.
"We will use data to identify an internet user who is potentially at risk of either starting to seek or encountering this type of content, and a chatbot will target them and will tell them this is risky behaviour," says Smith. "They can be offered links to follow and resources to prevent them going any further."
Yet even as experts look at improving their technology, the tools they are using to fight online child abuse are at risk from demands for increased privacy online.
Monitoring technologies and artificial intelligence (AI) systems operate beneath the surface of most major internet sites, constantly scanning for signs of child exploitation, from images of children being abused to the codewords used by paedophiles as they share images.
When suspicious material is detected, an electronic tipoff is sent to the National Center for Missing and Exploited Children (NCMEC) in the US, which analyses it and passes it on to national child protection teams around the world.
In 2019, internet service providers sent 17 million tipoffs to NCMEC.
Last month the British paedophile David Wilson was jailed. He used Facebook to target and abuse children, and the site's tracking systems picked up his activities.
Facebook is preparing to fully encrypt its Messenger service, bringing it in line with WhatsApp and Instagram. Child protection experts fear the loss of millions of electronic tipoffs. Facebook founder Mark Zuckerberg has called encryption a "pivot to privacy", stating that protection of privacy online is what internet users are most concerned about.
But child protection experts are worried about the impact it will have on their efforts. Smith says: "It will be like turning the lights out; the potential implications aren't being considered."
Facebook has responded robustly to criticism from senior police officers and experts over encryption, saying: "[We have] led the industry in developing new ways to prevent, detect, and respond to abuse. End-to-end encryption is … used by many services to keep people safe online and, when we roll it out on our other messaging services, we will build on our strong anti-abuse capabilities at WhatsApp. For example, WhatsApp bans around 250,000 accounts each month suspected of sharing child exploitative imagery."
But child protection experts say that what is needed is greater use of technology to track offenders and child abuse material.
In 2019, federal police in Australia received a tipoff from the NCMEC that a man in New South Wales was posting child abuse images online.
Police in Australia tracked the images to Richard Aldinger, a 63-year-old father-of-two, and arrested him at his house in Sydney. Trawling through his devices, they found that as well as sharing images of children being abused online, he had been directing the rape and abuse of a 12-year-old girl in the Philippines for two years through a livestreaming service.
The girl was rescued and Aldinger is now in jail. But he was caught only because scanning programmes detected the images he shared with others, a common "slip" by predators that can lead to their downfall.
John Tanagho is director of the International Justice Mission (IJM), based in Manila. The IJM was involved in the case of Richard Aldinger and works closely with police in the Philippines to protect children from live-streamed abuse.
"We know technology is making it easier for people to abuse children," he says. "We need to improve safety technology, and it's urgent. We are seeing very young children, of five or six, abused through livestreaming."
Aldinger paid just AU$1,075 (£600) in total to the girl's mother to facilitate her rape and abuse, about AU$80 (£45) each time. Such small sums might not usually trigger investigation by a money transfer service, but Tanagho thinks more could be done in this area.
"These are payments from a 63-year-old in Australia to the Philippines where he has no family," he says. "We know the Philippines is a hotspot for child exploitation. We could do what we call a 'cross-sector' match on a user with this profile who is transferring money, looking at whether he was also engaging in video calls an hour before or after. This happens already with terrorism financing."
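The core of the cross-sector match Tanagho describes is a join across two data streams: small overseas payments and video-call activity by the same user within an hour of each other. A minimal sketch, with invented user IDs and timestamps purely for illustration:

```python
from datetime import datetime, timedelta

def cross_sector_matches(payments, calls, window=timedelta(hours=1)):
    # payments, calls: lists of (user_id, timestamp) pairs.
    # Flags any payment made within `window` of a video call by the
    # same user, for prioritised human review.
    flagged = []
    for user, paid_at in payments:
        if any(u == user and abs(paid_at - t) <= window
               for u, t in calls):
            flagged.append((user, paid_at))
    return flagged

# Hypothetical example: a payment 40 minutes after a video call.
payments = [("user42", datetime(2019, 6, 1, 14, 0))]
calls = [("user42", datetime(2019, 6, 1, 13, 20))]
suspicious = cross_sector_matches(payments, calls)
```

In a real deployment the risk profile would weigh many more signals (destination country, sender's family ties, payment pattern), but the time-window correlation is the step that terrorism-financing systems already perform.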
Tanagho wants internet users to understand that protecting children doesn't mean companies intruding on individual privacy.
"The tools that are being used to detect child sexual abuse are really targeted artificial intelligence tools, built up through training them on actual child abuse material. It's not like these scanning programmes are looking through people's general videos."
He believes that despite the rise of online child abuse, there is reason to be optimistic. "I don't think the picture is bleak," he says, citing the online harms bill in the UK that will put responsibility on social media giants to protect children. "We could within three years have a safer internet. It will take global resolve, but it is doable."
When it comes down to it, he says, whose privacy matters most: that of the child, or that of the abuser? "The privacy of children who are sexually abused, their right for those images to be removed from the internet, what could be more important than that?"