Our bespoke classifier

Artificial intelligence is becoming increasingly important in enhancing our analysts’ capabilities, but it is vital that we also have human eyes assessing potentially criminal content.

We’re creating a bespoke classifier to help us triage images. This technology uses machine learning to signal which reports are most likely to contain child sexual abuse material, and which might not. It will allow our analysts to prioritise their work and enable us to focus on those reports which include images of the youngest children being sexually abused.
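The triage idea described above can be sketched in a few lines: each report gets a score from the classifier, and reports are queued so the highest-likelihood ones reach analysts first. This is a minimal, hypothetical illustration; the names (`Report`, `triage`) and scoring scheme are assumptions, not the IWF's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Report:
    """A hypothetical report containing one or more classifier-scored images."""
    report_id: str
    image_scores: list  # assumed classifier likelihoods in [0, 1], one per image

    @property
    def priority(self) -> float:
        # Triage on the single most concerning image in the report.
        return max(self.image_scores, default=0.0)

def triage(reports):
    """Order reports so analysts see the highest-likelihood ones first."""
    return sorted(reports, key=lambda r: r.priority, reverse=True)

queue = triage([
    Report("r1", [0.12, 0.30]),
    Report("r2", [0.97]),
    Report("r3", [0.55, 0.80]),
])
print([r.report_id for r in queue])  # → ['r2', 'r3', 'r1']
```

The point of the sketch is the ordering step: the machine never decides, it only ranks, and the final assessment remains with a human analyst.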

Our developers have been working with NVIDIA experts to develop our own IWF models and algorithms. This work is being done on a state-of-the-art, custom Deep Learning Data Science Workstation, which leverages optimised NVIDIA machine learning and artificial intelligence software and hardware and combines them with highly accurate data produced by our own analysts.

Integrating our crawler with our classifier makes for a powerful tool: we’ll be able to find images online that we’ve not seen before, triage them through our classifier, which will tell us the likelihood of them being child sexual abuse material, and then allow our analysts to make their expert, granular assessments.
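The crawl-then-classify-then-review flow can be sketched as a simple pipeline. Everything here is an illustrative assumption: `crawl` and `classify` are stand-ins for the real components, and the 0.5 review threshold is arbitrary.

```python
def crawl(seed_urls):
    """Stand-in for a web crawler yielding candidate image URLs."""
    for url in seed_urls:
        yield url

def classify(image_url) -> float:
    """Stand-in for the classifier: returns a likelihood in [0, 1]."""
    # Purely illustrative scoring rule for the sketch.
    return 0.9 if "suspect" in image_url else 0.1

def pipeline(seed_urls, review_threshold=0.5):
    """Route likely matches to an analyst review queue, worst first."""
    review_queue = []
    for url in crawl(seed_urls):
        score = classify(url)
        if score >= review_threshold:
            review_queue.append((url, score))
    # Highest-likelihood items first, so analysts triage them soonest.
    review_queue.sort(key=lambda item: item[1], reverse=True)
    return review_queue
```

Note that the pipeline ends at a review queue rather than a verdict: the classifier filters and ranks, and the analyst makes the assessment.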

“We can’t ever be in a situation where we are relying solely on machines. AI simply has not got to that stage yet and it can’t really make those granular distinctions. And in terms of identifying victims, it is vitally important that we have human eyes making assessments and judgments on the content.”

Sarah Smith

Technical Projects Manager, IWF