Our new taskforce
We’ve established a new taskforce of analysts, recruited thanks to a grant from Thorn.
These analysts will view, classify, and hash two million Category A and B images from the UK Government’s Child Abuse Image Database (CAID).
- Category A images involve penetrative sexual activity, or sexual activity with an animal, or sadism.
- Category B images involve non-penetrative sexual activity.
These images have been collated from the computers of suspects and offenders during police investigations. Many of them may never have made their way onto the internet at all.
By assessing, classifying and hashing these images, and sharing those hashes with our partners, we hope to prevent them from ever being uploaded online or shared onwards.
To make this job possible, we’ve developed the IWF grading tool. It means our analysts will never have to look at the same image twice, protecting their welfare.