As a former policeman I’m always interested in articles relating to crime investigation. So, the Telegraph’s report last month that the Metropolitan Police will start using AI to scan suspects’ photographs for evidence of child abuse was of particular interest.
For years, police officers and civilian staff of the Obscene Publications Branch (OPB) have trawled through images and videos in search of evidence. With digital technology, the sheer quantity of images, videos and devices that need to be examined is mind-blowing. And we are not talking about Playboy-style material, but sickening images that show mental and physical torture, mutilation and, occasionally, death. The psychological toll on those who carry out this work cannot be overstated, so for computers to take over the job is a welcome step forward.
Sand dunes or flesh?
The Met’s Hi-Tech Crime Unit already uses software to identify drugs, guns and money, but has a slight problem with recognising nudity.
"Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said the Met's head of Digital and Electronics Forensics Mark Stokes during a summit in London.
With the help of cloud service providers such as Google or Microsoft, the unit hopes to develop an AI system that can get around this problem. Storage space is also becoming an issue: as file sizes grow, the Met’s servers and budget are feeling the pressure. A fly in the ointment is that nobody apart from the police is legally authorised to store these kinds of images. There is also the risk of hackers gaining access to cloud storage accounts and stealing files; there have been several cases of celebrities having their private images hacked and posted online. However, the Met has been in discussions with cloud service providers, and there is light at the end of the tunnel.
While the undulating curves of sand dunes might be evocative, let’s hope the Met resolve their issue.