Following in the footsteps of the US, police in the UK are starting to use predictive technology to foresee the type, time and place of a crime and deploy officers to prevent it.
The UK police are using technology and copious amounts of data to predict crime. Thousands of anonymised reports of previous offences allow officers to be deployed where crime is predicted to occur over the next 24 hours; the time and place of previous criminal activity help to predict future offences. Kent Police are at the forefront of this, using the PredPol system.
Alexander Babuta, who carried out research into predictive technology, commented:
“The software itself is actually quite simple – using crime type, crime location and date and time – and then based on past crime data it generates a hotspot map identifying areas where crime is most likely to happen.
“In the UK, forces have found that it’s about 10 times more likely to predict the location of future crime than random beat policing. It allows you to allocate a limited number of resources to where they’re most needed.”
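As a rough illustration of the kind of processing Babuta describes, the sketch below bins past crime reports into grid cells and ranks the busiest recent cells as candidate patrol areas for the next 24 hours. The data, field layout and grid size are illustrative assumptions for this sketch, not PredPol’s actual model or parameters.

```python
# Minimal hotspot-ranking sketch: count recent reports per grid cell,
# then surface the cells with the most activity. Illustrative only.
from collections import Counter
from datetime import datetime, timedelta

# Each record: (crime type, latitude, longitude, timestamp) -- anonymised, made-up data.
reports = [
    ("burglary", 51.279, 1.080, datetime(2019, 3, 1, 23, 15)),
    ("burglary", 51.280, 1.082, datetime(2019, 3, 2, 1, 40)),
    ("assault",  51.276, 1.087, datetime(2019, 3, 2, 2, 5)),
    ("burglary", 51.281, 1.081, datetime(2019, 3, 3, 22, 50)),
]

CELL = 0.005  # grid cell size in degrees (roughly 500 m), an arbitrary choice for this sketch


def cell_of(lat, lon):
    """Snap a report's coordinates to a grid cell identifier."""
    return (round(lat / CELL), round(lon / CELL))


def hotspots(reports, now, window_days=28, top_n=3):
    """Rank grid cells by recent report counts as candidate patrol areas for the next 24 hours."""
    cutoff = now - timedelta(days=window_days)
    counts = Counter(cell_of(lat, lon) for _, lat, lon, ts in reports if ts >= cutoff)
    return counts.most_common(top_n)


print(hotspots(reports, now=datetime(2019, 3, 4)))
```

In this toy run the three burglaries fall in the same cell, so that cell tops the list, which is the “hotspot map” idea in its simplest form.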
A System Inspired by Earthquake Prediction
The system works like an algorithm used to predict earthquakes. According to the Santa Cruz police department, offenders tend to commit crimes in familiar areas, which creates detectable patterns in the time and location of offences.
Postdoctoral scholar George Mohler suggested that some crimes are caused by built-in features of the environment, like the bar that closes at 2 a.m. every night and releases rowdy drunks onto a neighbourhood. Others, such as a series of gang murders or a rash of neighbourhood burglaries, happen because criminals’ success invites more crime or incites retaliation. Criminologists call this “repeat victimisation” – the criminal equivalent of aftershocks.
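A toy version of that “aftershock” idea is sketched below: each past offence temporarily raises the predicted rate of crime in the same area, and the effect fades over time. The parameters and data here are made up for illustration and are not taken from Mohler’s published models.

```python
# Toy self-exciting ("aftershock"-style) rate model of repeat victimisation.
import math


def intensity(t, past_event_times, mu=0.2, alpha=0.8, beta=1.0):
    """Expected offence rate at time t (in days): a constant background rate (mu)
    plus a decaying boost from each earlier offence, mirroring aftershock models."""
    boost = sum(alpha * math.exp(-beta * (t - ti)) for ti in past_event_times if ti < t)
    return mu + boost


# Times (in days) of earlier burglaries in one neighbourhood: made-up data.
burglaries = [0.0, 0.5, 0.7]

for t in (0.1, 0.8, 3.0):
    print(f"day {t}: predicted rate {intensity(t, burglaries):.2f}")
```

Shortly after a cluster of break-ins the predicted rate spikes, then decays back towards the background level, which is why recent offences weigh so heavily in where patrols are sent.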
Problems?
Kent Police’s PredPol system has generated controversy after research suggested it appeared to repeat systematic discrimination against black and minority ethnic offenders. Civil liberty groups argue that predictive policing perpetuates racial prejudice in a dangerous new way while lending it a veneer of technological legitimacy. It is argued that this reflects the inherent bias in the criminal justice system.
Privacy is also a concern. The system can generate a list of individuals deemed likely to perpetrate or be victims of crime. Those people may have done nothing wrong, yet they will still be assessed by the algorithm.
Some are sceptical about using machine learning for predictive policing. Because it focuses on previous crime reports, the approach may miss other important data and reinforce negative stereotypes about an area.
What are your thoughts about the expansion of predictive policing? We would love to hear your opinions below.