UK regional institutions implement AI in the hope of preventing crime

Local councils and police forces in UK cities are using personal data collected from the public to build algorithms intended to predict crimes against children.

Research conducted by Cardiff University and Sky News shows that around 53 local councils and 45 police forces rely heavily on computer algorithms to assess the risk of crimes against children and of benefit fraud.

The use of this technology has raised public concern about both its effectiveness and its ethical implications.

The algorithms were obtained from IT companies, which used personal data to train the AI systems to forecast how likely a child is to become a victim of crime based on their social environment. Each child is given a score between 1 and 100 and ranked on a risk scale from low to high. The outcome is then used to alert social workers so they can intervene before a crime is committed. In the sense that it ranks people, this system is somewhat similar to China's well-known Social Credit system; unlike that system, however, it offers children no benefits, such as admission to better schools as a reward for good behavior.
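Purely as an illustration of the scoring mechanism described above, the following sketch shows how a 1–100 score might be mapped to risk bands and used to trigger an alert. The band names, thresholds, and function names here are invented for this example; none of them come from the actual council or police systems.

```python
# Hypothetical illustration only: the cut-offs and band names below are
# assumptions, not values from any real council or police system.

ALERT_THRESHOLD = 70  # assumed score above which social workers are alerted


def risk_band(score: int) -> str:
    """Map a 1-100 risk score to a coarse low/medium/high band (assumed cut-offs)."""
    if not 1 <= score <= 100:
        raise ValueError("score must be between 1 and 100")
    if score < 40:
        return "low"
    if score < ALERT_THRESHOLD:
        return "medium"
    return "high"


def should_alert(score: int) -> bool:
    """Return True if the score would (in this sketch) trigger intervention."""
    return score >= ALERT_THRESHOLD
```

In practice, as the article goes on to note, such a threshold replaces rather than supports human judgment if its output is taken at face value.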

Last year, The Guardian found that data from over 377,000 people were used to train algorithms for a similar purpose. The Cardiff University research likewise found that in the city of Bristol alone, data on benefits, school attendance, homelessness, crime, teen pregnancy, and mental health from 54,000 families were collected and used to determine which children were most likely to be exposed to sexual abuse and domestic violence.

The IT system for the Universal Credit scheme has already failed to correctly assess benefit claims in the UK. Last week, computer-generated letters were sent to residents in certain areas warning them that their benefits could be withdrawn because they had been caught cheating. Most of those warnings turned out to be wrong.

A spokesperson from a private advocacy group commented on the issue and told Sky News: “While it’s advertised as being able to help you make a decision, in reality it replaces the human decision. You have that faith in the computer that it will always be right.”

Researchers at Cardiff University stated that "there was hardly any oversight in this area." That matters because intervention can itself harm a child's development, especially if the child is removed from their family when it is not necessary.

The accuracy of the technology poses a serious issue. Kent Police, for instance, trusted that 98% of the cases flagged by its algorithm were accurate.

While the use of AI can be financially attractive to local councils and police forces, which have been under pressure from spending cuts, the results should not be taken at face value. They are better treated as helpful references for identifying potential problems rather than as definite truths.