Beyond Use-Cases: Understanding Challenges and Opportunities of Data Science in Policing
This blog was written by Dr Miri Zilka, a CFI Associate Fellow. It is based on her paper, Beyond Use-Cases: A Participatory Approach to Envisioning Data Science in Law Enforcement, co-authored with Caitlin Kearney, Jiri Hron and Helen Kosc, published on June 3, 2024 and presented asynchronously at FAccT 2024.


In an era of shrinking budgets and increasing demands, law enforcement agencies are turning to data science for solutions. However, the adoption of tools like facial recognition and predictive policing has sparked controversy and raised concerns about bias, accuracy, and potential misuse.

While most academic studies in this area focus on specific tools, we took a different approach, investigating how strategic decisions about where, when, and how to employ data science are made in policing. We spoke to 40 practitioners from Police Scotland in a series of in-person and online workshops and interviews, asking them to describe what failed and successful futures of data science would look like in their organisation, and to discuss barriers to achieving success.

Our participants converged on four possible ways integration of data science may fail in their organisation:

  1. Technological stagnation. The most mundane and collectively shared failure case is where Police Scotland continues to rely on its already outdated systems. All the current fissures widen, and the police become increasingly unable to prevent and respond to crime.
  2. Veneer of success. Here Police Scotland deploys several new data science tools, but after the initial optimism, systemic issues surface. These may include unreliability, bias, and overall lack of trustworthiness of either the data or the insights.
  3. Insights ≠ better decisions. In this scenario, new tools are adopted, provide relevant insights for specific use cases, and their shortcomings are well documented. The issue lies in translating these insights into better policing outcomes. This future considers how adoption and reliance on future tools may change the officer role, reduce decision-making capacity and traditional investigative work skills, and increase data work and overall job dissatisfaction.
  4. Automation removing steps vital for public confidence. This case concerns the impact of automation on public confidence. As Scotland residents, our participants were concerned about how removing human steps from policing processes could impair public trust.

When asked to describe how a positive future could be achieved, participants primarily talked about ensuring the above pitfalls are avoided. They further mentioned:

(i) low-hanging fruit: several simple existing applications whose adoption would improve their day-to-day work;

(ii) participatory practices in creation of new tools to ensure consideration of policing craft, operational context, and regional requirements, with representation from ‘everybody that’s going to put their hands on it’ (officer quote);

(iii) training in how data science can support officer work.

Instead of specific applications, the discussions coalesced around higher-level issues such as (i) systemic challenges around data collection and use, (ii) goal misalignment between leadership and operational levels, (iii) fear that datafication may undervalue important aspects of policing, and (iv) appropriate modes of interaction between data science teams and operational officers.

Bucking the external trend, our participants showed little appetite for incorporating tools like facial recognition and risk assessment. Although leaders were more affected by pressures to innovate and improve efficiency, they were as sceptical towards predictive technologies as the officers. Public expectations and consent played a significant role:

‘Everyone wants to talk about facial recognition... [but] we don’t have the consent to use it from the [Scottish] public. If it impacts public confidence, we won’t do it’ 

(leadership quote)

Another concern specific to Police Scotland is how the incorporation of data science may affect its focus on community-oriented policing. Officers expressed fear that datafication could lead to the reduction or loss of preventative work, a key aspect of community-oriented policing. More generally, officers felt community-oriented policing would suffer if officer time with members of the community were reduced:

‘We had a great cop … the communities loved him, because they all knew who he was, they could all approach him, they knew how he would deal with things, and he had a way about him that on our computer systems there’s no way of logging … [N]ewer cops coming in with the new systems and new things in place didn’t see the benefit of what he did.’

(officer quote)

Nonetheless, our participants generally felt that the use of data tools should be increased within the organisation to avoid being left behind. Officers’ doubts that positive change can be achieved are primarily rooted in negative past experiences with data-driven approaches, characterised by misaligned incentives, cycles of hype, and ultimate disappointment. Past over-reliance on metrics in particular undermined officers’ perceived autonomy and damaged public trust. Current leadership, aware of past failures, remains sceptical of predictive technologies and is instead more interested in using data to provide organisational insights.

To learn more about our study, what we learned, and how our work fits with existing findings, search for ‘Beyond Use-Cases: A Participatory Approach to Envisioning Data Science in Law Enforcement’ in the accepted papers section of FAccT 2024.
