
New Research: Covert Use of Predictive Policing Software Raises Concerns

The Citizens discovered that the UK government is now deploying software that was used to shut down Black Lives Matter protests in the USA. Research by LCFI's Dr Eleanor Drage and co-author Dr Federica Frabetti of the University of Roehampton shows that this software is more likely to flag crowds of Black and Brown people as a potential threat.

The findings show that UK police are likely using it as a predictive policing tool. But rather than predicting future crime, it reflects patterns of past discrimination and embeds them further into police practice. Far from being a neutral observational tool, Dataminr is more likely to send law enforcement clients an alert about gatherings that politically oppose the government. It even claims to offer clients (e.g. the police) the "lens" that suits their worldview.

Based on machine learning, Dataminr "learns" to identify a protest by scanning data and making associations based on past events, in line with law enforcement's existing concerns and perceptions of what a dangerous protest is. In 2016, Dataminr worked with South African law enforcement to monitor students at the Shackville protests, which called for Black students to have access to decent university accommodation at the University of Cape Town.

Although the American Civil Liberties Union forced a partial ban on Dataminr's use by US police, there are no limitations on how Dataminr can be used in the UK. The authors are concerned about the lack of transparency and oversight in how AI is used in policing. They point to the current policing context, notably the Public Order Bill's aim to shut down protests like Just Stop Oil's "slow marching" before disruption begins. The Bill is still firmly contested in the House of Lords.

For more information, the Byline Times article can be found here.

The full journal article can be found here.
