
Why doesn’t predictive policing remove bias?


It is well known that some police officers hold biases. An officer may be more likely to arrest people of a certain race or ethnic background, for example. Although police are expected to treat everyone fairly and equally, this does not always happen.

One potential way to combat this issue is to use computers that analyze data, a method often referred to as predictive policing. An algorithm can look at arrest data and other details, determine where criminal activity is most likely to occur, and then direct officers to patrol those areas.

Critics of these systems, however, argue that they remain highly biased. Why does this happen?

Reflection and amplification

The issue is that the predictive policing model simply reflects the data it is given. In some cases, it can even amplify preexisting biases, making trends appear more pronounced.

For instance, consider a police officer who holds a racial bias against certain suspects. That officer may arrest more people of a particular racial background. When that arrest data is fed into the predictive policing model, the model concludes that people from this community are more likely to commit crimes.

But that isn’t necessarily the case. In reality, the algorithm is just amplifying the biased data it received from the officer. This results in more patrols being dispatched to that area, leading to more arrests and creating a vicious cycle of targeted policing.
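
To see how this cycle can unfold, here is a minimal, hypothetical simulation. It is not based on any real department's system or data; the two neighborhoods, the bias multiplier, and the patrol numbers are all illustrative assumptions. Both neighborhoods have the same true crime rate, but the model assigns patrols based on past arrests, and arrests in one neighborhood are recorded more readily.

import random

TRUE_CRIME_RATE = 0.05                   # identical underlying rate in both neighborhoods
BIAS_MULTIPLIER = {"A": 1.0, "B": 1.5}   # assumed: arrests 50% more likely per stop in B
TOTAL_PATROLS = 100

arrests = {"A": 10, "B": 10}             # seed data the model starts from

for year in range(1, 6):
    total = arrests["A"] + arrests["B"]
    # "Predictive" step: allocate patrols in proportion to past arrest counts.
    patrols = {n: round(TOTAL_PATROLS * arrests[n] / total) for n in arrests}
    # Policing step: more patrols plus biased arrest decisions produce more recorded arrests.
    for n in arrests:
        for _ in range(patrols[n]):
            if random.random() < TRUE_CRIME_RATE * BIAS_MULTIPLIER[n]:
                arrests[n] += 1
    print(f"Year {year}: patrols={patrols}, cumulative arrests={arrests}")

Even though the true crime rates are equal, neighborhood B's share of patrols and recorded arrests grows year after year. The model is not uncovering more crime; it is feeding the initial bias back into its own predictions.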

Your legal rights

This is just one example of the modern challenges that arise where technology and law enforcement meet. If you have been arrested, it is important to understand your legal rights and the defense options available to you.