
Is predictive policing biased?

Nov 18, 2023 | Criminal Defense

Predictive policing is sometimes called the “future” of law enforcement. It uses computer algorithms to predict where crime is likely to occur, so that instead of merely reacting to events that have already taken place, police departments can send officers to those areas and stop crime in its tracks.

One potential benefit is that, in theory, it shouldn’t be biased against anyone. A computer can’t hold racial prejudice against a particular ethnic group; the algorithm simply runs the numbers and makes predictions. Yet critics argue that bias is one of the biggest problems with predictive policing. How does this happen?

Biased algorithm training

The core issue is that the algorithm has to be trained on data supplied by the police department. If that data reflects biased policing, the algorithm’s predictions can be biased too.

For example, say an officer inputs data telling the algorithm where crimes have already taken place and who committed them, but that officer holds racial prejudices against a certain ethnic group and tends to arrest its members more often than anyone else.

The algorithm ingests this data and predicts that people in that ethnic group, or in those neighborhoods, are more likely to break the law. It dispatches units to those locations, those units make more arrests, and a feedback loop forms: before long, the original officer’s bias is baked into the algorithm, which keeps directing police to the same people in the same places.
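For readers curious about the mechanics, here is a minimal Python sketch of that feedback loop. It is a toy model built on stated assumptions (two neighborhoods with identical true crime rates, a skewed starting record of arrests, patrols sent to whichever area has the most recorded arrests, and crime recorded only where officers are present); it is not any vendor’s actual algorithm.

```python
# A minimal, hypothetical sketch of the feedback loop described above.
# All numbers are illustrative assumptions, not data from a real system.

# Two neighborhoods, A and B, with the SAME true underlying crime rate.
true_crime_rate = {"A": 0.10, "B": 0.10}

# Historical arrest counts skewed by a biased officer: neighborhood B
# starts with more recorded arrests even though the true rates match.
recorded_arrests = {"A": 100.0, "B": 200.0}

PATROLS_PER_CYCLE = 100

for cycle in range(1, 6):
    # The "predictive" step: dispatch patrols to whichever neighborhood
    # has the most recorded arrests (the predicted hotspot).
    hotspot = max(recorded_arrests, key=recorded_arrests.get)

    # Crime is only recorded where officers are actually sent, so the
    # hotspot accumulates arrests while the other neighborhood does not.
    recorded_arrests[hotspot] += PATROLS_PER_CYCLE * true_crime_rate[hotspot]

    share_b = recorded_arrests["B"] / sum(recorded_arrests.values())
    print(f"Cycle {cycle}: patrols sent to {hotspot}, "
          f"B's share of recorded arrests = {share_b:.1%}")
```

In this toy model, neighborhood B’s share of recorded arrests grows every cycle even though both neighborhoods commit crime at exactly the same rate, which is the feedback loop in miniature.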

This doesn’t mean predictive policing will never work, but it has inherent problems. Those who have been arrested need to be aware of their legal defense options, especially if they believe that any sort of bias or prejudice led to the arrest.