
Police computer models may still be biased

Mar 26, 2024 | Criminal Defense

Police officers sometimes use computer models to predict where crime is likely to occur. This is called “predictive policing,” and it is essentially an analysis of historical crime data. The computer looks at statistics on the types of crimes that occur, who tends to commit them, and where and when they happen. Officers can then be dispatched to the flagged locations to prevent crime before it happens.
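For readers who want to see what that analysis looks like in practice, here is a minimal sketch in Python. It only illustrates the general idea of ranking locations by past incident counts; the grid cells, the incident log and the dispatch rule are all made-up assumptions for this example, not any real department's or vendor's algorithm.

```python
from collections import Counter

# Hypothetical incident log: (grid cell where it happened, crime type).
incident_log = [
    ("cell_A", "burglary"), ("cell_A", "theft"), ("cell_B", "assault"),
    ("cell_A", "theft"),    ("cell_C", "theft"), ("cell_A", "burglary"),
]

# Score each cell by how many incidents were recorded there.
scores = Counter(cell for cell, _ in incident_log)

# Flag the highest-scoring cells as "hotspots" and send patrols there.
patrol_targets = [cell for cell, _ in scores.most_common(2)]
print(patrol_targets)  # ['cell_A', 'cell_B'] -- the cells with the most records
```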

Supporters of the system claim that it offers several advantages. One is that a computer is not supposed to be biased. Police officers themselves may be, if they hold prejudices based on race, ethnicity, income level or other such factors, and those officers may unjustly arrest or disproportionately target people in those groups compared to their peers. A computer system, the argument goes, should be relatively fair because it is only looking at statistical data.

So why doesn’t this work?

Unfortunately, research has found that computer models are still biased. They can still flag specific areas as crime hotspots unfairly, producing biased results and unjust arrests. The system is not nearly as fair as its supporters assume.

The reason it doesn't work is that the statistics fed into the computer still come from police officers themselves. If an officer is biased and arrests people in a certain ethnic group far more often than others, the computer model will reflect that bias. Crime reports, not just arrest records, show the same skew. The model then directs more patrols to the neighborhoods it has flagged, those patrols generate more arrests and reports there, and that new data is fed back into the model. Over time, this feedback loop can amplify the original bias and make biased policing worse.
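To see how that amplification can happen, here is a toy simulation under deliberately simplified assumptions: two districts with the same true crime rate, crime that is only recorded where patrols are sent, and a model that sends each year's patrols to whichever district has the most recorded crime. None of this reflects any real department's system; it only illustrates the feedback loop described above.

```python
import random

random.seed(0)
TRUE_CRIME_RATE = 0.3            # the same in both districts
recorded = {"A": 30, "B": 10}    # biased starting data: A was over-policed
PATROLS_PER_YEAR = 50

for year in range(10):
    # The model flags the district with the most recorded crime as the
    # hotspot and sends that year's patrols there.
    hotspot = max(recorded, key=recorded.get)
    # Crime is only recorded where officers are present, so only the
    # flagged district adds to its count, even though the underlying
    # crime rate is identical in both districts.
    recorded[hotspot] += sum(random.random() < TRUE_CRIME_RATE
                             for _ in range(PATROLS_PER_YEAR))

print(recorded)  # A's count grows every year; B's never changes
```

Because district A started with more records, it keeps getting flagged, keeps getting patrolled and keeps generating new records, while district B, with the same amount of actual crime, looks “safe” in the data.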

Do you believe you were arrested because of police bias, perhaps in violation of your rights? If so, you need to know exactly what defense options you have.