Personal Attention.
Aggressive Defense.

Photo of Thomas C. Mooney

Predictive policing may produce significant bias

On Behalf of | Dec 20, 2023 | Criminal Defense

Police are not supposed to exhibit bias when patrolling, making traffic stops or making arrests. For instance, an officer is not supposed to target people of a certain ethnicity, as doing so would constitute unfair treatment by the authorities.

Predictive policing uses computer models to forecast where crimes are likely to occur. The idea is that police can dispatch units to those locations in advance, preventing the crime or making an arrest. Many people assume that a computer cannot exhibit bias, but predictive policing has an inherent flaw.

The data comes from individual officers

Critics of predictive policing argue that it is, in fact, deeply biased and should not be used. The reason is that the data used to train the system comes from individual police officers: their arrest records and patrol reports are what tell the algorithm where crime might happen.

For example, suppose a police officer is biased against people from a certain racial group. The officer knows the neighborhoods where these individuals tend to live, so he or she patrols those areas more heavily and makes more arrests there.

That arrest data is fed into the algorithm, which then predicts that people in these neighborhoods are far more likely to break the law than people in other areas. The system sends more police officers to the same neighborhoods, producing even more arrests and creating a feedback loop. The algorithm ends up concentrating enforcement in these areas, but the root cause isn't the algorithm itself – it's the initial racial bias of the officers whose arrests trained it.
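The feedback loop described above can be sketched in a few lines of Python. This is a toy illustration only, not any real predictive policing system: the numbers (a 20% arrest bias, ten rounds, an even starting split) are made up, and both areas are given the exact same underlying crime rate, so any drift in patrol allocation comes purely from the initial officer bias.

```python
# Toy model: two areas, A and B, with identical true crime rates.
# Officers carry a small arrest bias in area A, and a hypothetical
# "predictive model" allocates the next round's patrols in proportion
# to the previous round's arrests.

def patrol_shares(rounds=10, start_share_a=0.5, officer_bias=1.2):
    """Return area A's patrol share after each round of the feedback loop."""
    share_a = start_share_a
    history = [share_a]
    for _ in range(rounds):
        # Arrests scale with patrol presence; the bias factor means officers
        # in area A make slightly more arrests per patrol than in area B,
        # even though actual crime is identical in both areas.
        arrests_a = share_a * officer_bias
        arrests_b = 1 - share_a
        # The model reads more arrests as more crime and shifts patrols there.
        share_a = arrests_a / (arrests_a + arrests_b)
        history.append(share_a)
    return history

history = patrol_shares()
print(round(history[0], 2), "->", round(history[-1], 2))  # 0.5 -> 0.86
```

Even though both areas break the law at the same rate, area A's share of patrols climbs every round, because the model mistakes biased arrest counts for actual crime.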

Do you believe the police treated you unfairly during an arrest, or that they targeted you based on your ethnicity, racial background or national origin? If so, it is important to understand all of your legal options.