When predictive policing software was first developed, it was praised as a way to take bias and discrimination out of police work. Unfortunately, it did the exact opposite.
In general, this software tracks data on arrests and alleged criminal activity, runs those numbers through an algorithm, and predicts when and where crime may happen. There’s no guarantee, but by identifying these “hot spots,” police can patrol them to make arrests or deter crime.
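To make the mechanics concrete, here is a minimal sketch of how a hot-spot predictor of this kind might work. It is an illustration only, not any vendor’s actual product; the area names, records, and the predict_hot_spots function are all hypothetical, and real systems use far richer inputs than raw arrest counts.

```python
from collections import Counter

# Hypothetical arrest records as (area, hour-of-day) pairs.
# Real tools ingest far richer data, but the core idea is the same:
# past reports drive future predictions.
arrest_records = [
    ("Area A", 22), ("Area A", 23), ("Area B", 14),
    ("Area A", 21), ("Area C", 2),  ("Area A", 22),
]

def predict_hot_spots(records, top_n=2):
    """Rank areas by historical arrest count and return the top N as 'hot spots'."""
    counts = Counter(area for area, _hour in records)
    return [area for area, _count in counts.most_common(top_n)]

print(predict_hot_spots(arrest_records))  # ['Area A', 'Area B']
```

Notice that nothing in this ranking asks where crime actually occurs. It only asks where arrests were recorded, which is exactly the weakness discussed below.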
You can see why this would appear to reduce bias: the computer holds no biases of its own, and the officers themselves are not even deciding where to go. But there’s one major problem.
The software gets its data from the officers
The problem is that the data itself has to come from somewhere. If there are any biases in that data, the system reflects them at best. At worst, it amplifies them.
For example, critics point out that arrest rates are far from equal for white Americans and African Americans. Based on these rates, a member of a minority group may face roughly twice the odds of being arrested.
If the arrest rate for minority groups is higher because the officers making the arrests are biased against those groups, then the data is skewed from the moment it enters the system. It is not a fair representation of how crime actually happens.
The algorithm then takes this data, reflects and amplifies the bias, and predicts that more people in these same areas are going to break the law. Officers are dispatched primarily to these areas, where they make more biased arrests and feed that data back into the system. The result is a feedback loop: in the end, the computer operates in just as discriminatory a fashion as any human officer ever could.
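A toy simulation makes this loop easy to see. This is a deliberately simplified sketch, not any real vendor’s model; the area names, starting counts, and crime rate are all made up. Both areas have the same underlying crime rate, yet the area that begins with slightly more recorded arrests steadily dominates the data.

```python
# Toy feedback-loop simulation (hypothetical numbers, not a real system).
# Both areas have the SAME true crime rate; Area B merely starts with a
# few more recorded arrests due to biased enforcement.
TRUE_CRIME_RATE = 10                      # identical in both areas
arrests = {"Area A": 10, "Area B": 12}    # small initial skew toward Area B

for round_num in range(1, 6):
    # The algorithm flags the area with the most recorded arrests...
    hot_spot = max(arrests, key=arrests.get)
    # ...so patrols go there and record new arrests, which feed back in.
    # Crime in the unpatrolled area goes largely unrecorded.
    arrests[hot_spot] += TRUE_CRIME_RATE
    share_b = arrests["Area B"] / sum(arrests.values())
    print(f"Round {round_num}: patrolled {hot_spot}, "
          f"Area B's share of the data = {share_b:.0%}")
```

After five rounds, Area B accounts for roughly 86% of the recorded arrests even though it never had more crime. The initial skew, not the underlying reality, determines where the data piles up.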
Have you been profiled and arrested?
Police profiling is a huge issue that impacts many communities. If you have been profiled and arrested, be sure you know what legal defense options you have. You need a legal team that will fight for your rights.