Justice, Big Data, and AI

Dalton Yoder
Mar 22, 2021

In this chapter of Cathy O’Neil’s book, Weapons of Math Destruction, she discusses the big data programs that police departments across the country use to optimize their patrols. The data predicts where crime is most likely to happen, so that officers don’t waste time in areas where they aren’t needed. On the surface, this seems fine: it makes local law enforcement more efficient, and many smaller departments across the country have used it to cope with budget cuts. However, there is a darker side to this.

That darker side is how these programs can affect minorities in the communities where they are run. For example, when a program is widened to include “smaller” nuisance crimes, certain neighborhoods get targeted more heavily, which sends more officers to patrol those areas. Those patrols generate more recorded crime, which feeds back into the model and sends even more officers, creating a feedback loop. In this way the algorithm becomes more biased against minority communities, which is something we do not want from a justice standpoint. The toy simulation after this paragraph sketches how that loop can compound.
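To make the loop concrete, here is a small, purely hypothetical simulation. It is my own sketch, not anything from O’Neil’s book or a real policing system: it assumes two neighborhoods with identical underlying crime, that minor offenses only get recorded where officers are present to see them, and that the department shifts 5% of its patrol capacity toward whichever area recorded more crime each period. Every number in it is made up for illustration.

```python
# Toy sketch of the patrol feedback loop (hypothetical, for illustration only).
# Areas "A" and "B" have identical underlying crime, but A starts with
# slightly more patrols, so more of its minor "nuisance" crime gets recorded.

NUISANCE_CRIME = 200                     # minor incidents per period in EACH area,
                                         # only recorded when an officer sees them
patrol_share = {"A": 0.55, "B": 0.45}    # a small initial imbalance
SHIFT = 0.05                             # assumed: 5% of patrol capacity moves
                                         # toward the "higher crime" area each period

for period in range(6):
    # Recorded nuisance crime scales with how heavily each area is patrolled.
    recorded = {area: round(NUISANCE_CRIME * share)
                for area, share in patrol_share.items()}

    # The "model" sees more recorded crime in A, so it shifts patrols toward A...
    hot = max(recorded, key=recorded.get)
    cold = "B" if hot == "A" else "A"
    moved = min(SHIFT, patrol_share[cold])
    patrol_share[hot] += moved
    patrol_share[cold] -= moved

    # ...which means even more of A's nuisance crime gets recorded next period.
    shares = {a: round(s, 2) for a, s in patrol_share.items()}
    print(f"period {period}: recorded={recorded}, next patrol share={shares}")
```

Even though both areas have the same true crime rate, the area that starts out with more patrols ends up with almost all of them within a few periods, because the data it generates keeps “confirming” the earlier decision.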

To prevent this from occurring, one idea of mine is to define the program’s inputs more narrowly, perhaps by limiting which crimes count toward patrol decisions. Filtering out the minor, discretionary offenses could take the pressure off these communities while still allowing patrols to target serious crime; a rough sketch of that kind of filter follows below.
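Here is a minimal sketch of what that filter might look like. It is again entirely hypothetical: the offense categories and record format are invented, and a real system would need a much more careful definition of which crimes should count.

```python
# Hypothetical input filter: only let serious offenses drive patrol allocation,
# so discretionary nuisance stops (loitering, public intoxication, etc.)
# no longer inflate a neighborhood's numbers.

SERIOUS_OFFENSES = {"homicide", "assault", "robbery", "burglary"}

def incidents_for_patrol_model(incidents):
    """Keep only incidents whose offense type should count toward patrols."""
    return [i for i in incidents if i["offense"] in SERIOUS_OFFENSES]

reported = [
    {"area": "A", "offense": "loitering"},
    {"area": "A", "offense": "public_intoxication"},
    {"area": "A", "offense": "burglary"},
    {"area": "B", "offense": "assault"},
]

print(incidents_for_patrol_model(reported))
# Only the burglary in A and the assault in B are counted,
# so the nuisance stops don't feed the patrol model at all.
```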

For me, the biggest takeaway from this chapter is that the crimes which should be treated as top-tier offenses are not recorded that way. Because of that, the reported data is skewed, and the communities that most need room to grow are the ones hurt by the statistics. What we need to do is change how these crimes are weighed in the system, and in my honest opinion that could lead to a greater age of data: one without bias, where justice can reign supreme. Hopefully this happens in the near future, so we can all reap the benefits of a safer and more just community.
