

Channel: Technology and society

Jack Chuter
Writer/Podcaster

Co-host of the Episode Party podcast, author of Storm Static Sleep: A Pathway Through Post-rock, editor at ATTN:Magazine.

piqer: Jack Chuter
Wednesday, 06 February 2019

The Injustice Embedded Within Predictive Policing

It's an explosive opening. At the LAPD Board of Commissioners meeting, a charitable donation is approved to transform a conference room into a Community Safety Operations Center (CSOC). Shouts of "shame on you!" start to drown out the proceedings, and the commission president threatens to terminate the meeting if the unrest continues. As is so often the case with Hi-Phi Nation, the significance of the event is not immediately apparent. Yet by the end of this 40-minute investigative piece – which combines first-hand interviews with thorough philosophical deliberation – this moment is reframed within an important debate about predictive policing, bias and the façade of data neutrality.

CSOCs form part of technological upgrades currently taking place within the LAPD and elsewhere, using statistical science to anticipate crime before it happens. The studies behind these methods are all published in peer-reviewed academic journals, and the operation boasts that its algorithms are utterly devoid of prejudice on the basis of, say, race or ethnicity – unlike policing efforts that rely more heavily on human intuition.

Of course, it doesn’t take long to dispel this claim to neutrality. As host Barry Lam explains, these methods create feedback loops of reinforcement. Heavily policed areas become the places where most criminality is observed, which only cements their status as crime hotspots. The show takes a trademark philosophical turn in the second half, highlighting a crucial paradox within our reliance on algorithms: the more accurate our statistical method for identifying future criminals, the more immoral it becomes to use it. A process that identifies criminals with 90% accuracy still subjects the remaining 10% to unjust treatment. An accuracy of 95% only increases our dependency on the method, thereby intensifying the harassment of an innocent 5%. As it transpires, the algorithms that purport to eradicate all trace of human bias only work to amplify it.
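
To make the feedback loop concrete, here is a minimal simulation sketch (mine, not the show's), assuming two districts with identical underlying crime rates and a patrol budget allocated in proportion to previously observed crime. The district names, rates and counts are all illustrative.

import random

random.seed(1)

TRUE_CRIME_RATE = 0.05        # identical in both districts, by assumption
PATROLS_PER_DAY = 100         # fixed patrol budget to split between them
observed = {"A": 2, "B": 1}   # one chance arrest gives district A a head start

for day in range(365):
    total_observed = observed["A"] + observed["B"]
    for district in ("A", "B"):
        # Patrols go where crime has been recorded before ...
        patrols = round(PATROLS_PER_DAY * observed[district] / total_observed)
        # ... and every extra patrol is an extra chance to record crime there,
        # even though the true rate is the same everywhere.
        for _ in range(patrols):
            if random.random() < TRUE_CRIME_RATE:
                observed[district] += 1

# Both districts offend at the same rate, yet A's early head start never
# reverts: its share of patrols, and hence of recorded crime, keeps
# "confirming" that A is the hotspot.
print(observed)

This is a Pólya-urn-style dynamic: proportional reinforcement preserves and entrenches early random fluctuations instead of averaging them away, which is why the recorded data can never exonerate the over-policed district.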
