In the absence of pre-cognitive superbeings and Tom Cruise, how are police and policy makers supposed to allocate scarce crime-fighting resources? There is a vibrant academic literature on predicting crime, with models of various types offered as the best way of estimating future crime rates. Many of these involve mapping software, which plots the past in the hope of extrapolating to the future. Police use some of these techniques, but most are very crude, relying on proxies like weather or the location of liquor stores to identify "hot spots" and estimate crime rates. Police also rely on experience and gut instinct. All of these methods, whether formal models or intuitions inside the head of the police commissioner, are deployed in haphazard and isolated ways. In this lecture, recorded May 13, 2008, as part of the Chicago's Best Ideas lecture series, Assistant Professor of Law M. Todd Henderson presents an alternative.
Video of the talk is embedded below, and a .mov file and .mp3 file are also available. Prof. Henderson's paper on this topic (written with Justin Wolfers and Eric Zitzewitz) is available from SSRN.