Professor Ariel Porat recently presented his paper (with Alon Harel), Aggregating Probabilities across Offenses in Criminal Law, at the Law and Economics Workshop. This is a forum where academic working papers are presented and discussed among interested faculty and students.
To be convicted of a criminal offense, the defendant must be proved beyond a reasonable doubt to have committed it. This remains true even when the defendant is charged with multiple offenses: guilt must be established beyond a reasonable doubt for each individual offense. As a result, a defendant may escape conviction entirely even though it is likely, though not beyond a reasonable doubt, that he committed each charged offense, and almost certain (beyond a reasonable doubt) that he committed at least one of them.
Professor Porat argues that the probabilities of these individual offenses should be aggregated so that such defendants are convicted of some crime. The question would become whether it is beyond a reasonable doubt that the defendant committed an offense, rather than whether it is beyond a reasonable doubt that he committed a specific offense. This reformulation certainly would result in more criminals being convicted (increasing deterrence), but it would also increase the number of innocent people falsely convicted. The desirability of the approach hinges on minimizing the latter.
False convictions would result from overestimating how independent multiple offenses are. If two crimes were completely independent and there were a 90% chance that the defendant is guilty of each, then there would be a 99% chance (1 - 0.1*0.1) that he committed at least one of them. However, the two offenses could also be completely dependent. For example, a defendant charged with robbing and murdering a victim likely either committed both crimes or was not involved at all. In that case there could be a 90% chance that he is guilty of each offense, yet also a 10% chance that he committed neither. A blind application of aggregated probabilities would then overestimate (99% instead of 90%) how likely it is that he committed at least one crime. If we take a 95% chance as the "beyond a reasonable doubt" threshold, it would be erroneous to convict this defendant. Recognizing this problem, Porat suggests that courts not aggregate probabilities when there is a large risk of interdependence; aggregation would not apply to the robbery/murder case. How feasible this would be remains to be seen.
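The arithmetic behind the two limiting cases above can be sketched in a few lines (a simple illustration, not anything from the paper; the 95% threshold is the hypothetical one used in the example):

```python
p = 0.90  # probability of guilt for each of the two offenses

# Fully independent offenses: the doubts multiply.
p_at_least_one_indep = 1 - (1 - p) * (1 - p)  # 1 - 0.1*0.1 = 0.99

# Fully dependent offenses (the robbery/murder case):
# he committed both or neither, so aggregation adds nothing.
p_at_least_one_dep = p  # still 0.90

threshold = 0.95  # hypothetical "beyond a reasonable doubt" level
print(p_at_least_one_indep >= threshold)  # True: aggregation would convict
print(p_at_least_one_dep >= threshold)    # False: aggregation would err here
```

The gap between 0.99 and 0.90 is exactly the overestimate that a blind application of aggregation would produce.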
Even with the difficulties and potential errors associated with interdependence, Porat argues that openly aggregating probabilities may be better than the current system. It seems reasonable that some judges and jurors already connect offenses (aggregating, knowingly or unknowingly) and are determined to convict a defendant of something, while others go to extreme lengths to keep the offenses separate. This leads to erratic application of the law. Open use of aggregated probabilities in criminal cases could conceivably lower error costs while increasing deterrence.
This is quite possibly the WORST idea for our criminal justice system since mandatory sentencing. How does this idea fulfill either retributivist ideals of punishing those who deserve punishment or utilitarian ideals (Bentham is likely rolling in his grave) concerning deterrence or rehabilitation? Our justice system is predicated on people having the benefit of the doubt. The Rule of Lenity, the burden of proof/persuasion, and other fundamental ideas spring from the idea that we would, as a rational, reasonable society, rather release a guilty man than convict an innocent man (and deprive him of his rightful freedom). We need only look to what "our" Founding Fathers endured to see why this system exists. Aggregating crimes is such a repugnant and silly idea that I'm really ashamed to have attended the University of Chicago. We charge individuals with specific crimes precisely because we wanted to ensure that our criminal justice system wouldn't be abused like this! The Constitution will need some Scotchgard to protect itself from the likes of the neo-con droves that occupy the black ugliness on the Midway.
Posted by: Rishi Nair | November 06, 2008 at 10:14 PM
Aggregating probabilities, as the author stated, can lead to a decrease in arbitrariness while at the same time increasing deterrence and maintaining current conviction levels. I certainly agree with the statement that we, as a society, should rather see a guilty man go free than an innocent man go to prison; however, if we attach a weight to this preference, we can still allow a simple Bayesian analysis to aggregate probabilities in a way that increases deterrence while preserving personal freedom. Preserving personal freedoms should not necessarily assign every conditional probability among dependent crimes to a verdict of "innocent".
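One way to read the commenter's weighting idea (a hypothetical sketch, not the commenter's own formula): if a false conviction is judged k times worse than a false acquittal, the expected-cost-minimizing rule convicts only when the probability of guilt, aggregated or not, exceeds k / (k + 1).

```python
def conviction_threshold(k):
    """Probability of guilt above which convicting minimizes expected cost,
    given that a false conviction costs k times a false acquittal."""
    return k / (k + 1)

print(conviction_threshold(10))  # Blackstone's 10:1 ratio -> ~0.909
print(conviction_threshold(19))  # a 19:1 ratio -> exactly 0.95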
Posted by: Michael | November 15, 2008 at 04:45 PM
My sense is that, given the underlying commitment to criminal defendants' rights, aggregating probabilities is more likely to reduce criminal penalties than to increase them.
If a person is guilty just barely beyond a reasonable doubt (say, with 95% probability) of each of four unrelated crimes, he may be convicted of only three, since the sub-reasonable doubts of the four cases aggregate into a reasonable doubt about one (or even two) of them. So the defendant might be penalized for only the middle two crimes, or the two least serious ones.
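Under an independence assumption, the arithmetic behind this claim can be checked directly (a sketch, assuming the four charges are statistically independent and each carries a 95% probability of guilt, as in the example above):

```python
from math import comb

p, n = 0.95, 4

def prob_at_least(k, n, p):
    """P(defendant committed at least k of n independent offenses)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

print(prob_at_least(4, n, p))  # ~0.815: reasonable doubt he committed all four
print(prob_at_least(3, n, p))  # ~0.986: beyond reasonable doubt for three
```

So with a 95% threshold, the aggregated doubts do indeed leave the defendant convictable of only three of the four crimes.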
On the flip side, where there is a slight but still reasonable doubt regarding each of the four crimes, we would have a tougher time aggregating in order to penalize the defendant, because Due Process, pseudo-Double-Jeopardy, and other criminal law doctrines block the way. So it seems aggregation would cut only in the defendant's favor.
I thought this was an excellent theoretical work.
Posted by: Uzair Kayani | November 16, 2008 at 12:54 PM