
Cognitive Biases in Legal Decisions

Legal judgments are supposed to be impartial. A judge’s decision should be based on facts and an unbiased take on the case at hand. Social psychological research, however, has shown again and again how hard it is for people to make unbiased decisions. Are legal professionals better able to control for these biases?

Not necessarily.

Recently I’ve been reading Adam Benforado’s great book, Unfair: The New Science of Criminal Injustice. In it, he covers psychological research that makes us question whether legal decisions can ever be truly “unbiased.” Eyewitness identifications can be faulty, people can confess to crimes they didn’t commit, and people often blindly accept expert testimony.

In one chapter, Benforado reviews the research on judges’ capability to remain impartial. Of course, judges want to remain impartial and probably think they’re being objective. Even so, subtle (even “irrational”) biases can easily creep into their decisions.

One of these biases relates to my recent post on the anchoring heuristic. To review, when we’re making a numerical estimate, we’re often biased by the number we start from. So let’s say I need to estimate the distance from the Earth to the moon.[1] If I start at “1 mile,” I can be pretty sure the answer is bigger than that, so I adjust upward, but I’m likely to stop before my guess gets big enough to be right. If I instead start at “3 billion miles,” I can be pretty sure the answer is smaller than that, but I’m likely to overshoot because I started so high. I’m not accurate in either case, and I’m wrong in a predictably biased way.[2]
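If it helps to make the “insufficient adjustment” idea concrete, here’s a toy simulation in Python. This is purely my own illustration, not anything from the research: the adjustment rate and the noise level are made-up numbers, chosen only to show how two different starting points pull estimates in opposite directions.

```python
import random

TRUE_DISTANCE = 238_900  # actual Earth-to-moon distance, in miles

def anchored_estimate(anchor, adjustment_rate=0.6):
    """Toy anchoring-and-adjustment: start at the anchor and move only
    part of the way toward the true value (i.e., adjust insufficiently)."""
    noise = random.gauss(0, 0.05 * TRUE_DISTANCE)  # a little estimation noise
    return anchor + adjustment_rate * (TRUE_DISTANCE - anchor) + noise

random.seed(1)
low_anchor = [anchored_estimate(1) for _ in range(1000)]               # start at "1 mile"
high_anchor = [anchored_estimate(3_000_000_000) for _ in range(1000)]  # start at "3 billion miles"

print(f"mean estimate from the low anchor:  {sum(low_anchor) / len(low_anchor):,.0f} miles")
print(f"mean estimate from the high anchor: {sum(high_anchor) / len(high_anchor):,.0f} miles")
# Both groups miss 238,900 miles, but in opposite, anchor-dependent directions.
```

The particular numbers the toy model spits out don’t matter; the point is simply that whichever anchor you hand the estimator, the final guess ends up dragged toward it.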

The Anchoring Heuristic and Judicial Decisions

A lot of the research on anchoring has focused on estimates that don’t matter too much. Okay, so I’m biased when guessing the distance to the moon, the height of Mt. Everest, and the cost of a CD.[3] So what?

Can important numerical judgments be biased by a random anchor?

Consider one pretty influential judgment: a judge’s decision about someone’s prison sentence. Should he serve 3 months? 9 months? 2 years? I think we can agree this is one numerical judgment we should care about, and even this judgment can be biased by the anchoring heuristic.

In one study, researchers created a fictional legal case involving an alleged rape, complete with all the details a judge would need to make a decision. They gave these materials to a bunch of legal professionals[4] and gave them time to read everything.

To introduce a random anchor, the legal professionals were asked to imagine that, in the course of considering this case, they get a call from a journalist who asks for a comment on the expected prison sentence. Sometimes this “journalist” asks whether the sentence would be greater than or less than 1 year. Other times, the “journalist” asks whether it would be greater than or less than 3 years.

Everyone was told to disregard the journalist’s question in order to remain unbiased, but when they finally rendered their sentencing decisions, they were clearly biased by that pesky question.

People who saw the “1 year” version of the journalist’s call ended up deciding on prison sentences that averaged about 25 months. Those who saw the “3 year” version, however, ended up deciding on prison sentences that averaged about 33 months. So a simple suggestion from a journalist (a suggestion the participants promptly, and correctly, dismissed at a conscious level) made an 8-month difference in sentencing decisions.

The Influence of Clearly Arbitrary Anchors

You could argue that in the previous study, the judge would have taken the information from the journalist as a reasonable starting point, reflecting popular opinion. But what about when the starting point is undeniably random?

In a follow-up study, the researchers created a new legal case about shoplifting and recruited more legal professionals to read the information and make some decisions. This time, the anchor was described as the prosecutor’s sentencing demand. That is, the prosecutor would call for either a relatively high or a relatively low sentence, which the judge should obviously ignore in order to assess the facts objectively.

In this study, though, the judges determined the prosecutor’s demand by rolling dice. There’s no denying that the demand was completely random![5]

Well, let’s back up a second. The judges thought the prosecutor’s demand was random, but in reality, the researchers used loaded dice. Depending on the experimental condition, the judge would roll either a 3 or a 9. In other words, half of the people in the study ended up reading a version of the case in which the prosecutor called for a 3-month sentence, and the other half ended up thinking the demand was 9 months.

Even this random, arbitrary number biased the judges’ sentencing decisions. When the dice turned up a 3-month sentencing demand, judges gave a 5-month sentence on average, but when the dice turned up a 9-month sentencing demand, they gave a 9-month sentence on average. Once again, the dice roll should have had no impact on the decision, but it clearly did!

How Concerned Should We Be?

These studies show that the same heuristics and biases that affect our everyday judgments can creep into legal judgments in criminal cases. Notably, the biases were just as strong for legal professionals who specialized in criminal law as for those with expertise in other fields.

Keep in mind, though, that the decisions in these studies were hypothetical and carried no real consequences. It’s not clear whether the same biases would come into play in real legal decisions. The participants, however, were real legal professionals, and they were quite confident in the decisions they made. If they’re not aware of the biasing impact of these irrelevant numbers, it’s not clear that they would do anything to counteract them when the decisions become more consequential.

In fact, these results mirror other research showing how numbers can bias other legal decisions. For instance, one study showed that plaintiffs who request higher awards end up receiving higher awards. Of course, the award should be decided based on the facts, not on the request. Nevertheless, just putting a number out there may subtly nudge final decisions in a particular direction.

Anchoring is just one example of common cognitive biases that impact legal decision-making. For more, I highly recommend Benforado’s book, and hopefully I’ll cover some more of that research on the blog in the future.

Footnotes

1. When would I ever do that? Maybe I’m having a conversation with my astronomer cousin and I feel a stinging urge to look like I know what I’m talking about. Don’t judge me.
2. I know. You’re dying to know. The correct answer is 238,900 miles.
3. If you’re confused, check out the post devoted to the anchoring heuristic. Spoiler alert: all of these judgments have been shown to be bias-able through anchoring.
4. A mix of judges and prosecutors.
5. As a reminder, this isn’t a real case but a fictional case that everyone knew was designed for research. So it wasn’t that weird for judges to decide parts of the case by rolling dice.
