Thinking, Fast and Slow for Lawyers (Part 2)


In Part 1, we discovered how our brains are, well… lazy. This time, let’s delve into the power of storytelling—and how this can mislead us.

History is littered with examples where storytelling has governed human thought. Our minds love juicy stories, whether the latest piece of gossip about someone you know or the latest shocking news story. There is a reason why internet clickbait and the 24-hour news cycle exist. They are, in part, designed to attract audiences who want stories that back up their existing system of belief.

I don’t want realism, I want magic!

On my recent travels to Slovenia, I visited Lake Bled—the setting for a picturesque castle perched atop a hill and a monastery in the middle of the lake. It felt like a version of heaven.

If this isn’t heaven, then I don’t know what is!

Slovenia is also a land of magical folktales. One of the restaurants at Lake Bled is Oštarija Babji Zob, a cozy dinner spot in the local town. “Babji Zob” translates as “Hag’s Tooth” and refers to a tooth-shaped peak visible from one of the windows.

Hopefully no teeth in the stew.

According to our waiter, a witch who lived in the mountains was furiously casting spells on a cold, thundery day. One of the spells went awry and she ended up throwing her tooth, which landed at the top of the mountain peak above a cave system now known as “Babji Zob”. It’s a delightful story in the vein of the Brothers Grimm. But upon checking the restaurant’s website, one is slightly disappointed to find that there are other variants of the Hag’s Tooth story:

“According to the legend, an old witch lived on the Jelovica plateau, gathering herbs and casting spells across the valley. One day, while gathering herbs, she slipped and fell into the depths, from where her old spells could still be heard. And, echoed by the mountains, her curse could be heard all the way to the valley: Jelovica, you shan’t be as steep as before, let there be a rock at this place.”

Source: Oštarija Babji Zob

I like the first version more.

Final obligatory holiday snap.

It shows how we become attached to stories that seem to offer a more interesting message or an explanation of the natural or human world around us. As Blanche, the tragic heroine of Tennessee Williams’ famous play A Streetcar Named Desire, says:

“I don’t want realism. I want magic! Yes, yes, magic! I try to give that to people. I misrepresent things to them. I don’t tell the truth, I tell what ought to be the truth. And if that’s sinful, then let me be damned for it!”

Coherence = We like a nice, believable story

Let’s jump right back to TFS.

A significant aspect of legal practice is about presenting a good, coherent narrative. No lawyer would appear on Day 1 of a trial without a suitable narrative about why their client is right. It’s a version of a story that lawyers hope the judge will accept.

TFS is blunt about the realities behind the “power of storytelling”:

“… we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.”

In short, humans like a good story that they can believe. This has implications for jury trials and how you go about persuading judges.

Now, roll right along to check out some of our flaws—and yes, they apply to us all.

The halo effect

Another descriptor for the “halo effect” is “exaggerated emotional coherence”. TFS notes that it “describes our tendency to like (or dislike) everything about a person—including things that you have not observed”.

One example is meeting a woman named Joan for the first time at a party. You find that she is personable and easy to talk to. If you were later asked whether she was also charitable and generous, you would likely say, “yes, of course she would be”. Your assumption that Joan has a string of other nice character traits is, of course, unsubstantiated. Your brain took a shortcut to assume that a “good person” must be a good egg overall.

Conversely, Kahneman notes that most people would struggle to accept this statement: “Adolf Hitler loved children”.

In the world of perceptions, the halo effect is a currency that people wittingly (if you’ve read TFS) or unwittingly trade on. Lawyers know this well. One of the most powerful and controversial tools of cross-examination is to cast doubt on the character or judgment of a witness. Once a judge or jury accepts this suggestion, everything that person does or says is coloured by that perspective.

Hearing one side of the story

Remember WYSIATI? I covered this in Part 1. In short, it stands for “what you see is all there is” and explains how humans tend to focus on what they can see while ignoring other possibilities.

WYSIATI makes a return in a research study about what happens when people hear one-sided evidence in legal cases—and know it.

In one study, all participants read the same background material. They were then split into groups where they either:

  • heard presentations given by the lawyers for both parties; or
  • heard the presentations given by the lawyers for only one of the parties.

Kahneman noted that “the lawyers added no useful information that you could not infer from the background story” (a possibly subtle and amusing comment about lawyers). In any event, the point was that the participants already had all the factual material available to them. They could, by themselves, have come up with a reasonable explanation for why each party could have had justice on their side. But they didn’t.

The researchers found that participants who heard one-sided evidence (and knew it) were more confident about their judgments than participants who had seen both sides. Kahneman explained that it was:

“… just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness.”

This is why judges are extra cautious about ex parte applications (“ex parte” is Latin for “on or from only one side”). These often come up in insolvency proceedings where a plaintiff liquidator seeks a court order against the defendant in their absence. There is a heavy onus on lawyers in an ex parte proceeding to be completely frank with the presiding judge. This includes giving the other party’s side of the story. See, for example, rule 19.4 of the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 (NSW):

19.4    A solicitor seeking any interlocutory relief in an ex parte application must disclose to the court all factual or legal matters which:

19.4.1    are within the solicitor’s knowledge;

19.4.2    are not protected by legal professional privilege; and

19.4.3    the solicitor has reasonable grounds to believe would support an argument against granting the relief or limiting its terms adversely to the client.

[Emphasis added]

This rule is one example of an attempt to counter the effect of the one-sided evidence that TFS talks about.

The law of small numbers

The power of extrapolation is part of our human love for developing a coherent, believable story. It crops up in many different contexts, like: “I saw that happen once—it must happen a lot”.

TFS states that:

“The strong bias toward believing that small samples closely resemble the population from which they are drawn is also part of a larger story: we are prone to exaggerate the consistency and coherence of what we see.”

The problem? We prefer a good, believable story over actual statistics. This is known as “base-rate neglect” (ignoring the base rate). TFS refers to a study in a legal context:

“A cab was involved in a hit-and-run accident at night.

Two cab companies, the Green and the Blue, operate in the city.

You are given the following data:

>> 85% of the cabs in the city are Green and 15% are Blue

>> A witness identified the cab as Blue. The court tested the reliability of the witness under the circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colours 80% of the time and failed 20% of the time.

What is the probability that the cab involved in the accident was Blue rather than Green?”

Considering probability in this situation requires us to consider both:

  • the base rate (i.e. the % of Green cabs and Blue cabs driving around at any one time); and
  • the reliability of the witness (i.e. the likelihood that the witness is right or wrong).

Most people give the intuitive answer—where they ignore the base rate and go with the witness. The most common answer was 80%. (The correct answer is 41%.)
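The 41% figure isn’t magic; it drops out of Bayes’ theorem, which weighs the witness’s reliability against the base rate. Here is a minimal sketch in Python (the variable names are my own, not from TFS):

```python
# The cab problem from TFS, worked through with Bayes' theorem.
p_blue = 0.15      # base rate: 15% of cabs in the city are Blue
p_green = 0.85     # base rate: 85% of cabs are Green
p_correct = 0.80   # witness identifies the colour correctly 80% of the time
p_wrong = 0.20     # ...and gets it wrong 20% of the time

# P(witness says "Blue") = true identifications of Blue cabs
# plus misidentifications of Green cabs as Blue.
p_says_blue = p_blue * p_correct + p_green * p_wrong  # 0.12 + 0.17 = 0.29

# P(cab is Blue | witness says "Blue") by Bayes' theorem.
p_blue_given_says_blue = (p_blue * p_correct) / p_says_blue

print(round(p_blue_given_says_blue, 2))  # 0.41
```

In other words, because Blue cabs are rare, a sizeable share of “it was Blue” reports come from the witness misidentifying one of the many Green cabs, so the true probability is well below the intuitive 80%.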

TFS then moves on to present a second version, where this time we are instead told that:

“You are given the following data:

>> The two companies operate the same number of cabs, but Green cabs are involved in 85% of accidents.

>> The information about the witness is as in the previous version.”

This time, our response is vastly different and closer to the correct answer of 41%. Kahneman humorously notes that:

“In the first version, the base rate of Blue cabs is a statistical fact about the cabs in the city. A mind that is hungry for causal stories finds nothing to chew on: How does the number of Green and Blue cabs in the city cause this cab driver to hit and run?

In the second version, in contrast, the drivers of Green cabs cause more than 5 times as many accidents as the Blue cabs do. The conclusion is immediate: the Green drivers must be a collection of reckless madmen! You have now formed a stereotype of Green recklessness, which you apply to unknown individual drivers in the company. The stereotype is easily fitted into a causal story, because recklessness is a causally relevant fact about individual cab drivers.”

We can deal with this problem of logic by remembering to refer to base rates. Hearing about something in the news or in one case doesn’t mean that it is prevalent or true across the board.

Hindsight bias

This is a trap that litigators fall into all the time. It’s always easier to look back at something that went wrong and say, “I knew that was going to happen!”.

This creates particular problems for medical practitioners or regulators, who are never praised for non-events but roundly condemned when something goes wrong. High profile cases tend to involve stories about how an action or failure to act resulted in a loss of life.

While medical malpractice or lack of enforcement can be genuine issues—we should be wary about making up stories, especially about our ability to predict causal outcomes. TFS notes that:

“A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

Kahneman talks about studies that looked into what happens when a person changes their mind about a topic, such as the death penalty. When people were exposed to persuasive messages, their attitudes usually shifted closer to the message they heard. Strangely enough, they then found it difficult to recall their former opinions on the topic. This is what happens:

“Asked to reconstruct their former beliefs, people retrieve their current ones instead—an instance of substitution—and many cannot believe that they ever felt differently.”

Shocking, isn’t it?

This has implications for how experts might testify in court about the probability of events (or the reasonableness of certain conduct), especially when they have the benefit of hindsight.

Stay tuned for Part 3, the final instalment on, well… being human!

Image credit // Gaelle Marcel
