Thinking, Fast and Slow for Lawyers (Part 1)

10 mins

Lawyers often believe that they exist on a higher cerebral plane than other humans. Thinking, Fast and Slow by Daniel Kahneman puts all our fanciful thinking into the garbage can. Here’s a quick summary of what lawyers can learn from his book.

Remember this monkey?

It has its own associated Wikipedia page, called the “Monkey selfie copyright dispute”. Possibly one of the few instances when cute quickly became… court litigation.

Our central hero, a macaque monkey in Indonesia, took a few happy snaps of itself with a camera belonging to photographer David Slater. In a rather bizarre confluence of litigation involving Slater and, later, the organisation PETA, the question to be answered was: who owned the copyright to the photographs?

Without going into extensive detail, the 9th U.S. Circuit Court of Appeals in San Francisco concluded that “… this monkey—and all animals, since they are not human—lacks statutory standing under the Copyright Act”. In other words, animals can’t be legally entitled to copyright, at least according to a US court.

It’s odd cases like these which make the news, and they point to how laws are, on the whole, made with humans in mind. We tend to place ourselves above other animals, which makes one think: are we really the paragons of reason we perceive ourselves to be?

One book puts all that into doubt.

No monkeying around these humans?

Not too long ago, I wrote about why it takes so much longer to complete a task than we think it should, or why we always blow our budgets. The culprit? A human perception bias, the “planning fallacy”, which Daniel Kahneman covers in his book Thinking, Fast and Slow (TFS).

Reading TFS is like getting run over by a bus (by analogy only, not from experience). It reveals plenty of uncomfortable truths from cognitive and social psychology about what it means to think like a human. We also get to see through the eyes of a Nobel Prize winner (Kahneman), and yes, it’s a hearty book. You really need to have your wits about you.

As a lawyer (and yes, we’re human, though some of us don’t sleep), you do question yourself after absorbing TFS. This is because you realise how difficult it is to fight these biases, which are, on the whole, hardwired into our brains.

Unlike the million and one self-help books out there, TFS offers more observations than easy solutions. It can be frustrating to find out that there aren’t terribly many ways of dealing with some of our in-built biases. TFS also reveals truths about the human brain, which, like many things, can be used for good or evil.

I promised to write a summary of TFS for lawyers.

So with much (or little) anticipation, here’s Part 1.

Basic concepts

Before we kick things off, you need to know these two concepts:

  • System 1 and System 2 thinking
  • WYSIATI = “what you see is all there is”.

The concept of “System 1” and “System 2” thinking describes how our brain interprets the world. Elsewhere, you might have seen the concept of the elephant and its rider.

System 1 (the elephant) is fast, intuitive thinking. It provides immediate fight-or-flight reactions and likes jumping to conclusions. One example is our first impressions of people—and whether to trust them or not. In other words, it’s thinking with our “gut”.

System 2 (the rider) does the slow, deliberative thinking. It monitors System 1 and attempts to remain in control as best it can within its limited resources. Unfortunately, System 2 depletes quickly when we’re under a lot of mental strain, such as when we overwork or don’t get enough sleep.

WYSIATI is an aspect of System 1 and appears a great deal in TFS. It explains how humans tend to focus on what they can see, while ignoring other possibilities. The best way of explaining WYSIATI is the saying “out of sight, out of mind”. It causes all sorts of issues, including how we tend to neglect ambiguity and suppress doubt to create a believable story about our world.

System 1 is not all bad, of course. It helps us navigate the world with minimal use of brainpower, and it’s often correct. On the other hand, System 2 thinking is terribly resource-hungry. Can you imagine how overwhelmed you’d be if the simple act of walking down a busy street involved consciously processing every sight and sound? Or if you had to chart your route hours beforehand so that you don’t run into people? Thankfully, System 1 handles all this without too much effort.

Crossing the busiest intersection in Shibuya – Tokyo, Japan = System 2 processor fail.

Can you math?

Let’s test the waters with a little quiz.

Because System 2 thinking is so difficult, we tend to fall back on System 1 thinking when we’re tired or simply feeling lazy. And it doesn’t matter if you’re cerebrally well-endowed.

In one experiment, thousands of university students were asked this question, which you should also try to answer:

A bat and ball cost $1.10.
The bat costs one dollar more than the ball.
How much does the ball cost?

More than 50% of students at Harvard, MIT and Princeton gave the intuitive System 1 answer: 10¢

The correct answer? 5¢

(Lawyers, rejoice. We’re not the only ones terrible at math.)
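For the System 2 thinkers who want to check the algebra, here’s a quick sketch (in Python; the variable names are my own, purely illustrative):

```python
# Work in cents to avoid floating-point surprises.
# Let the ball cost x. The bat costs x + 100, and together they cost 110:
#   x + (x + 100) = 110  =>  2x = 10  =>  x = 5
total = 110       # bat + ball, in cents
difference = 100  # the bat costs this much more than the ball
ball = (total - difference) // 2   # 5 cents
bat = ball + difference            # 105 cents ($1.05)
print(ball, bat)  # 5 105
```

Note how the intuitive 10¢ answer fails the check: a 10¢ ball implies a $1.10 bat, for a total of $1.20.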

Cognitive ease (aka. our lazy brain)

By now, you’ve figured out that we have a lazy brain. To make life slightly more complicated, TFS describes an experiment on how our brain chooses to believe or “unbelieve” statements. The theory goes like this:

“Understanding a statement must begin with an attempt to believe it.”

The initial attempt to believe is a function of System 1.

System 2 then kicks in to work out whether it is reasonable to believe that the statement is true. If not, then it works to “unbelieve” the statement.

In the experiment, researchers gave participants a series of statements, including nonsensical ones like “a dinca is a flame”. Then, they asked the participants if they thought the statements were true or false. In one part of the experiment, the researchers disrupted participants by requiring that they hold digits in their memory during the task. The effect of the disruption was that participants ended up thinking that many of the false statements were true—System 2 (under cognitive load) simply didn’t have the capacity to unbelieve the statements.

Kahneman summarises it best:

“… when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy… there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.”

Our lazy brain, in other words, is highly suggestible. TFS is packed with examples of this.

Size 12 font, please!

(Oh, can I use Wingdings?)

If you operate around the courts, you know that judges HATE small font. Like everyone, they know that you play around with the margins to fit more words within the page limit. Cheeky, cheeky! But small or irregular fonts are a real pet peeve.

TFS tells us that persuasion, beyond a truthful message (which is what you should always opt for), is about making your reader as cognitively comfortable as possible. Based on experiments, you’re going to be more convincing if you:

  • use high-quality paper to maximise the contrast between the characters and their background;
  • use bright blue or red text (not washed-out shades of green, yellow or pale blue);
  • use simple language (yay) over complex language (nay);
  • make it memorable.

Of course, there is a limit to the last point; the concept is well understood by advertisers, but lawyers can’t exactly magic memorable material out of nothing. The other tips are otherwise pretty helpful.

Choose vivid over abstract (sometimes)

Lawyers with jury trials, pay attention.

TFS tells us that our brain latches on to vivid detail rather than boring statistics that feel abstract to us. The same goes for how juries assess likelihood. Again, TFS summarises it better:

“A good attorney who wishes to cast doubt on DNA evidence will not tell the jury that ‘the chance of a false match is 0.1%.’ The statement that ‘a false match occurs in 1 of 1,000 capital cases’ is far more likely to pass the threshold of reasonable doubt. The jurors hearing those words are invited to generate the image of the man who sits before them in the courtroom being wrongly convicted because of flawed DNA evidence. The prosecutor, of course, will favor the more abstract frame—hoping to fill the jurors’ minds with decimal points.”

Beware the priming effect

One aspect of System 1 is how it can be influenced by an association of ideas. In other words, depending on what we have at the forefront of our minds, we can be influenced into answering a question in a certain way. WYSIATI!

A simple example is this:

“If you have recently seen or heard the word EAT, you are temporarily more likely to complete the word fragment SO_P as SOUP than as SOAP. The opposite would happen, of course, if you had just seen WASH.”

There are other, weirder examples. In one experiment, young people who had to unscramble sentences containing words associated with the elderly, like “forgetful”, “bald”, “gray” or “wrinkle”, afterwards walked more slowly down the hallway than others.

How odd!

In the legal world, there are some measures that blunt the priming effect, like the prohibition on asking leading questions during examination-in-chief. However, given how subtle the priming effect can be, we should be thinking a lot more about how the order of our questions (or what we ask) can dramatically change the answer to the next question.

Are you answering the right question?

When we’re faced with a difficult question, TFS tells us that we need to be aware of “substitution”: swapping a difficult question for an easier one, often without noticing that we’ve done so.

In a brilliant experiment, a group of German students were asked two questions:

  • How happy are you these days?
  • How many dates did you have last month?

Asked in this order, the researchers found no correlation between reported happiness and the number of dates.

Reverse the order and… voila, a correlation appeared. When students were first asked to think about their dating life, it elicited an emotional response, which in turn framed the way they thought about their happiness. Of course, happiness in the broad sense involves far more than the number of dates one went on in the past month. However, WYSIATI!

I don’t have to tell you that, if you knew enough about a witness, you could elicit a different response depending on how you frame the preceding questions.

Ahoy, watch out for anchors!

A close cousin of the priming effect is anchoring.

Anchoring is the tendency of your mind to fixate on a figure, especially in the realm of unknown quantities. TFS brings up the example of how the asking price for a house tends to plant an anchor in participants’ minds, so that they negotiate around that figure. Another example: ask whether Gandhi was more or less than 144 years old when he died, and you’ll get higher estimates of his age than if you had phrased it as “was he more or less than 35 years old?”.

This obviously has implications both in the world of litigation, as well as contract negotiations.

In litigation, anchoring can be deliberate, as when a judge asks for comparable prior cases on matters of criminal sentencing or damages for personal injury. We don’t think too much about these practices, but TFS brought up an example which gave me the heebie-jeebies. (Before you read on, I have to say that I have no idea how they managed to run this experiment, because I can’t see the same thing happening in this jurisdiction.)

“The power of random anchors has been demonstrated in some unsettling ways. German judges with an average of more than fifteen years of experience on the bench first read a description of a woman who had been caught shoplifting, then rolled a pair of dice that were loaded so every roll resulted in either a 3 or a 9. As soon as the dice came to a stop, the judges were asked whether they would sentence the woman to a term in prison greater or lesser, in months, than the number showing on the dice. Finally the judges were instructed to specify the exact prison sentence they would give to the shoplifter. On average, those who had rolled a 9 said they would sentence her to 8 months; those who rolled a 3 said they would sentence her to 5 months…”

Another example that TFS raises is how capping damages in personal injury cases can eliminate large awards but also put an anchor in the minds of practitioners, and “pull up the size of many awards that would otherwise be much smaller”.

Here, TFS suggests some ways to combat the anchoring effect:

  • Have a strategy of deliberately “thinking the opposite” to avoid bringing up thoughts that directly reinforce the anchor.
  • When negotiating, “if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear—to yourself as well as to the other side—that you will not continue the negotiation with that number on the table.”
  • Alternatively, you can also search your memory for arguments against the anchor.

I found this TFS quip particularly amusing:

“The defendant’s lawyers put in a frivolous [case] reference in which they mentioned a ridiculously low amount of damages, and they got the judge anchored on it!”

Scrambled brain?

Stay tuned for Part 2.

Image credit (main) // Mathew Schwartz

Image credit (crossing) // Finan Akbar