Why do people make high-risk decisions even when they know the odds are against them?

Psychological scientists have been interested in how people make decisions for several decades, but philosophers and economists have been studying decision making for centuries.

The most famous scholarly treatment of decision making when all the facts aren’t on hand is that of Blaise Pascal.

In 1670, in his Pensées, the French philosopher articulated what was, in his time, a pretty profound dilemma for rational people: to believe or not believe in the existence of God.

Pascal reasoned it out this way: If God exists, belief in Him will mean eternal salvation.

If He doesn’t exist, Pascal said, one loses nothing by believing. So the choice was clear: Believing is the safest bet.

And if you don’t believe, you should pretend to believe, because in so doing you might come around to genuine belief in time.

Pascal’s famous wager is the first formulation of what in the study of decisions came to be known as the theory of expected value:

When faced with a choice between uncertain alternatives, you should determine the positive or negative value of every possible outcome along with its probability, multiply the two, and choose the option that yields the highest total.
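
As a concrete illustration, the rule amounts to a one-line calculation; the gambles below are invented purely to show the arithmetic:

```python
# Expected value: sum of (probability * value) over all possible outcomes.
# The two options below are hypothetical, chosen only to illustrate the rule.

def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Option A: a sure gain of $50.
option_a = [(1.0, 50)]
# Option B: a 10% chance of $600, otherwise nothing.
option_b = [(0.10, 600), (0.90, 0)]

print(expected_value(option_a))  # 50.0
print(expected_value(option_b))  # 60.0 -> the rule says take option B
```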

It sounds simple, but choices in the real world are seldom that cut-and-dried. Expected value was given more nuance by Daniel Bernoulli in 1738 with his theory of expected utility.

Along with the values and probabilities of different uncertain outcomes, the Dutch-Swiss mathematician noted, there are two individual factors that would also be taken into account by any rational decision maker — his or her comfort with or aversion to risk, and the utility of a given payoff depending on his or her preferences or needs.

Value, in other words, isn’t an absolute. For example, a small monetary gain would be of greater utility to a poor person than to a rich person, and thus their decisions in a gamble could be entirely different but equally rational.
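
A minimal sketch of Bernoulli’s point, using a logarithmic utility of total wealth (a standard textbook choice, not something the article specifies): the same $100 gain is worth far more, in utility terms, to a poor person than to a rich one.

```python
import math

def utility(wealth):
    # Logarithmic utility: a common diminishing-returns assumption,
    # often attributed to Bernoulli himself.
    return math.log(wealth)

for wealth in (1_000, 1_000_000):
    gain_in_utility = utility(wealth + 100) - utility(wealth)
    print(f"wealth ${wealth:>9,}: utility gain from +$100 = {gain_in_utility:.6f}")

# Under this assumed utility function, the $100 is worth roughly a
# thousand times more to the person starting with $1,000 than to the millionaire.
```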

From Economics to Psychological Science

Because decisions about resources and wealth loom so large in social behavior, the science of decision making was historically the province of economists.

And the basic assumption of economists was always that, when it comes to money, people are essentially rational.

It was largely inconceivable that people would make decisions that go against their own interests.

Although successive refinements of expected-utility theory made room for individual differences in how probabilities were estimated, the on-the-surface irrational economic behavior of groups and individuals could always be forced to fit some rigid, rational calculation.

The problem is — and everything from fluctuations in the stock market to decisions between saving for retirement or purchasing a lottery ticket or a shirt on the sale rack shows it — people just aren’t rational. They systematically make choices that go against what an economist would predict or advocate.

Enter a pair of psychological scientists — Daniel Kahneman (currently a professor emeritus at Princeton) and Amos Tversky — who in the 1970s turned the economists’ rational theories on their heads.

Kahneman and Tversky’s research on heuristics and biases and their Nobel Prize-winning contribution, prospect theory, poured real, irrational, only-human behavior into the calculations, enabling much more powerful prediction of how individuals really choose between risky options.

One keystone of prospect theory is loss aversion, or the discovery (based on numerous experiments reported in a classic article in the journal Econometrica) that winning $100 is only about half as appealing as losing $100 is unappealing.

The idea that the relationship between value and losses/gains is nonlinear — or put more simply, that “losses loom larger than gains” — is important for decisions involving risks, and opens the door for framing effects, in which the context and phrasing of a problem can influence a person’s choice.
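
A rough sketch of the kind of value function prospect theory describes; the curvature and loss-aversion parameters below are commonly cited estimates from Kahneman and Tversky’s later work and are illustrative only, though a loss-aversion factor of roughly 2 matches the “$100” claim above.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory-style value function: concave for gains, convex and
    steeper for losses ("losses loom larger than gains")."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

print(prospect_value(100))    # ~57.5: subjective value of a $100 gain
print(prospect_value(-100))   # ~-129.5: a $100 loss hurts roughly twice as much
```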

Something as simple as whether a problem is phrased in terms of winning or losing can radically affect our decisions.

In one of their studies, Kahneman and Tversky presented two groups of participants with a choice involving hypothetical medical treatments for a deadly disease.

The first group was told that if a certain treatment was given to 600 people with the disease, 200 people’s lives would be saved; if they were given another, riskier treatment, there was a 1/3 chance that all 600 would be saved and a 2/3 chance of saving no one.

The second group was given the exact same choice, but it was framed in terms of lives lost instead of in terms of lives gained: The certain option meant 400 people would die for sure; the risky treatment meant a 1/3 chance no one would die and a 2/3 chance all 600 would die.

The majority of the first group chose the certain option: saving 200 people. The majority of the second group chose the risky option, gambling on the prevention of all the deaths even though it was only a 33% shot.2
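
Note that the two framings describe numerically identical options; a quick check of the expected outcomes makes that explicit:

```python
# Both framings of the disease problem have the same expected result.
certain_saved = 200
risky_saved   = (1/3) * 600 + (2/3) * 0    # = 200 expected lives saved

certain_dead  = 400
risky_dead    = (1/3) * 0 + (2/3) * 600    # = 400 expected deaths

print(certain_saved, risky_saved)   # 200 200.0
print(certain_dead, risky_dead)     # 400 400.0
```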

In short, the terms you use to present a problem strongly affect how people will choose between options when there are risks involved. As shown in the medical treatment problem, people may seek a sure solution if a problem is phrased in terms of gains, but will accept risk when a problem is phrased in terms of (potentially) averting a loss.

Taking Shortcuts

Real-world problems are often complicated — it’s tough to think objectively about all the variables, and often enough we just don’t know what the odds of different outcomes are.

Our brains are naturally wired to reduce that complexity by using mental shortcuts called heuristics. Kahneman and Tversky and other researchers have identified numerous ways humans simplify decisions via heuristics and the biases such shortcut-thinking can produce.

One important heuristic is known as representativeness — the tendency to ignore statistics and focus instead on stereotypes. For example, Steve is described by a former neighbor as a helpful but shy and withdrawn soul who loves structure and detail and takes little interest in people or the real world.

When faced with a list of possible occupations that includes farmer, salesman, pilot, doctor, and librarian, people tend to predict Steve is a librarian because he fits a commonly held stereotype.

They ignore what ought to be an obvious fact — that there are many, many more farmers in the world than there are librarians. Ignoring base rates, as well as other statistical blind spots like not paying attention to sample sizes and simple misconceptions concerning chance, can lead to serious errors in judgment.
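
A small, hypothetical illustration of why base rates matter (the counts and probabilities below are invented): even if librarians are far more likely than farmers to fit Steve’s description, the sheer number of farmers can cancel that advantage.

```python
# Invented numbers, purely to illustrate base-rate reasoning.
farmers, librarians = 2_000_000, 100_000                    # assumed base rates
p_fits_given_farmer, p_fits_given_librarian = 0.02, 0.40    # assumed stereotype fit

fitting_farmers    = farmers * p_fits_given_farmer          # 40,000 farmers fit
fitting_librarians = librarians * p_fits_given_librarian    # 40,000 librarians fit

p_librarian_given_fits = fitting_librarians / (fitting_farmers + fitting_librarians)
print(p_librarian_given_fits)   # 0.5 -- no better than a coin flip, despite the stereotype
```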

Fast and Slow

Those who study how people make decisions often draw a distinction between two types of mental processing used.

A fast, unconscious, often emotion-driven system that draws from personal experience is contrasted with a slower, more deliberative and analytical system that rationally balances benefits against costs among all available information.

The fast, gut-level way of deciding is thought to have evolved earlier and to be the system that relies most on heuristics.

It is this system that produces biases.

A 2004 study by Vassar biopsychologist Abigail A. Baird and Univ. of Waterloo cognitive psychologist Jonathan A. Fugelsang showed that this gist-based system matures later than do other systems.

People of different ages were asked to respond quickly to easy, risk-related questions such as “Is it a good idea to set your hair on fire?”, “Is it a good idea to drink Drano?”, and “Is it a good idea to swim with sharks?”

They found that young people took about a sixth of a second longer than adults to arrive at the obvious answers (it’s “no” in all three cases, in case you were having trouble deciding).1

The fact that our gist-processing centers don’t fully mature until the 20s in most people may help explain the poor, risky choices younger, less experienced decision makers commonly make.

Adolescents decide to drive fast, have unprotected sex, use drugs, drink, or smoke not simply on impulse but also because their young brains get bogged down in calculating odds.

Youth are bombarded by warning statistics intended to set them straight, but because the objective risks of many dangerous activities are small, sometimes smaller than teens had initially estimated, those statistics may actually encourage young people to take the risks rather than avoid them.

Adults, in contrast, make their choices more like expert doctors: going with their guts and making an immediate black/white judgment.

They just say no to risky activities because, however objectively unlikely the risks are, there’s too much at stake to warrant even considering them.

A New Study

Now picture yourself at a Las Vegas poker table, holding a bad hand, one with a very low chance of winning.

Even so, the sight of the large stack of chips that piled up during a recent lucky streak nudges you to place a large bet anyway.

Why do people make high-risk decisions—not only in casinos, but also in other aspects of their lives—even when they know the odds are stacked against them?

A team led by a Johns Hopkins biomedical engineer has found that the decision to “up the ante” even in the face of long odds is the result of an internal bias that adds up over time and involves a “push-pull” dynamic between the brain’s two hemispheres.

Whether you are suffering from a losing streak or riding a wave of wins, your cumulative feelings from each preceding hand all contribute to this nudge factor, they say.

A paper on the study is to be published online the week of Jan. 7 by the journal Proceedings of the National Academy of Sciences.

Insights from the research have the potential to shed light on how soldiers in high-risk combat situations make decisions and to facilitate more effective brain training to change or “rewire” long-term behavior or habits, the researchers suggest.

“What we learned is that there is a bias that develops over time that may make people view risk differently,” said senior author Sridevi Sarma, a biomedical engineering professor at the Johns Hopkins University Whiting School of Engineering and member of its Institute for Computational Medicine.

Pierre Sacré, a postdoctoral fellow at Johns Hopkins, co-led the study.

Sarma’s group sought to understand why people tend to take risks even when the odds are against them or avoid risk even when the odds are favorable.

They also wanted to learn where in the human brain such behavior originates.

They asked patients at the Cleveland Clinic’s Epilepsy Monitoring Unit to play a simple card game involving risk taking.

The patients had undergone stereoelectroencephalography, a procedure in which doctors implanted multiple deep-seated electrodes in their brains so that they could locate the source of seizures for future surgical treatment.

Each of these depth electrodes has 10 to 16 channels that record voltage signals from the neurons surrounding it.

The electrodes also allowed Sarma and her team an intimate look at the patients’ brains in real time, as they made decisions while gambling against a computer in a card game.

The game was simple: The computer had an infinite deck of cards with only five different values—2, 4, 6, 8, and 10—each of which was equally likely to be dealt. Following every round, the cards went back into the deck, leaving odds unchanged.

Participants were shown two cards on a computer screen, one faceup and the other facedown.

(The faceup card was the player’s, and the facedown card was the computer’s.)

Participants were asked to bet low ($5) or high ($20) that their card had a higher value than the computer’s facedown one.
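
Under the rules as described (five equally likely values, redrawn with replacement each round), the win and loss probabilities for each faceup card are easy to tabulate; how ties were scored isn’t stated in the article, so this sketch simply leaves them out of both counts:

```python
from fractions import Fraction

values = [2, 4, 6, 8, 10]          # the five equally likely card values

for my_card in values:
    p_win  = Fraction(sum(c < my_card for c in values), len(values))
    p_lose = Fraction(sum(c > my_card for c in values), len(values))
    print(f"faceup {my_card:>2}: P(win) = {p_win}, P(lose) = {p_lose}")

# faceup  2: P(win) = 0,   P(lose) = 4/5
# faceup  6: P(win) = 2/5, P(lose) = 2/5  -> a toss-up, which is where bias shows up
# faceup 10: P(win) = 4/5, P(lose) = 0
```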

When dealt a 2, 4, 8, or 10, participants bet quickly and instinctively, the research team found. When dealt a 6, however, they wavered and were nudged into betting higher or lower depending on their bias—even though the chances of picking a higher or lower card were the same as before.

In other words, participants’ betting behavior was based on how they fared on past bets even though those results had no bearing on the outcome of the new bets.

On examining neural signals recorded during all four stages of the game, Sarma’s team found a predominance of high-frequency gamma brain waves.

They were even able to localize these signals to particular structures in the brain. It turns out that these regions—excluding any implicated in drug-resistant epilepsy—were associated positively or negatively with risk-taking behavior.

“When your right brain has high-frequency activity and you get a gamble, you’re pushed to take more of a risk,” said Sacré, who expressed surprise at the symmetry of the patients’ brain reactions under these conditions.

“But if the left side has high-frequency activity, it’s pulling you away from taking a risk. We call this a push-pull system.”

To assess that internal bias, the researchers developed a mathematical equation that successfully calculated each patient’s bias using only their past wagers.

“We found that if you actually solve for what this looks like over time, the players are accumulating all the past card values and all the past outcomes, but with a fading memory,” Sarma says.

“In other words, what happened most recently weighs on a person more than older events do. This means that based on the history of a participant’s bets, we can predict how that person is feeling as they gamble.”
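
The article doesn’t reproduce the team’s equation, but Sarma’s description of accumulating all past card values and outcomes “with a fading memory” is the shape of an exponentially discounted running sum; here is a minimal sketch of that idea, with an invented decay rate:

```python
def fading_memory_bias(history, decay=0.8):
    """Accumulate past outcomes with exponentially fading weight.

    history: past outcomes, oldest first (e.g., +1 for a win, -1 for a loss,
             or signed winnings in dollars).
    decay:   how much weight older events keep each round (invented value;
             the study's fitted parameters are not given in the article).
    """
    bias = 0.0
    for outcome in history:
        bias = decay * bias + outcome   # the most recent outcomes weigh most
    return bias

# A recent losing streak outweighs older wins:
print(fading_memory_bias([+1, +1, +1, -1, -1]))   # negative-leaning bias
```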

More information: Pierre Sacré et al., “Risk-taking bias in human decision-making is encoded via a right–left brain push–pull system,” PNAS (2018). www.pnas.org/cgi/doi/10.1073/pnas.1811259115
