How much do you depend on trust?
by Gerd Gigerenzer
in conversation with Bennett Voyles
Psychologist Gerd Gigerenzer has some radical views on risk, error culture and getting back to the basics of decision-making. Here he offers a simple way to tackle complexity: trust your instincts.
After decades of studying how people think about risk, I’ve learned a lot about why people make good and bad decisions. But unlike Daniel Kahneman, Richard Thaler and the behavioral economists who have covered some of this same ground and concluded that human beings are doomed to keep repeating their mistakes, I’ve reached a more positive conclusion.
The truth is that left to our own devices, we actually make a lot of good choices. Where we go wrong is typically either because of social pressure or simple risk illiteracy – and both those conditions can be cured.
Most suboptimal decisions begin with the error culture of our institutions. In business, for instance, most corporations I know have a negative error culture: They believe errors should never occur; if errors do occur, they should be swept under the carpet; and if they can’t be hidden, someone must be blamed.
This leads to defensive decision-making. You might believe that Option A would be the best for the company, but instead you’ll look for a safer option: not the one that’s best for the company, but the one that protects you from being blamed or fired. This seems to be very common: I have analyzed several large international companies and found that roughly every second or third decision is defensive, and that is by self-report, so the true figures are probably higher.
Companies with positive error cultures tend to make fewer mistakes in the end because they have a system where the many can learn from the few. Thanks to their positive error culture, airlines, for instance, now have next to no crashes. You only need to drive 12 miles to equal the risk of 1,000 miles on a commercial jet.
Contrast that to the negative error culture that still prevails in the medical profession, where mistakes are never permissible and generally dangerous to acknowledge. In Germany alone, an estimated 19,000 patients are killed by preventable errors every year; in the US, the number reported exceeds 100,000.
It’s a shocking statistic. A student of mine, a Lufthansa captain, recently wrote a dissertation comparing error cultures in commercial aviation and in hospitals. At our first meeting with the head of a clinic and Lufthansa’s head of safety, the safety officer told the clinic head: “If we had an error culture like yours, we would crash a plane every day.”
How do you build a positive error culture? Taking away the sense of shame is the key. In a positive error culture, the key question is not who is guilty, but how we can change the system so that the error is less likely to happen again.
The airlines do it by working with checklists that reduce the chances of missing a step. They also have a critical incident reporting system that lets pilots report near misses, so that measures can be added to further reduce the risk.
Good role models at the top also help a lot. If you’re a top executive who was involved in a decision that didn’t work out, the way to begin to change your culture is to discuss it with your team and say: “Look, I was part of that decision. We know it was a bad decision. Let’s think about how we got there.”
There are a lot of other things you can do as well to reassure people. At one company we worked with, managers issued Monopoly-style “Get Out of Jail Free” cards with an explicit instruction: if you took a risk for your company and it failed, just turn in the card and no questions will be asked. That changed the entire culture, because now, if you haven’t turned in your card for three years, there is a big question mark over you. Playing safe is no longer safe.
But organizations can be very resistant to such deep-seated cultural change. Even when lives are on the line, people may not budge: US hospital executives have known for more than a decade that two thirds of the estimated 29,000 people a year who die of catheter infections could be saved by following a simple five-point hygiene checklist, yet only a few hospitals have adopted it.
Another key to better decision-making – either at the organizational or the individual level – is to focus on heuristics. Much of the information we get is noise, and noise can be disorienting. Often, a simple rule of thumb can reduce the complexity of the choice and lead to a better decision. In the last financial crisis, for instance, the complex calculations of the rating agencies were not the solution, but rather part of the problem.
Understanding the numbers you read can also lead to better decisions. Risk illiteracy is endemic in our society, and not just among the general public: In our studies, we find that 70-80% of doctors do not understand health statistics. It’s not that they are stupid, they just have no training. Most medical schools don’t teach thinking: They mainly teach learning by heart. As a result, many doctors don’t know the scientific evidence in their field and can be steered – or nudged – in any direction.
Director at the Max Planck Institute for Human Development and at the Harding Center for Risk Literacy in Berlin, Gerd Gigerenzer is also the author of "Simply Rational: Decision Making in the Real World" and "Risk Savvy: How to Make Good Decisions".
The World Health Organization has recently warned us, for example, that for every 50 grams of sausage or processed meat we eat on a daily basis, our risk of getting colon cancer increases by 18%. So you might think: no more sausage, because 18 out of every 100 sausage eaters will get colon cancer. No. The WHO’s number is a relative risk: Eating sausage raises the chance of getting colon cancer (not dying from it, but getting it) from about 5% to a little less than 6%.
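To make the arithmetic concrete, here is a minimal calculation, assuming the roughly 5% baseline lifetime risk mentioned above; treating that baseline as exact is a simplifying assumption for illustration.

```python
# Relative vs. absolute risk: the 18% increase applies to the baseline,
# not to the whole population. The 5% baseline is taken from the text;
# treating it as exact is a simplifying assumption.
baseline_risk = 0.05        # absolute lifetime risk of colon cancer (~5%)
relative_increase = 0.18    # WHO: +18% relative risk per 50 g/day of processed meat

absolute_risk = baseline_risk * (1 + relative_increase)
print(f"Risk rises from {baseline_risk:.1%} to {absolute_risk:.1%}")
# Output: Risk rises from 5.0% to 5.9%
# That is less than one extra case per 100 people, not the 18 per 100
# a reader might infer from the headline number.
```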
But attitudes are changing. People in many fields are now looking for clearer information and simple heuristics that make it easier to act on that information. For instance, my team is working with Andy Haldane, the chief economist of the Bank of England, to test simple heuristics: fast-and-frugal, three-question decision trees to estimate bank volatility and other solvency factors. We find that just a few measures can often do a better job of assessing risk than a 50-factor algorithm.
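To show the structure of such a tree (this is not the Bank of England’s actual model), here is a minimal sketch; the three indicators and all thresholds are hypothetical, chosen only to illustrate how a fast-and-frugal tree asks one question at a time and can classify after any of them.

```python
# A fast-and-frugal tree: each question has one immediate exit, so a
# classification can be reached after one, two or three checks.
# Indicators and thresholds are hypothetical illustrations, not the
# Bank of England's model.

def bank_is_fragile(leverage_ratio: float,
                    wholesale_funding_share: float,
                    capital_growth: float) -> bool:
    # Q1: Is the bank thinly capitalized? (hypothetical 4% cutoff)
    if leverage_ratio < 0.04:
        return True
    # Q2: Does it depend heavily on flighty wholesale funding? (hypothetical 50% cutoff)
    if wholesale_funding_share > 0.50:
        return True
    # Q3: Is its capital base shrinking?
    if capital_growth < 0.0:
        return True
    # Passed all three checks: classify as robust.
    return False

# A bank with 3% leverage is flagged at the very first question.
print(bank_is_fragile(0.03, 0.20, 0.05))  # True
```

The appeal of such a tree is transparency: a supervisor can see exactly why a bank was flagged, which a 50-factor algorithm rarely allows.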
Finally, we encourage people to trust their gut instincts more. Recommending heuristics and intuition at once might sound contradictory, but the two are closely related: an intuition is often an unconscious heuristic at work. Studies show that people who have many years of experience with a certain type of problem can trust their gut more than newcomers to the field. Intuition is a form of unconscious intelligence that lets them feel what is right faster than they can explain it.
The 17th century gave us a probabilistic revolution. But more of the same can’t help us now, not in the many uncertain situations we face. We need a heuristics revolution in health, in wealth and in risk literacy, or else we will not be able to sustain an informed citizenry in a functional democracy. We want to use algorithms, not be used by algorithms.