Harvard Business Review
Stop Trusting Your Instincts
Flying on a clear day, a pilot can see everything out of the cockpit: the visual cues match the instrument readings, so there is no ambiguity about what the plane is doing. Piloting at night is completely different: you are flying blind, entirely dependent on the instrument readings. In fact, you should rely exclusively on the instruments, because your internal instincts will lead you astray. Without visual cues, our brains try to make sense of position and direction by relying on a sense of balance created by mechanisms in our inner ears. But our inner ears are easily fooled: in the darkness, a plane can feel as though it's ascending or descending when it's doing the opposite.
The inner sense of balance is such a strong instinct that a pilot will be tempted to disbelieve the instruments when they contradict gut feelings. Experienced pilots know to override their instincts, but new pilots can succumb to the strong pull of their intuitions, sometimes with tragic consequences.
John F. Kennedy Jr. was a competent pilot with limited nighttime flying experience when he departed for a short flight on a hazy Friday night in July 1999. Radar showed his plane ascending and descending repeatedly before it plunged into the ocean. Investigators determined that the probable cause of the crash was pilot error resulting from spatial disorientation. The tragic accident is a stark demonstration of what happens when we rely too heavily on our intuition in complex situations where our instinctual "gut feel" doesn't serve us well.
In the course of our daily lives, we humans are skilled daytime pilots. Our instincts help us navigate myriad straightforward challenges. The problem is that our modern world is characterized by increasing interconnectedness among all its parts, giving rise to new-world complexity, and our intuitions are often poorly adapted to complex problem-solving. We are now flying at night, and, like JFK Jr., sometimes on hazy nights where the horizon is difficult to make out and where panicked passengers (colleagues, bosses, boards, the public at large) are demanding we fix things instantly.
The temptation in these difficult situations is to rely on our ill-equipped intuitions, because the complex challenges that have exploded onto the scene in our recent evolutionary past are ones we are not trained for. Intuition (a.k.a. "blinking") derives its power from the expertise that comes with repeated practice, but we have not yet developed proficiency in managing complexity, which is why our intuitions about complex challenges are often deceptive.
Of the many ways our intuitions are mismatched with complexity, the most profound is the basic cause-effect model that underpins all of our thinking: our brains are always automatically searching for simple, single causes to explain all manner of things. Our intuition that every effect has a single cause is a survival-enhancing mental model that is well-suited to a world of straightforward problems. But applying this simple explanatory model to complexity can be just as misleading as relying on your inner ear to tell you where you are. Complex problems are characterized by multiple causal factors, all interacting with each other through intricate feedback loops. The only way to assess complexity is to look for multiple causes and examine how they influence one another (through, for example, systems thinking).
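To see concretely why single-cause reasoning breaks down under feedback, consider a minimal sketch in Python (a hypothetical toy model with made-up coefficients, not anything from the article): two factors that reinforce each other respond to the same one-time intervention very differently than either factor would on its own.

    # Toy model (hypothetical numbers): two factors that reinforce each other.
    # Each factor decays on its own, but each also lifts the other.

    def simulate(steps, coupling, boost=1.0):
        """Return the final level of 'quality' after a one-time boost."""
        quality, reputation = boost, 0.0
        for _ in range(steps):
            quality, reputation = (
                0.7 * quality + coupling * reputation,  # reputation feeds quality
                0.7 * reputation + coupling * quality,  # quality feeds reputation
            )
        return quality

    # Single-cause view: ignore the loop entirely (coupling = 0).
    print(f"no feedback:   {simulate(10, coupling=0.0):.3f}")  # ~0.03, the boost fades
    # Systems view: the identical boost, routed through the loop.
    print(f"with feedback: {simulate(10, coupling=0.4):.3f}")  # ~1.30, the boost compounds

The numbers are arbitrary; the point is that an identical intervention yields a qualitatively different outcome once the feedback loop is included, which is exactly what a single-cause model misses.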
The temptation to force-fit the single cause-effect model onto complex systems is seductive for two reasons: i) our limited working memory is hard-pressed to juggle multiple interacting factors at once; and ii) the single cause-effect model works extremely well for the multitude of straightforward decisions we make every minute. So we insist that our politicians provide us with simple cause-effect explanations and implement simple cause-effect solutions. And we expect leaders of all kinds to have quick, easy solutions for all problems.
The financial meltdown of 2008 is a quintessential example of how oversimplified our assessment of a complex problem can be. We gravitate toward blaming single individuals (Greenspan, Bernanke, Bush) or institutions (investment banks, mortgage lenders, rating agencies). But the shrewd financiers who avoided, or even profited from, the crash were not focused on single elements. They analyzed the whole system: the web of interacting causes that collectively constituted an unsustainable economic situation. Likewise, the fix lies in addressing multiple elements within the system.
A smaller-scale example is the never-ending quest of businesses to improve service quality, where the tendency is towards simple solutions like "improve staff training" or "incent customer-friendly behavior." While these initiatives have merit, the perception of service quality, especially as measured by customer surveys, is influenced by a host of interacting factors such as the company's advertising, degree of positive media attention, visibility and charisma of the CEO, product selection, and many other elements that reinforce one another. Only a multi-pronged approach fosters the kind of system that garners high service ratings: there is no silver bullet.
Many modern-day challenges are not reducible to simple cause-effect equations. Our only hope in dealing with complexity is to override our intuitions with more sophisticated ways of interpreting the world. Complex problems require complex thinking; complex thinking is the more difficult path, but it is the more productive one.
Ted Cadsby
Ted Cadsby is a corporate director, principal of TRC Consulting, former executive vice-president of the Canadian Imperial Bank of Commerce, and author of two books on investing.
********************************************************
http://dreamlearndobecome.blogspot.com This posting was made by Jim Jacobs, President & CEO of Jacobs Executive Advisors. Jim also serves as Leader of Jacobs Advisors' Insurance Practice.