Hunches Rule Us -- and Fool Us

It’s just a gut feeling, but I think this is a two-front war. I mean the raging war about whether to go to war.

Most of the attention is absorbed by the evidentiary case, such as it is. The Economist magazine summed it up with a courtroom headline: “The Burden of Proof.”

The other front in the war, though, isn’t about proof. It’s where this whole showdown began: down at the level of instinct, or intuition.

In Bob Woodward’s book “Bush at War,” the president repeats himself in public and in private. He describes precisely what I take to be the truth about his determination to confront Saddam Hussein: “I’m not a textbook player. I’m a gut player.”

The facts about Iraq? All this talk of smoking guns, all these claims and counterclaims, this dance of inspections, all of this “Judging the Case,” as Newsweek put it, may in fact be missing the primary point.

After hundreds of interviews with people in the know and four hours speaking with the president, Woodward concluded this: “It’s pretty clear that Bush’s role as a politician, president and commander in chief is driven by a secular faith in his instinct -- his natural and spontaneous conclusions and judgments. His instincts are almost his second religion.”

The rest of us, it turns out, are not much different.

Intuition, our capacity for knowledge or insight apart from observation or reason, is “a big part of human decision making,” explains David G. Myers, a psychologist at Hope College and author of the 2002 book “Intuition: Its Powers and Perils.”

A sign in Einstein’s office put it this way: “Not everything that can be counted counts, and not everything that counts can be counted.”

Thus, it’s intuition, not fact, that drives commentators with minimal military experience to predict the catastrophic consequences of house-to-house combat in Baghdad. Or the converse: that Iraqis will greet their liberators with gratitude and that democracy in the Mideast needs only an open patch of desert in which to take root.

I put a call in to Myers. Having analyzed the best that science can tell us about intuition, he offered some sobering thoughts:

“When tasks are challenging, people are usually more confident than correct.... Moreover, the most confident people tend to be the most overconfident.... Our judgments are better than chance but generally not as good as we think.... Intuition can dangerously mislead us.”

Of course, intuition also transforms the audacious into history’s heroes -- when they’re right. There’s always the chance of being flat wrong.

Standing on the Civil War battlefield at Spotsylvania in 1864, Union Gen. John Sedgwick is said to have remarked about the massing Confederate troops: “They couldn’t hit an elephant at this dist.... “

The decision to launch a war invariably rests on the aerobatics of flying by the seat of our pants. In his book “The Lost Japan,” journalist Hasegawa Nyosekan said this about Pearl Harbor: “The war was started as the result of a mistaken intuitive calculation which transcended mathematics. We believed with blind fervor that we could triumph.”

Those who have studied intuition at work note this: Once it kicks in, people tend to sustain their views by seeking out supportive information -- which may explain why the reports from weapons inspectors in Iraq have served only to harden preconceived views.

An awareness of how we make our decisions, Myers argues, serves to keep our confidence in touch with reality. “Failing to appreciate our potential for error when making business, political or military decisions can have devastating consequences,” he says.

Columbia University psychologist Janet Metcalfe offered these cautionary words in a 1998 article for the Personality and Social Psychology Review:

“People think they will be able to solve problems when they won’t; they are highly confident that they are on the verge of producing the correct answer when they are, in fact, about to produce a mistake; they think they have solved problems when they haven’t; they think they know the answers to information questions when they don’t; they think they have the answer on the tip of their tongue when there is no answer; they think they produced the correct answer when they didn’t, and furthermore they say they knew it all along; they believe they have mastered learning material when they haven’t; they think they have understood, though demonstrably they are still in the dark.”
