Monday, February 22, 2010

I Would Gladly Pay You Tuesday for a Hamburger Today (Part 9 of Cognitive Biases)

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them!

This week, meet the Hawthorne effect (being watched makes you work harder), the herd instinct (tendency to follow the crowd), hindsight bias (I knew it all along), and hyperbolic discounting (I would gladly pay you Tuesday for a hamburger today.)

Part 7 has links to all the previous installments. Part 8 is here.


Hawthorne Effect

The Hawthorne effect is often portrayed as a sort of Heisenberg uncertainty principle for the social sciences: the observer interacts with the observed through the process of observation. In practice, almost any sort of internal improvement effort will have a short-term positive effect on performance, a placebo effect that benefits all of us in the management consulting world.

The original experiments on which the Hawthorne effect is based took place from 1924 to 1932 at the Hawthorne Works, a Western Electric plant outside of Chicago. A group of six women worked in a special room assembling telephone relays and dropping them down a chute. The most famous and oft-cited of those experiments involves a study of how illumination levels affected the rate at which the women dropped finished relays down the chute. Over a five-year period the researchers also changed pay rules, varied break frequency and duration, and shortened and lengthened the workday, all to the tune of the drip-drip-drip of falling relays.

There was, interestingly, no double blind in the experiments. The women were fully aware they were being studied, and even suggested some of the experiments themselves. The lack of control over the numerous variables has led to a wide range of interpretation about what — if, indeed, anything — the studies really mean.

Herd Instinct

Herd behavior was well known to exist in animals, but Friedrich Nietzsche was the first to use the concept of “herd instinct” as one more reason to have contempt for the human species. There’s nothing inherently wrong, however, with acting as part of a group. In many circumstances, the natural tendency of a group to move in the same direction can increase safety. Of course, sometimes herds head over the edge of the cliff.

As noted earlier, calling something a cognitive bias isn’t the same as calling a biased decision wrong or stupid. If a crowd is fleeing in a particular direction, it may be a false alarm, but then again, the crowd may know something you don’t. If danger doesn’t appear imminent, taking a few minutes to look around is a better way to balance your risks.

Hindsight Bias

Once you know how it turned out, a certain sense of inevitability creeps in. The signs were always there, and the people in charge should have known the truth all along.

The frequently repeated libel that FDR, for example, knew in advance about the impending Pearl Harbor attack and remained silent for political reasons is a case in point. (I won’t rehash the argument in detail, but I’m always appreciative of the Straight Dope’s accuracy and balance on almost any topic.) The argument relies on the idea that in the mass of raw data, decision-makers could have recognized in advance exactly which bits of information were salient. This is nonsense. Reading the future forward is orders of magnitude more difficult than reading it backward.

This particular bias is aided by our own tendency to believe, when we turn out to have been right, that we “knew it all along.” Before-and-after measures of certainty tend to vary a lot.

Hyperbolic Discounting

“I would gladly pay you Tuesday for a hamburger today.” Wimpy, the hamburger-loving pal of Popeye the Sailor Man, liked his rewards up front and his penalties delayed. People in general tend to prefer the bird in the hand to a flock in the bush. That’s a fairly well-known cognitive bias.

What’s not so well known is the amount of the discount — how much will you give up in the future to receive the benefit today? Behavioral economists believe the relationship is hyperbolic. We’ll take a dollar today in preference to three dollars tomorrow.

But given a choice between a dollar 365 days from now and three dollars 366 days from now, we’ll gladly wait the same extra day for three times the payoff. Our choices are inconsistent over time: we’ll commit our future self to a course of action (waiting a day) that we aren’t willing to follow today.

This is often irrational, but not always. Depending on the uncertainty of the reward, a definite dollar today may be preferable to the possibility of three dollars tomorrow.
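
To make the reversal concrete, here’s a minimal sketch in Python, assuming Mazur’s standard hyperbolic form V = A / (1 + kD); the discount rate k = 3.0 per day is purely illustrative, and real estimates vary widely from person to person and from reward to reward.

```python
def discounted_value(amount, delay_days, k=3.0):
    """Mazur's hyperbolic discounting: V = A / (1 + k * D).

    The rate k (per day) is purely illustrative, not an empirical estimate.
    """
    return amount / (1 + k * delay_days)

# Choice 1: a dollar today vs. three dollars tomorrow.
print(discounted_value(1, 0), discounted_value(3, 1))
# -> 1.0 vs 0.75: take the dollar now.

# Choice 2: a dollar in 365 days vs. three dollars in 366 days.
print(discounted_value(1, 365), discounted_value(3, 366))
# -> ~0.00091 vs ~0.00273: wait the extra day for triple the payoff.
```

Notice that an exponential discounter (V = A · d^D) would never flip: the ratio between the two options is the same whether the delays are 0 and 1 days or 365 and 366 days. The hyperbolic shape is precisely what makes the preference reversal possible.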

This particular cognitive bias shows up in studies of how people save for retirement, borrow on their credit cards, procrastinate on important tasks, and deal with the consequences of addiction. Especially where hamburgers are concerned.

Tuesday, February 16, 2010

Requirements and Horses' Asses

There’s a common joke that makes the rounds of forwarded Internet email, claiming that the width of the booster rockets on the Space Shuttle is constrained by the width of an ancient Roman chariot (more colorfully, by “a couple of horses’ asses”). Here’s the linkage:

  • Why are the boosters the width they are? So they can fit through train tunnels.
  • Why are train tunnels the width they are? Because of the gauge of the track.
  • Why do we use that particular track gauge in the US? Because the early railroads were built by British engineers who used the British standard.
  • Why was that track gauge the British standard? Because it was based on existing horse-pulled mine cars that ran on rails.
  • Why was that the width of mine cars? Because it was the standard cart width.
  • Why was that the standard cart width? Because it was the spacing of ruts on British roads.
  • Who made the ruts? The Romans.
  • Why was that the spacing of the ruts? That was the standard width of a Roman chariot.
  • Why was that the standard width of a Roman chariot? Because the chariot was pulled by two horses.
  • Ergo, the requirement for the Space Shuttle booster rocket width was originally set by a couple of horses’ asses.

The story is exaggerated in its specifics, but it does point out a useful insight: The present is sometimes constrained by the past.

Master builder Robert Moses, famously opposed to public transportation, engineered bridges on New York’s Wantagh Parkway with an ulterior motive: he purposely ensured that they were too low for buses, and buses still do not run on his parkway. Similarly, we type on a keyboard with the top row spelling QWERTYUIOP, a jumble of letters arranged to separate common letter pairs so that the type bars on a manual typewriter would not jam as they struck the paper.

Why don’t we change to a more efficient system? Too much effort. Legacy costs. Infrastructure limitations. Inertia.

Thousands of business books talk about change resistance as some sort of failure. Your people – or managers – just aren’t flexible enough. Personal agendas undercut corporate improvement. You don’t have the right kind of corporate culture.

We can list numerous reasons, but it all comes down to physics. Inertia is as much a human and organizational concept as it is a law of the physical universe. And why wouldn’t that be true? It would be far more surprising if organizations were exempt from the law that drives everything else in the universe.

If resistance to change is just inertia wearing a clever plastic disguise, what does that teach us? Well, inertia is the tendency of an object at rest to stay at rest and, more importantly, the tendency of a body once in motion to stay in motion, unless acted upon by an outside force. To change an object’s momentum, you apply force over time (the product, impulse, equals the change in momentum). But no matter how flexible the management team, you can’t flip the Exxon Valdez around like a speedboat. Massive objects (large companies) are simply harder to move. There’s no point in getting frustrated about it, though that’s often the reaction. Instead, when contemplating which changes to pursue, be realistic about how much force you can apply and how long you can apply it.

If inertia applies in the corporate world, what about the other rules of physics? It’s easy to see the everyday effects of friction in human interactions. Why do meetings take so long? Why does it matter so much whether the people at work like each other or get along? Well, when moving parts rub up against one another, friction is the result. To overcome friction, you need lubrication: good manners, kindness, and a spirit of teamwork work wonders.

And, of course, there’s entropy, the tendency of systems to drift toward disorder. Things fall apart unless new energy is applied. That great new management initiative will deliver results only as long as you keep pumping energy into it. Decide you are done, and it immediately begins to unravel.

There are structural limits that reality places on change. Applying the physics concepts of inertia, friction, and entropy allows us to understand more deeply how organizations, systems, societies, and individuals work. They help us identify possible futures, constrain our analysis of alternatives, and point out the paths of least resistance along which history tends to flow.

Tuesday, February 9, 2010

Weak Faith, Strong Faith

As the old joke goes, when the only tool you have is a hammer, all problems look like nails. Applying a single frame to a variety of problems is one of the underlying characteristics of many cognitive biases. Science, law, and faith are three popular frames that cover complex and contentious issues ranging from abortion to evolution, from global warming to the value of prayer in the schools. Let’s look at how they work.

Science bears the same relationship to knowledge as law does to justice. It’s a process, and as such inherently imperfect. To compensate for its imperfections, we choose certain biases. In American law, “innocent until proven guilty” is such a bias. It’s a deliberate attempt to make errors fall more heavily in one direction than another. The injustice of allowing some guilty people to escape is, we believe, less than the injustice of imprisoning an innocent person. The appeals process supplies quality assurance and quality control. Even so, mistakes occur. Law does not provide perfect justice.

The process of science is based on the experimental method. Through repeatable experiments, we test hypotheses. From the results, we develop theories, and those theories are tested in turn. Over time, the scientific community reaches a consensus, and that’s the state of scientific knowledge at a particular point in time. Inertia applies: it takes a lot of work and effort to build a case that will persuade the scientific community to accept a change, but once the change is accepted, it tends to stay accepted.

The burden of proof necessarily rests on the side challenging the consensus. That’s only fair and proper. Suggesting the entire scientific community professes a certain position because they’re either suffering from mass delusion or outright corruption strains credulity. Saying they’re wrong, on the other hand, is fair game — if you have the evidence to back it up. And the scientist who successfully overturns the consensus apple cart to create a new paradigm goes down in the history books. That’s a powerful incentive for change.

We must remember, on the other hand, Carl Sagan’s dictum: “The fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses. They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers. But they also laughed at Bozo the Clown.” Science does not provide perfect knowledge.

I seldom take the time to study all the scientific literature on any issue, and in many areas would be unable to analyze it properly if I did. Nor do I take the time to study the details of a legal case prominent in the papers, and often don’t know the underlying law well enough to analyze it correctly if I did. I have to take the results on faith.

Faith can be a bright, shining beacon of certainty in a murky world, or it can be a weak thing, provisional and tentative. The first kind of faith trumpets itself as the possessor of Truth Revealed. (Or as Mark Twain put it, “[Man is] the only one who’s got the true religion — several of them.”) The idea of certainty fills a deep-seated human need, and I understand the temptation. Uncertainty is inherently uncomfortable. The gospel of peace is the comfort of knowing. But faith does not reliably produce perfect truth, any more than science or law does.

Provisional faith is faith without certainty. Yes, perhaps all of reality could be an elaborate illusion, but that’s not the way to bet. We each make hundreds of assumptions as we go through each day, and we have no practical choice in the matter. Faith in some things is not optional. On the other hand, we’ve been wrong before. Like law and science, faith requires biases to work, along with a skeptical mind to provide checks and balances.

In place of justice, we have law. In place of knowledge, we have science. In place of truth, we have faith. Certain faith is easily visible in the religious sphere, but true believers are found in science and law and in every other discipline that attempts to standardize the chaos of reality.

Faith has a fundamental place in human affairs, but a weak and skeptical faith is worthy of more respect than the strong and bright kind. When someone is strong in his or her faith, certain of his or her rightness, absolute in knowledge of Truth, it’s time to hold onto your wallet. Unchecked, the gospel of certainty becomes the opiate of its adherents.

Comfort can come at far too high a price.