Asked and Answered
Management needs fewer fads, more reflection.
Jeffrey Pfeffer, PhD ’72, and Robert I. Sutton would like to foment a little revolution—one in which leaders in business and the world at large base their decisions on facts and logic, not ideology, hunches, management fads or poorly understood experience. Pfeffer, the Thomas D. Dee II Professor of Organizational Behavior, and Sutton, a professor of management science and engineering and, by courtesy, of organizational behavior in the Graduate School of Business, are the authors of Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-Based Management (Harvard Business School Press, 2006). Stanford asked them about bringing more reason to organizational life.
What’s some of the total nonsense that occurs in companies?
Sutton: Probably the biggest single problem for human decision making is that when people have ingrained beliefs, they set a much higher bar for evidence on things they don’t believe than on things they do believe. Confirmation bias, I think, is what social psychologists call it. Organizations can have amazingly good evidence, but it has no effect on the decisions they make if it conflicts with their ideology.
Do you have a favorite unsupported belief?
Pfeffer: One would be stock options. There are more than 200 studies that show no evidence that there is a relationship between the amount of equity senior executives have and a company’s financial performance. . . . Just as you would never bet on a point spread on a football game because it encourages bad behavior, you should not reward people for increasing the spread in an expectations market.
Overreliance on financial incentives of all sorts drives all kinds of counterproductive behavior.
Evidence-based management derives from evidence-based medicine. Explain what kind of decision making we’re talking about.
Pfeffer: Almost any decision you make about any sort of intervention can be evidence-based, in the sense that you can try to assess what the research and evaluation literature demonstrates about it. It’s a way of thinking more scientifically and systematically.
Sutton: The way a good doctor or a good manager works—we call it the attitude of wisdom—is to act with knowledge while doubting what you know. So if a patient goes to a doctor, you hope the doctor would do two things: first look at the literature and make the best decision given what’s available. Then actually track the progress of the treatment and see what unexpected side effects you’re having and what things are working.
And that same thing can be what happens with management. Harrah’s is an interesting case because then-COO Gary Loveman took what was known about how to make service-based organizations more effective and implemented it at Harrah’s. Casinos produce lots of data, and Loveman was determined to base his decisions on that data and to do small experiments. He soon discovered that much of the conventional wisdom in the gaming industry was wrong. [For example: After he learned that local residents, not high rollers from elsewhere, were some of Harrah’s most profitable customers, Harrah’s tried promotions that de-emphasized free rooms and food in favor of free chips.]
Does evidence-based management minimize the role of people’s experience or their intuition?
Sutton: We’re not saying scientists should replace people who practice the craft. Medicine and management are both crafts, and you can only learn to do them well by doing them over and over. But an individual doctor can’t tell if Vioxx causes heart problems because one person’s practice isn’t big enough to determine that evidence. It’s similar if you’re one manager thinking about making a merger decision, or implementing ERP [enterprise resource planning], or putting in incentive-based pay; a good manager will look at larger evidence in terms of informing the decision.
Why is experimentation underused in business?
Pfeffer: There’s a tendency for people to read a book or to go to a seminar and believe that if what they’ve learned is worth doing, it’s worth doing everywhere all at once. As opposed to saying, “Here’s an idea: maybe it works; maybe it doesn’t; maybe it could work, if we tweaked it a little.” Only by trying it in a few places and not trying it in other places and trying it in different ways can you actually learn from experience.
Hewlett-Packard experimented with different forms of incentive pay: they said it may work; it may not work; we’ll try it; we’ll try it in various places, in different ways in different places. Then Carly Fiorina came in and threw out all the knowledge they had learned. But they had learned a lot. It wasn’t a huge top-down thing or a following-that-fad thing—it was the spirit of inquiry, of thinking like a scientist.
Your book has lots of good words to say about failure.
Sutton: I hate it when failure happens, but it’s a necessary evil.
Pfeffer: You have to be willing to fail. If you’re playing golf and every ball goes into the cup, that means you’re either Tiger Woods or you’re standing too close to the cup.
Everyone talks about learning organizations, but if you’re learning, you’re obviously doing things that—by definition—you’re not really good at. You’re therefore going to fail. If I say to you, the first time you get on your bike, “You have to do it perfectly or I’ll cut off your legs,” you will never learn to ride. If I want you to learn, I have to tolerate your not being good at first.
Sutton: So empirically and logically, there’s no way around a high failure rate. Dean Simonton, a creativity researcher at UC Davis, has studied all the greatest people and compared them to their ordinary contemporaries. Artists, scientists, composers—he’s got sample after sample. He always ends up finding that the people who are most successful or famous in their fields don’t have any lower failure rate than everyone else—they just do more.
Pfeffer: Effort and persistence are really undervalued.
Sutton: To put in a plug for the new Stanford design school—that’s one of the things we’re teaching students in terms of understanding a rapid-prototype culture, to just get out there and do stuff.
Pfeffer: But learn as they’re doing.
You warn that decisions need to be driven by evidence, not ideology. What organizational behavior in the world at large has the evidence clearly in place, but the evidence is going unheeded?
Sutton: Exhibit 1 is the field of education. We do the same stupid things over and over. The two extreme cases are incentive-based pay for teachers and social promotion. Incentive-based pay is actually very powerful and has a huge effect on behavior, but it’s so powerful that it has unintended consequences—like motivating teachers to cheat to boost test scores. Social promotion—the passing ahead of kids who aren’t qualified—goes against a lot of American values. But it’s the lesser of two evils. I think there are like 70 or 80 studies that show the same thing: that when you start holding kids back, test scores go down, dropout rates go up, school costs go way up as classes get bigger.
In a chapter that should warm the hearts of wage-earners everywhere, you say work shouldn’t be different from the rest of life, that people should not have to adapt to constrained corporate roles. Why not?
Pfeffer: It takes an enormous amount of psychological effort to go to work and remember that you can’t be who you are and have to be someone you aren’t. The example that opens the chapter is wonderful: an executive who, early in her career, was a very gregarious, boisterous, loud person kept hearing, “This is not how you behave here.” So she decided to find a place where she could be herself, or to create a place where people can be who they are.
DaVita is a very serious business—they do kidney dialysis—but the company encourages clients and employees to have fun. People always ask, how can you be boisterous and have games and dress up for the kidney dialysis center on Halloween? It’s a very serious business, and they suggest that that’s why it has to be fun: if you don’t relieve stress and tension with a certain amount of joy, it becomes just overbearing and burdensome. I think the biggest reason DaVita has the best mortality rates in the kidney dialysis industry is exactly this attitude, because emotions are contagious.
Sutton: There’s other damage besides the cost to individual authenticity. By letting people be themselves, organizations gain new roles and skills that actually help the organization in the long term . . . . At Microsoft, the guys who were working on DirectX were repeatedly told to stop working on it. Then DirectX became a huge business. Authenticity shines through despite, not because of, top management.
Pfeffer: And that’s relevant to strategy. You can’t always anticipate what’s going to be successful, you can’t always plan how markets will evolve, so you need to create opportunities for learning by doing.
Excerpted by permission of Harvard Business School Press. HARD FACTS, DANGEROUS HALF-TRUTHS, AND TOTAL NONSENSE: Profiting from Evidence-Based Management, by Jeffrey Pfeffer and Robert I. Sutton. Copyright © 2006 Jeffrey Pfeffer and Robert I. Sutton. All Rights Reserved.
CULTIVATING AN ATTITUDE OF WISDOM
Wisdom, not intelligence, is probably the most important talent for sustaining organizational performance. Organizations need people who think quickly and well when they work alone on problems with known correct answers—that is what IQ tests measure. But having people who know the limits of their knowledge, who ask for help when they need it, and are tenacious about teaching and helping colleagues is probably more important for making constant improvements in an organization, technical system, or body of knowledge. Also, as research on intelligence suggests, such wise actions help people become smarter and smarter.
Our first big clue about the wisdom-performance link came when we studied IDEO. Founder and chairman David Kelley attributes IDEO’s success to “hiring some smart people and getting out of the way.” But that is only part of the story. The rest of the story is that Kelley can get out of the way because IDEO has intertwined cultural values, work practices, and rewards—a system—that requires little intervention from senior management. Kelley deserves much credit for designing and tinkering with this system during the 20-plus years that he served as CEO. Gwen Books, who was Kelley’s assistant for over a decade, once told us, “David never stops thinking about IDEO, it is like a prototype he is constantly redesigning in his head.” The genius of the design is that IDEO was, and still is under current CEO Tim Brown, largely run and policed by peers.
One of the main reasons that IDEO’s system works so well is the attitude its people have toward knowledge. This attitude of wisdom is essential for practicing evidence-based management. Wisdom is about “knowing what you know and knowing what you don’t know.” This attitude enables people to act on their (present) knowledge while doubting what they know, so they can do things now, but can keep learning along the way. Wise people realize that all knowledge is flawed, that the only way to keep getting better at anything is to act on what you know now, and to keep updating.
This table displays key elements of how wise (and unwise) people think and act in organizations. These elements stem from theory and research, but an episode at IDEO provides perhaps the best summary and explanation. Robert Sutton was sitting with two engineers, Larry Schubert and Roby Stancel, who were talking about designing a device for Supercuts, a chain of hair salons that specializes in inexpensive, fast haircuts. They were discussing a device that could be attached to an electric razor to vacuum away cut hair. We were in front of Rickson Sun’s workstation. Rickson looked mildly disturbed as he shut his sliding door to muffle the noise of our meeting—a futile gesture because his stylish cubicle had no roof and low walls. Rickson still looked a bit annoyed when he emerged minutes later to tell us that he had once worked on a product with key similarities to the device Larry and Roby were designing—a vacuum system that carried away the fumes from a hot scalpel that cauterized skin during surgery. He also brought out a report describing different kinds of plastic tubing sold by vendors. Larry Schubert commented, “Once Rickson realized he could help us, he had to do it, or he wouldn’t be a good IDEO designer.”
This simple episode illustrates the attitude of wisdom and why it enables people to keep learning and systems to keep getting better. Larry and Roby are smart people, but knew that if they acted like know-it-alls, the design would suffer. They deferred to Rickson’s knowledge. They reacted with a kind of confident humility we saw many times at IDEO. When Rickson offered to help, they knew and he knew that to improve the design they had to listen to him, and follow up on his offer to help in the future.
IDEO may sound like a warm and fuzzy organization, but no matter how smart designers are, if they refuse to work cooperatively and act unwisely, the consequences are swift and unpleasant. Every now and then, IDEO hires designers who act as if taking time to help others—and asking for help—aren’t part of their job. These selfish designers become the subjects of nasty gossip. They are teased, shunned, and given boring work. Some learn to be wise. Those who don’t are treated as if they are invisible; there is no need to fire them, because they realize they had better leave.
IDEO is just one organization that fosters and displays an attitude of wisdom. Others include Southwest Airlines, the Toyota Production System, and U.S. Navy aircraft carriers with remarkable safety records. Once you start studying organizations where people keep learning and moving forward, and where systems keep getting better instead of inducing the same errors again and again, you see the attitude of wisdom.
ENCOURAGE PEOPLE TO BE NOISY AND NOSY—IT PROMOTES WISDOM
Here is a trick question. Imagine that you just had a major operation and are given a choice: Do you want to stay in a nursing unit that administers the wrong drug or the wrong amount, or forgets to give the right drug, about once every 500 patient days, or would you rather be in a unit that blunders 10 times as often? In the mid-1990s, Harvard Business School’s Amy Edmondson was doing what she thought was a straightforward study of how leader and coworker relationships influence errors in eight nursing units. Edmondson, and the Harvard physicians funding her research, were flabbergasted when nurse questionnaires showed that the units with the best leadership and the best coworker relationships reported making 10 times more errors than the worst!
Puzzled but determined to understand this finding, Edmondson brought in another researcher to observe these nursing units. Edmondson didn’t tell this second researcher about her findings, so he wasn’t biased. When Edmondson pieced together what this researcher observed with her findings, she realized that better units reported more errors because people felt psychologically safe to do so. In these units, nurses said “mistakes are natural and normal to document” and “mistakes are serious because of the toxicity of the drugs, so you are never afraid to tell the nurse manager.” In the units where errors were rarely reported, nurses said things like “The environment is unforgiving, heads will roll.” The physicians who helped sponsor her research changed their view of medical errors 180 degrees. They no longer saw errors as purely objective evidence, but partly as a reflection of whether people are learning from and admitting mistakes or trying to avoid blame and, in the process, possibly covering things up.
Edmondson and her colleagues have since done multiple studies on how hospitals, surgical teams, doctors, and nurses learn from problems and errors, which reveal much about the talents and behaviors that promote wisdom. Especially pertinent is a study of nurses that examined 194 patient care failures, everything from problems caused by broken equipment to drug treatment errors. Edmondson and colleague Anita Tucker concluded that the nurses whom doctors and administrators saw as most talented unwittingly caused the same mistakes to happen over and over. These “ideal” nurses quietly adjust to inadequate materials without complaint, silently correct others’ mistakes without confronting error-makers, create the impression that they never fail, and find ways to quietly do the job without questioning flawed practices. These nurses get sterling evaluations, but their silence and ability to disguise and work around problems undermine organizational learning. Rather than these smart silent types, hospitals would serve patients better if they brought in wise and noisy types instead.
This table lists these talents of wisdom. All of these characteristics help people act on what they know, and keep improving their own skills, peers’ skills, and organizational practices and procedures. The crux is, if you want better performance instead of the illusion of it, you and your people must tell everyone about problems you’ve fixed, point out others’ errors so all can learn, admit your own errors, and never stop questioning what is done and how to do it better. These actions can annoy doctors and administrators—or any other authority figure—who prefer quiet and compliant underlings, but if we want organizations that do as much good and as little harm as possible, these talents are essential.