Putting Psychology to Work in the Public Sector

Three new books offer public officials insights into how people make judgments — and how institutions can make better decisions.

Psychology is having a moment. Over the course of the past three decades, academic psychologists have used ingenious experiments to produce striking insights into how people think, form commitments, and make decisions. During the 1990s, their work had a profound influence on academic disciplines such as economics, challenging the assumption that people should be understood as rational calculators of self-interest. Today, it’s transforming another field — the field of public policy.

Public officials routinely make difficult decisions of great importance. Sometimes, these decisions are fundamentally political: they involve deciding between competing — if not conflicting — values. However, government officials also make many decisions that are analytic. They attempt to predict future events and plan for them. They make decisions about whom to hire. They assemble committees or working groups that hold meetings and develop solutions to complicated problems. Most public officials try hard to do these things well. Read the literature on psychology and decision-making, though, and an inescapable conclusion emerges: most executives go about handling these matters in ways that are all wrong.

Every day, people make thousands of decisions automatically and well. A psychologist would say we make them intuitively. But public sector leaders — and indeed, people in general — also routinely confront decisions that would benefit from more sophisticated thinking. Unfortunately, we often fail to recognize those circumstances, defaulting instead to the decision recommended by our intuition. Where intuition is informed by practice and experience, that can work well. An emergency room doctor’s intuition, honed by years of practice, is an excellent example.

However, psychology has established that when it comes to truly complicated decisions, intuition is not our friend. This insight is rooted in the work of two psychologists in particular, Daniel Kahneman and Amos Tversky. Beginning in the 1970s, Kahneman and Tversky conducted a series of experiments documenting the ways people making decisions fall prey to predictable biases. In 2002, that work earned Kahneman the Nobel Prize in economics. In his book Thinking, Fast and Slow, Kahneman brings those insights to a broader audience. The result is a comprehensive taxonomy of how our intuitions mislead us, and how we might go about making better decisions.

Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion extends these insights to the world of politics. In it, Haidt argues that politics is largely about moral “tastes,” not about whose policies are analytically stronger. Understanding intuition, not marshaling facts, is how Haidt believes public officials should proceed.

Duncan Watts’s Everything Is Obvious*: How Common Sense Fails Us shifts the focus from individuals to organizations. It describes the perils of what psychologists call “hindsight bias” and discusses how organizations can combat it. Read together, these books offer public officials a guidebook to how we think — and how we can think better. At a time when partisanship, ideological rigidity, and personal attacks are ever more common, these are strategies no public official can afford to ignore.

Intuition and Decisions

Daniel Kahneman’s Thinking, Fast and Slow starts by introducing us to our minds. Note the plural: Kahneman argues that we should actually think of ourselves as having two minds, System 1 and System 2. System 1 is basically our intuition. It operates quickly and automatically. It handles the thousands of decisions we make every day with amazing ease. Almost everything we do is governed by System 1. But there is one thing System 1 is not good at — making difficult, analytic decisions. Those tasks are handled by System 2, the conscious, reasoning part of our mind that engages when we do calculations or work through a thorny problem.

We have been trained to believe that System 2 — the conscious, reasoning part of our minds — is basically in charge. But two generations of research have shown that reason seldom plays a role in our decision-making. In fact, we go to great lengths to avoid reasoning. People “find cognitive effort at least mildly unpleasant and avoid it as much as possible,” notes Kahneman. Using System 2 is tiring. Force people to think, and they might even get nasty.  “People who are cognitively busy,” writes Kahneman, “are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.”

Perhaps it’s just as well, then, that few of us are cognitively busy. Instead, most of us rely on intuition — System 1 — to make even those decisions that should be made by System 2. Expert intuition, informed by years of practice that produce superior pattern recognition, can make certain types of judgments fast and reliable. But when confronted with more complex scenarios, such as predicting world events, expert intuition fails. Numerous experiments have shown that experts are more confident about their predictive abilities than laypeople but achieve no higher rate of success. The average reader of the New York Times is about as likely as Thomas Friedman to predict the future of the Middle East correctly.

Much of Kahneman’s book is devoted to describing the different kinds of systematic intuitive errors people make. A series of succinct chapters marches readers through such tendencies as the “halo effect,” the tendency to ascribe impressions of observed attributes to unobserved ones (“He arrived promptly at our meeting and was well prepared. We should hire him as our new IT person.”); the anchoring effect, whereby the first figure mentioned strongly influences subsequent estimates or offers; availability bias, the tendency to worry about the fears that come to mind quickest, without regard for actual probabilities; the exposure effect, whereby sheer repetition increases favorable feelings; and many more.

The most powerful bias, however, is confirmation bias. We are tenacious in searching out evidence that validates our prior beliefs and ferocious in critiquing arguments that challenge them. We also tend to be overconfident. These tendencies can be dangerous for organizations considering new initiatives. Fortunately, Kahneman has clear recommendations on how to correct them. Say we are putting in place a new IT system. Instead of developing a plan and estimating the likelihood of success based on our assessment of our team’s competency and the challenges we must surmount, Kahneman would have us seek out another fact: how often do similar projects come in on budget? We may be 80 percent confident that we can bring the project in on schedule and under budget, but if only 20 percent of people in similar positions succeed, Kahneman believes we should fire up System 2 and accept that our chance of success is closer to half and half than we might want to believe.

The way most organizations run meetings is something Kahneman (along with virtually every other organizational psychologist) would change. Who isn’t familiar with this scenario? An important decision must be made, and so a working group is formed. Then a meeting is called where the decision is made. This is where things go wrong. The first person speaks. Immediately, the anchoring effect, confirmation bias, the halo effect and other biases begin to work. Instead of bringing challenging outside perspectives to the meeting, participants begin to cluster around the first opinions expressed. Instead of considering diverse opinions and a broad range of possible outcomes, the group considers only a narrow range of options.

Kahneman would have such meetings run in a very different fashion: he would have the meeting organizer ask participants to deliberate in advance and then bring short, written recommendations to the meeting. The result is a broader range of recommendations, less influenced by bias and social pressure, and, psychologists agree, better decision-making.

Kahneman’s book is replete with insights and advice. However, the overall picture that emerges of people as decision-makers isn’t very hopeful. We jump to conclusions quickly. We ignore evidence that doesn’t fit our story. We are slow to seek out disconfirming evidence. Kahneman coins an acronym to describe it: WYSIATI — What you see is all there is. We are suggestible. What most of us describe as reasoning is really just rationalizing. By identifying the circumstances under which we make such errors, Kahneman hopes to give readers the tools to recognize when System 1 may be leading them to bad decisions or judgments. However, he’s not particularly sanguine about the likelihood that he will succeed.

“Correcting your intuitive predictions is a task for System 2,” says Kahneman. Unfortunately, “[s]ignificant effort is required to find the relevant reference category, estimate the baseline prediction, and evaluate the quality of the evidence. The effort is justified only when the stakes are high and when you are particularly keen not to make mistakes.”

Indeed, most of us have a hard time recognizing that we are relying on an untrustworthy shortcut in the first place. “There is no simple way for System 2 to distinguish between a skilled and a heuristic response,” he acknowledges. “Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent.” 

Justification and Moral Judgments

Psychologist Jonathan Haidt, the author of The Righteous Mind: Why Good People Are Divided by Politics and Religion, shares Kahneman’s belief that we should think of the mind as two systems. But where Kahneman speaks of System 1 and System 2, Haidt likens our mind to an elephant with a rider atop it. The rider (which corresponds to Kahneman’s System 2) is not in control of the elephant (System 1). Like Kahneman, Haidt believes that reason’s most common role is justification, not judgment; judgment happens automatically and subconsciously. But while Kahneman sees this as a serious problem, Haidt appears unconcerned that the job of the rider is “to serve the elephant.” To some extent, that reflects his different area of focus. Kahneman is concerned primarily with decision-making and how people err in making complicated choices. Haidt is interested in a different type of judgment — moral judgments.

Haidt makes a provocative claim about morality. He argues that moral judgments — and, by extension, political judgments — are like tastes. Just as some people prefer sweet food and others sour, so too do some people prefer to concern themselves with notions of harm and fairness while others are more concerned with liberty, loyalty, authority, and sanctity. Democrats have a taste for the first two values; Republicans have a more pronounced taste for the other values. Political partisans also differ in their conception of fairness. For Democrats, the term summons up images of equality. For Republicans, fairness is about proportionality, meaning people should get what they deserve based on what they’ve done.

The implications of Haidt’s argument are disquieting. Forget trying to convince someone to accept your political opinion via argumentation. Would someone who likes sour food really expect to convince someone who prefers sweet food by making an argument that sour food is tastier? Much of what passes for political discourse is, in Haidt’s view, a waste of time. Or rather, it is dialogue that serves a different purpose — not convincing the other side that you are right but rather binding partisans more tightly to their political teams by sharing common narratives. 

Group cohesion has many virtues — particularly in small, competitive hunter-gatherer societies. Unfortunately, this tendency serves society circa 2000 BC better than it serves society circa 2000 AD. Binding oneself to one narrative tends to blind people to other narratives. Fortunately, there are more effective ways for politicians to approach the task of persuasion.

“If you want to change people’s minds, you’ve got to talk to their elephants,” Haidt writes. “The main way that we change our minds on moral issues is by interacting with other people…. If there is affection, admiration, or a desire to please the other person, then the elephant leans toward that person and the rider tries to find the truth in the other person’s arguments.”

Of course, this is something the most partisan among us are loath to do. Grappling with a different political perspective means firing up the dorsolateral prefrontal cortex — the part of the brain that handles reasoning — and using System 2. That is tiring and unpleasant. In contrast, tuning in to the Rush Limbaugh Show or to Rachel Maddow provides a delightful dopamine release from the ventral striatum, the brain’s pleasure center.

“The partisan brain has been reinforced so many times for performing mental contortions that free it from unwanted beliefs,” writes Haidt. “Extreme partisanship may be literally addictive.” 

Hindsight Bias

The title of sociologist Duncan Watts’s book Everything Is Obvious*: How Common Sense Fails Us comes with an asterisk. That asterisk directs the reader to an upside-down footnote on the bottom of the cover that reads, “Once You Know the Answer.” This, in short, is a book about hindsight bias. 

Consider one of the business world’s most famous case studies, the story of Sony’s Betamax vs. VHS. Sony was the first company to introduce a device that could record and play video. The alternative technology embraced by its rivals, VHS, was inferior in many ways. However, it was cheaper and supported by more manufacturers. As readers of a certain generation know, VHS won. A similar story about how Apple lost out to Microsoft seemed to unfold in the early days of personal computing. As a result, when Apple launched the iPod twelve years later, many analysts predicted that it would soon be routed in the marketplace by an MP3 player that was cheaper and more “open.” It didn’t happen. But that hasn’t stopped many technology experts from arguing that it’s only a matter of time until the iPod and the iPhone succumb to Android or some other more “open” device.

There’s one problem with this argument. As the strategy consultant Michael Raynor has pointed out, the most common interpretation of why Sony’s Betamax lost is actually wrong. Sony expected people to use their Betamax machines not to watch movies but to tape shows. Sony turned out to be wrong, but not as a result of bad planning. Indeed, most readers of this article probably have a digital video recorder attached to their television today. Sony’s strategy wasn’t unsound. Rather, Sony was simply unlucky.

The idea that luck and contingency rather than skill and effort play large roles in determining the success or failure of most undertakings is not a congenial one. People make sense of the past not by acknowledging complexity but by telling stories. The simpler the story, the more persuasive it tends to be. This can create a dangerous dynamic. When something goes horribly awry in the public sector, there is inevitably a desire for so-called “accountability.” Hindsight bias lends false clarity to what was unclear at the time; the desire to tell a simple story takes over. Government agencies that have confronted these tendencies often respond by becoming even more bureaucratic.

“Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions — and to an extreme reluctance to take risks,” writes Kahneman in Thinking, Fast and Slow.

Hindsight bias and our tendency to embrace simple stories that establish causation can also lead organizations to embrace the wrong types of leaders. “Leaders who have been lucky are never punished for having taken too much risk,” notes Kahneman. “Instead, they are believed to have flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.”

Hope for Organizations

Kahneman and Haidt have written fascinating, expansive books that will change the way readers approach decision-making and, in the case of Haidt’s book, the arts of politics and persuasion. (Watts’s more focused book, while also thought-provoking, is better suited for the management aficionado.) All three authors offer the hope that self-awareness will lead readers to correct their behavior. All three also know this is unlikely to occur very often. Organizations rather than individuals may be the better place to look for hope.

“We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play,” writes Haidt. Kahneman agrees: “Organizations are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures. Organizations can institute and enforce the application of useful checklists, as well as more elaborate exercises, such as reference-class forecasting and the premortem,” whereby decision makers are asked to write a description of a policy’s or initiative’s failure before it is implemented in order to identify potential problem areas.

Watts offers numerous suggestions for how organizations can avoid the kinds of decision-making errors that Kahneman catalogues. One of the approaches he champions is known as “strategic flexibility.” Instead of developing a single plan for the future, Watts, following Raynor, argues that organizations should formulate a portfolio of strategies, with the strategy ultimately chosen reflecting the individual circumstances the organization faces.

This, of course, is complicated and expensive, and it doesn’t always work. Watts suggests that some organizations might fare better with a different approach, “measure and react.” It’s an approach closely associated with the Spanish clothing retailer Zara, which is known for quickly identifying emerging fashion trends and producing clothes that reflect them, often in a matter of weeks rather than years.

Haidt shares Watts’s hopes for organizations. “If you put the individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system,” Haidt writes. “This is why it’s so important to have intellectual and ideological diversity within any group or institution whose goal is to find the truth (such as an intelligence agency or a community of scientists) or to produce good public policy (such as a legislature or advisory board).”

At a time when the public sector is under unprecedented pressure to perform better with fewer resources, these are insights that no public sector leader should ignore.

Caroline Cournoyer is GOVERNING's senior web editor.