Consensus Building and the Unshakable Rightness of Belief

Anyone who’s worked at building consensus on public policy knows the frustration of trying to reason with someone who just won’t change a position or even consider alternative possibilities. They may refuse to accept any evidence that seems to disprove their positions and become aggressive and disruptive in the face of challenges. Sometimes it’s possible to write off this unshakable dissenter as an oddball individual, well known to the rest of the group as such. But in a collaborative process, each person represents a specific interest and has an important role to play in reaching agreement. A careful response is needed to move dialogue in a productive direction.

A number of recent studies and explanations are helping to clarify some of the possible reasons for intransigence of this type. In doing so, they’re also bringing out the fact that such behavior is not so unusual. We’re living in a time of increasing polarization of views on politics and public policy, and it’s especially important to understand the tenacity of extreme opinion and its impacts on consensus building and other forms of collaborative work.

In this and the next few posts, I want to explore ideas about why people can hold so firmly to existing beliefs no matter what contradicting information they may have, and also about practical steps that facilitators and collaborative leaders can take in response. This post provides brief overviews of the ideas of Elizabeth Bader, drawn from psychoanalytic theories of human personality, and the recent work of William Eggers and John O’Leary (If We Can Put a Man on the Moon), which draws on neuroscience research for some of its conclusions about obstacles to problem-solving in government.

These ideas address different levels of group interaction and, taken together, offer extremely helpful guidance for finding effective strategies. They do this by focusing on the non-rational elements that pervade consensus building groups. As Robert Benjamin has pointed out, the interest-based, joint gains negotiation model assumes that rational analysis will be decisive in formulating agreements. The dynamics of group interaction, however, are much more complicated and require an understanding of very human but quite non-rational behavior.

Personal Identity Issues

Elizabeth Bader, an attorney, mediator and psychologist, has recently offered a model of the mediation dynamic that draws on psychoanalytic theories. Her essay, The Psychology of Mediation: The Mediator’s Issues of Self and Identity, summarizes the first part of her longer academic paper of the same name. It places the tendency to cling to pre-established positions and demands in the context of human personality while also offering specific strategies for managing the problem during mediation.

To simplify drastically, she describes the ways in which many individuals grow up with a damaged sense of self. Instead of dealing with people from an integral and secure self-awareness, they have vulnerabilities, perhaps shame, that drive them to strengthen their sense of identity externally in careers or relationships. This can lead them to confuse their own identity with the integrity of a position they bring to the negotiating table. It’s not just an issue. It’s become part of who they are.

That reality can prompt them to present themselves to the group aggressively and with an exaggerated sense of their own power. Their positions may come across as absolute demands, and they can appear intractable from the outset. Considering alternative ideas for resolving an issue – an approach based on the needs of everyone in the group – is the last thing they’re interested in. That process might be too threatening to their personal identity and provoke aggressive and disruptive behavior.

Bader sees this phase of inflated expectations as the first part of a cycle that, ideally, moves beyond this personality-based imbalance. The mediator, or perhaps interaction with other members of the group, needs to bring such a person around to see that meeting the initial expectations isn’t going to happen.

They have to separate themselves from their rigid position and look at the situation more realistically as one that includes the interests of many parties, all of whom have to be dealt with. Bader refers to this realization as deflation. It marks the moment of letting go of the original demands and becoming open to new options. Resolution of the issues can then take place. She calls this the IDR cycle, referring to the successive states of inflation, deflation and realistic resolution.

A critical role for the mediator or facilitator is to provide a supportive presence that helps a participant separate self-identity from the issues. An effective method is to mirror back to a participant the content of their communication and thus assure the individual of a respectful and understanding response. That support helps reduce the perception of threat from the group, a perception that often leads to aggressive and disruptive behavior.

To provide that support, mediators need to be sensitive to these tendencies in their own interactions with a group. Inevitably, some participants press a mediator’s buttons and stir reactions relating to purely personal emotional history. The important thing is to be able to recognize what’s happening, let go of those associations with personal identity and focus on the group’s needs in the moment. She recommends the practice of meditation and mindfulness to cultivate this ability.

Her paper is full of valuable insights – which I’ve only touched on here – and it’s an important resource for its practical advice. Personal identity, though, is only one important way of understanding resistance to collaborative decision-making.

Confirmation Bias

William Eggers and John O’Leary examine evidence from neuroscience that helps explain the blindness of highly rational people to information that contradicts a course of action they’ve already decided on. This is one part of their extraordinary study of government efforts to solve major problems, If We Can Put a Man on the Moon.

They call it the Tolstoy Trap: the resistance to accepting or trusting information that disproves what people already firmly believe to be true. Their vivid writing reviews many failed government policies that resulted, at least in part, from the refusal to pay attention to evidence contradicting an action the leadership already believed in. These critical mistakes range from the Bay of Pigs invasion in the 1960s to price controls for fighting inflation in the 1970s, the Challenger disaster of the 1980s and many more.

Everyone uses a cognitive filter to sift through new information, often to find the facts to support what they believe. That’s why a consensus building group may find members drawing exactly opposite conclusions from the same data. Where one might find confirmation of the value of public health care services, another can find only evidence of bureaucratic blundering and inefficiency. One member may see proof for the perfect solution to a water supply problem, another may read in the same data a government invasion of private rights.

This is a phenomenon that’s been well known for centuries, but evidence from neuroscience has recently mapped the specific brain areas involved in this process of selective response – also called motivated reasoning. These studies bring out the role of emotion in shaping interpretation.

One study by Drew Westen and others (downloadable here at note 26) gave both positive and negative information about 2004 presidential candidates to committed partisans. As information was being evaluated, neuroimaging indicated that regions of the brain associated with rational analysis were inactive and only emotion-related areas were busy with the problem.

Partisans also applied disparate standards for evaluating information. They readily accepted supportive evidence for a favored candidate, but the threatening evidence favorable to the opponent was subjected to a much stricter test and easily discredited. The conclusion was that a distinctive form of reasoning comes into play when there is a strong emotional stake in the outcome. There is an emotional need to have an existing belief confirmed.

Like many mediators and facilitators, I’ve often heard the force of this emotional dimension in the way inflexible participants respond to technical information. The next question is what to do about it.

Most of the suggestions offered by Eggers and O’Leary are already used in interest-based consensus building and mediation. They recommend, for example, such steps as building agreement on data prior to consideration of issues, beefing up the option-generating process, and strict use of scientific principles in analyzing information. These are sound ideas, but they assume that more rational analysis will persuade people who are not responding to that mental model. They also suggest role plays that ask participants to advocate positions they oppose, and that seems more promising, as it gets at the emotional aspects of persuasion.

Another approach might be to start working with stakeholders before any proposals are formulated or collaborative groups are convened. As the Open Government websites at federal agencies are now doing, stakeholders and members of the public can be asked for ideas on possible approaches. A transparent process involving exchanges among stakeholders, including agency staff, could then be used to refine and synthesize the full range of suggestions into practical options. A diverse stakeholder group could be convened at that point strictly to review the appropriate data and technical methods that would be used to analyze the feasibility of each option. Recommending proposals or making decisions would thus be separated from these earlier phases. It’s the prospect of an imminent decision that immediately raises the stakes and intensifies the emotional propensity of advocates to interpret information in distorted ways.

There are a lot of other ideas about why people resist changing their views, and I’ll discuss more of them in the next post. After that, I’ll pull together a resource list of the methods suggested by each of these approaches.

What are the techniques you believe are most effective?


2 Responses to “Consensus Building and the Unshakable Rightness of Belief”

  1. Very interesting article. Thanks for the mention. Our MA health care example is all about bringing stakeholders in early, before the ideas are fully baked. Looking forward to your next post.

  2. Thank you!

    I hope to write more soon about your great new book.