The psychologist Carl Rogers
observed 60 years ago that a major barrier to effective communication was the natural tendency to evaluate and judge the words and actions of others.
He saw this happening not just on an individual level but also in public policy collaboration, where those at the table usually represent institutions and communities rather than personal interests. Miscommunication caused by quick judgments and reactions occurs in this context as well, simply because the evaluative process is such an ingrained human habit of thought.
Consider this example: Fred has decided to participate in a collaborative process but is very skeptical and suspicious about getting involved. He doesn’t trust the intentions of the convening organizations. When he arrives at the first meeting, he sees a poster-sized set of bullet points among other displays. On it, he reads one phrase that confirms his suspicions that the lead organizations have already decided a major policy issue.
He sits through the meeting saying nothing but not believing a word of what he hears about the collaboration process itself. He knows it’s a sham. At the end, he leaves and decides that if he doesn’t get a satisfactory explanation at the next meeting, he’ll walk out. A friend of his offers a different explanation of the poster, but Fred brushes that idea aside, saying, “I know what I saw.”
In fact, Fred was mistaken. But what exactly was the process he went through in forming his opinion, and what can be done to correct it?
The process leading to the judgment and reaction is a little more complicated than it appears at first glance.
What the Mind Is Doing
Even though we usually evaluate, judge and react instantaneously, there are several steps that can be broken out. Each one offers an opportunity for intervention. Consider what probably ran so rapidly through Fred’s mind.
First he perceived something: the bullet phrase on the poster.
Then he assigned a meaning to what he read. To do this, he interpreted the phrase as the core of a policy statement by filling in the rest of the sentence. That is, he perceived what he expected to see by fitting the words into a familiar pattern. (We do this all the time. For example, the mind takes a few bits of visual data, completes a pattern and “sees” a friend while looking at the half-visible face of a stranger.)
Fred at once made a judgment about the intent and motive behind the “policy statement.” The sponsors of the process had already made up their minds. The idea of collaboration was a sham.
He then reacted internally, feeling alienated, on the one hand, but also vindicated in his suspicion and distrust of the process.
Then he came up with a response to what he “knew.” He said nothing to the group, turned his mind off during the rest of the meeting and decided what he would do to expose what was “really” happening.
How to Intervene
If Fred had said something at the meeting, it would have been possible to explain his mistaken assumption and to try changing his opinion. Since he was silent, it took a while for the facilitator and others to understand what had happened and then respond. But whether the intervention is done on the spot as the judgment is made or sometime later, the steps to persuade someone to reconsider a judgment are similar.
The technique is to work with the participant to check the accuracy of each step leading to the formation of a reactive action or strategy.
Ask the person to describe the perception – in this case just the words on the poster. Check to see if that is factually accurate – were those actually the words? In this case, the perception was accurate.
Ask for clarification of the interpretation assigned to those words. In this case, Fred had filled in some missing words to create a sentence that confirmed his suspicion. Does that check out? Taken in context of the rest of the poster, the phrase was meant to be descriptive of the present situation, not prescriptive of future policy. (It’s important to continue even though the error has been identified because Fred is still suspicious.)
Ask about the judgment reached – in this case, that the policy had already been decided by the conveners. That doesn’t check out either: No one had decided any policy. That was up to the group.
Ask what the reaction was – based on the faulty judgment, Fred now assumed this was a group with a predetermined outcome. In fact, this was not the case, though Fred will probably not drop his suspicions just yet.
Ask what his action in response will be. In this case, his demand for a full explanation at the next meeting turned out to be a good idea. It would allow other participants who might have walked away with the same interpretation and judgment to hear and discuss what the poster had actually been intended to mean.
This model and technique, like any, can’t be used in all cases, but I’ve found it useful for giving guidance to a group – not by lecture but by application – on the care they should exercise before jumping to conclusions. It can be helpful early on in a process to pick a minor exchange about something of no great consequence to point out how rapidly judgment can be formed on the basis of incomplete data or faulty interpretation. That’s one way of alerting participants about how effective communication can be derailed and how they might keep it from happening by remembering the basic rule: Check it out first.