
Wednesday, June 25, 2014

The "Talking Stick" and the three-legged stool of sustained, transformative healthcare QI

My recent posts have ruminated on what I see as the underappreciated necessity of focusing on the "psychosocial health" of the healthcare workforce as much as on policy reform (e.g., P4P, ACOs, PCMH) and process QI tactics (e.g., Lean/PDSA, Six Sigma, Agile), including the clinical QI Health IT-borne "predictive analytics" fruits of "Evidence-Based Medicine" (EBM) and "Comparative Effectiveness Research" (CER). Evidence of psychosocially dysfunctional healthcare organizational cultures is not difficult to find (a bit of a sad irony, actually). From the patient safety-inimical "Bully Culture" down to the "merely" enervating emotionally toxic, I place it squarely within Dr. Toussaint's "8th Waste" (misused talent). See my prior posts.
Add one more book to my accruing stash, Nicholas Epley's interesting "Mindwise: How We Understand What Others Think, Believe, Feel, and Want."

I finished "Mindwise" on my flight back to Oakland from Honolulu Monday afternoon. Thoroughly enjoyed it.

What is the "Talking Stick"?
A final challenge to getting perspective is that others’ words are unclear, leaving room for misinterpretation. The egocentric biases we discussed in chapter 5 make you believe you are communicating your thoughts, beliefs, attitudes, and instructions more clearly than you actually are. To really enable someone to understand what’s on your mind, you not only need to be clear, you need to be painfully clear. If you’re getting someone’s perspective, you not only need to listen, you need to verify that your understanding is correct. Native Americans reportedly had a method for doing this very thing, called the “talking stick.” When different tribes had a dispute, they would gather to discuss their differences. During these conversations, only the person holding the talking stick was allowed to speak. When that person finished, they would hand the stick to another elder, who would first have to reiterate the last speaker’s position to that person’s satisfaction. Only when the first person felt understood could the next person make a new point. The brilliance of this method comes not from its ability to enable speaking but, rather, from how it fosters listening. If you have to reiterate someone else’s point to their satisfaction, then you’ll find out if you’ve understood correctly or incorrectly.

Although the talking stick is commonly recommended on the public-speaking circuit, I have yet to see it used regularly in any modern household or organization...

Epley, Nicholas (2014-02-11). Mindwise: How We Understand What Others Think, Believe, Feel, and Want (Vintage) (pp. 180-181). Knopf Doubleday Publishing Group. Kindle Edition.
This goes to Covey's "seek first to understand, and then to be understood."

My first grad school course was "Argument Analysis." The methodology taught actually comprised two equally important sequential components: [1] forthright, comprehensive "argument analysis," followed by [2] equally forthright "argument evaluation." You cannot effectively evaluate a proffer until you fully understand it.

Talking Stick methodology.

You had to choose a refereed mainstream journal article to deconstruct and evaluate. I chose the 1994 JAMA paper "A Better Quality Alternative: Single Payer National Health System Reform" (pdf). The analytical method entailed numbering every sub-argument premise-to-conclusion element (I went with paragraph/sentence-clause enumeration), and then "flow-charting" each to depict the logic flow.

Tedious, in a word, after 49 paragraphs of painstaking analytical attention to logic flow and meaning. But, only after having honestly and accurately done so ("seek first to understand") could you move on to effectively "evaluate" the relative merits of each claim and how much weight of evidence they provided to the main "argument"/assertion.
Read the dotted lines as "despite" or "notwithstanding," and the solid lines as "because"/"therefore" in the direction indicated by the arrows (the arrowheads indicate the "therefore" or "consequently" conclusion).
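The numbering-and-flowcharting method lends itself to a simple data structure. A minimal sketch (the claims and names here are my own illustrative inventions, not the content of the JAMA paper or the course materials):

```python
# Minimal sketch of an argument map: numbered claims plus typed links.
# "therefore" stands in for the solid arrows; "despite" for the dotted lines.
claims = {
    1: "Administrative overhead consumes a large share of U.S. health spending.",
    2: "A single-payer system reduces administrative overhead.",
    3: "Critics argue single-payer limits patient choice.",
    4: "Single-payer reform would improve quality per dollar spent.",
}

links = [
    (1, 4, "therefore"),  # solid arrow: premise supports conclusion
    (2, 4, "therefore"),
    (3, 4, "despite"),    # dotted line: conclusion holds notwithstanding
]

def supports(conclusion):
    """List the numbered premises bearing on a given conclusion."""
    return [(n, kind) for n, c, kind in links if c == conclusion]

print(supports(4))  # -> [(1, 'therefore'), (2, 'therefore'), (3, 'despite')]
```

Once every sub-argument is enumerated this way, "evaluation" becomes a matter of weighing each link rather than reacting to the conclusion.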
There has been a good bit of recent clamor for teaching "critical thinking" skills in the healthcare space. You can readily find numerous books on the subject; to wit:

CHAPTER 1. What Is Critical Thinking, Clinical Reasoning, and Clinical Judgment?

This chapter at a glance
Critical Thinking: Behind Every Healed Patient
Critical Thinking: Not Simply Being Critical
Rewards of Learning to Think Critically
How This Book Helps You Improve Thinking
    Brain-Based Learning
    Organized for Novices and Experts
What's the Difference between Thinking and Critical Thinking?
Critical Thinking: Some Different Descriptions
    A Synonym: Reasoning
    Common Critical Thinking Descriptions
Critical Thinking, Clinical Reasoning, and Clinical Judgment
    Applied Definition
Problem-Focused Versus Outcome-Focused Thinking
What about Common Sense?
What Do Critical Thinkers Look Like?
Critical Thinking Indicators (CTIs)
What’s Familiar and What’s New?
    What’s Familiar
    What’s New
4-Circle CT Model: Get the Picture?
Thinking Ahead, Thinking-in-Action, Thinking Back (Reflecting)
Putting It All Together
Critical Thinking Exercises
Key Points/Summary
Having taught Critical Thinking and Argument Analysis at the collegiate level for a number of years (great fun), I've often thought that this might be an area wherein I could serve as a short course training consultant to healthcare organizations. I could teach undergrad Critical Thinking without ever once having to refer to a text or notes.

Below, my summary "logic molecule" illustration.

But, as my recent studies have pointed out to me anew, it's not just about the mechanics of applied rational reasoning -- deduction, induction, astute recognition of structural/syllogistic, rhetorical, and statistical fallacies, the lexical and semantic nuances of "definition," the "scientific method," and so forth. It's not just about me, or you. It's about us. That is, it's equally about interpersonal relations and mutual perceptions -- organizational dynamics. It's about "culture."

It's about "Humble Inquiry," about being "Mindwise," about the nurturing of the mutual-accountability "Just Culture" necessary for a collegial, high-engagement, high-performance interdisciplinary team-based workforce.

All of which goes necessarily to "Leadership," as leaders are the only ones with the requisite authority -- the ones who ultimately set and enforce the tone of organizational culture for better or worse. "Critical thinkers" in a psychosocially toxic organization may well simply be seen as insubordinate troublemakers.

It's about authenticity at every level within an organization, and the nurturing of a healthy culture that supports it.
High morale, engagement, and openness to the ongoing rigors of process improvement and effective, high-cognitive-burden teamwork simply require it.

Some random clips from the breadth of "Mindwise"

Arguably, your brain’s greatest skill is its ability to think about the minds of others in order to understand them better… the kind of mind reading you do intuitively every day of your life, dozens of times a day, when you infer what others are thinking, feeling, wanting, or intending. The kind that enables you to build and maintain the intimate relationships that make life worth living, to maintain a desired reputation in the eyes of others, to work effectively in teams, and to outwit and outlast your competitors…

During the day in between, you easily recognize that your employees are clueless, but are sure your boss thinks you’re brilliant. You sense that your coworker is lying when he calls in sick but are confident that your clients are honest when they claim to love your work...

Fortunately for improvement’s sake, the mistakes we make trying to understand the minds of others are predictable and therefore correctable. Our mistakes come from the two most basic questions that underlie any social interaction. First, does “it” have a mind? And second, what state is that other mind in?

We can make mistakes with the first question by failing to engage our mind-reading ability when we should, thereby failing to consider the mind of another and running the risk of treating him or her like a relatively mindless animal or object. These mistakes are at the heart of dehumanization. But we can also make mistakes by engaging our ability when we shouldn’t, thereby attributing a mind to something that is actually mindless. These mistakes are at the heart of anthropomorphism…

Once we’re trying to read the minds of others, we can make mistakes with the second question by misunderstanding others’ thoughts, beliefs, attitudes, or emotions, thereby misunderstanding what state another mind is in. Our most common mistakes come from excessive egocentrism, overreliance on stereotypes, and an all-too-easy assumption that others’ minds match their actions…

Our species has conquered the Earth because of our ability to understand the minds of others, not because of our opposable thumbs or handiness with tools.

In fact, this ability forms the backbone of all cooperative social life. This is why those with greater social sensitivity have stronger friendships, better marriages, and are happier with their lives in general. At work, leaders do better when they have some sense of whether or not their instructions are being understood...

[T]here can be a significant disconnect between what people think about themselves and how they actually behave...

Understanding the associative nature of your brain is absolutely essential for understanding why it’s hard to know some aspects of your own mind. Over the course of evolution, your genetic code has inherited certain associative networks that help keep you alive long enough for you to pass along your genetic material through children…These positive associations, however, can make it hard to know yourself accurately...

Decades before psychologists made any of these discoveries about the full reach of unconscious processes, Carl Jung said, “In each of us there is another whom we do not know.”… We don’t understand ourselves perfectly well because we have access to only part of what’s going on inside our heads...

The reason is that we introspect about our own minds in the same way we do about the minds of others: by using a theory that makes sense of our own behavior even when we lack direct access to the actual causes of it. It works quickly and automatically, and it simply doesn’t account for what you don’t know…

What’s surprising is how easily introspection makes us feel like we know what’s going on in our own heads, even when we don’t. We simply have little awareness that we’re spinning a story rather than reporting the facts...

The important point here is that the stories we tell about the workings of our minds rely on the same mind-reading abilities we use to make sense of the minds of other people… The only difference in the way we make sense of our own minds versus other people’s minds is that we know we’re guessing about the minds of others. The sense of privileged access you have to the actual workings of your own mind— to the causes and processes that guide your thoughts and behavior— appears to be an illusion...

Illusions matter not simply because they are interesting but because they are consequential. The ability to introspect—“ to feel ourselves thinking,” as William James put it— creates an illusion that we know our own minds more deeply than we actually do. This illusion has one disturbing consequence: it can make your mind appear superior to the minds of others...

[W]ithout awareness of all of the constructive processes going on in your brain that allow you to see color, it seems to you that color exists out there in the world, rather than inside your own head— that the color red is actually on the apple in front of you, rather than simply appearing red to you because of the magic done by your neural connections. This creates what psychologists refer to as naïve realism: the intuitive sense that we see the world out there as it actually is, rather than as it appears from our own perspective.

If a person thinks he or she sees the world as it actually is, then what happens when he or she meets someone who sees the world differently? When your friend tells you that the red apple is brown, who do you think needs to visit the eye doctor? Naïve realism suggests an answer: they do. It calls to mind a famous line of George Carlin’s: “Have you ever noticed that everyone driving slower than you is an idiot, and anyone going faster than you is a maniac?”

Arguing about the color of an apple or speed on the highway is relatively trivial, but arguing about abortion rights, religion, same-sex marriage, gun control, or any other important issue on which opinions diverge is a serious matter, with conflict inflamed by the fuel of naïve realism. If the illusions you hold about your own brain lead you to believe that you see the world as it actually is and you find that others see the world differently, then they must be the ones who are biased, distorted, uninformed, ignorant, unreasonable, or evil. Having these kinds of thoughts about the minds of others is what escalates differences of opinion into differences worth fighting (and sometimes dying) for…

A more accurate understanding first requires the recognition that your judgment could be wrong, or could at least be wrong more often than you might think...

Even doctors— those whose business is to treat others humanely— can remain disengaged from the minds of their patients, particularly when those patients are easily seen as different from the doctors themselves. Until the early 1990s, for instance, it was routine practice for infants to undergo surgery without anesthesia. Why? Because at the time, doctors did not believe that infants were able to experience pain, a fundamental capacity of the human mind. “How often we used to be reassured by more senior physicians that newborn infants cannot feel pain,” Dr. Mary Ellen Avery writes in the opening of Pain in Neonates. “Oh yes, they cry when restrained and during procedures, but ‘that is different.’” Doctors have long understood infants as human beings in the biological sense, but only in the last twenty years have they understood them as human beings in the psychological sense… Engaging with the mind of another person depends not only on the type of person you are but also on the context you are in...


Many African traditions speak of a concept known as ubuntu: “a person is a person through other persons.” Your humanity comes from the way you treat others, the idea goes, not the way you behave in isolation. Humanity comes from treating others as human beings, not in the biological sense of having a fully human body but in the psychological sense of having a fully human mind. I have spent the last fistful of pages explaining how good people like you and I can, under the right circumstances, remain disengaged from the minds of others and thereby treat them as relatively mindless. By failing to engage our capacity to understand the minds of other people, we not only become indifferent to them, we risk losing some of our own humanity.

But this is not a book about social justice; it is a book about social understanding. Engaging more directly with the minds of others can not only make you behave more humanely toward others, it can make you behave more intelligently in the presence of others as well...

Every business leader is charged with getting things done through people. This requires understanding what actually motivates people in their jobs. This is an obvious mind-reading problem: What do my employees really want?

Leaders have two kinds of incentives at their disposal: intrinsic and extrinsic. Intrinsic incentives are anything inherent to the job itself, such as the pleasure of accomplishing something worthwhile, learning new things, developing skills, or feeling proud about your work. Extrinsic incentives are outcomes that are separable from the job itself, such as getting paid, earning fringe benefits, getting a bonus, or having job security. Notice that the effect of extrinsic incentives on other people can be observed directly because it involves an obvious exchange of goods for services, whereas the impact of intrinsic incentives can really only be felt and experienced on the inside. You can see that both you and others work harder when money is at stake, but the metrics of pride and meaning and a sense of self-worth are emotional states that you feel rather than see. As a result, you can recognize intrinsic motivations more easily in yourself than in others...

Galileo may have removed the Earth from the center of the universe, but every person on this planet is still at the center of his or her own universe. As Galileo knew, to see the world accurately, you need to look in the right place and then view it through the right lens. These are two pieces of wisdom that you and I can easily forget…

One consequence of being at the center of your own universe is that it’s easy to overestimate your importance in it, both for better and for worse.

Your own beliefs serve as a lens for understanding what others are likely to believe, as well as how strongly they are likely to believe it. But your mind contains multitudes, and beliefs are not the only lens that can alter your perceptions. Knowledge can also do it. For example, read the following sentence:


Now please go back and count how many f’s appear in that sentence. This is important. I’ll wait for you.

How many did you find? More than you can count on one hand? If not, then we have just confirmed that you are a terrific reader but a terrible counter. Try it again. Look harder. I’ll be patient. Found all six yet? Don’t forget that “of” has an f in it. See them all now? Most people who read this sentence fail to spot all six of the f’s on their first pass. Instead, most see only three.

Why so few? This example has nothing to do with your beliefs and everything to do with your knowledge. Your expertise with English blinds you from seeing some of the letters. You know how to read so well that you can hear the sounds of the letters as you read over them. From your expert perspective, every time you see the word “of” you hear a v rather than an f and, therefore, miss it. This is why first graders are more likely to find all six in this task than fifth graders, and why young children are likely to do better on this than you did as well. Your expert ears are clouding your vision.
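A computer, having no phonetic shortcuts, finds every f on the first pass. Here is the classic version of the sentence commonly used for this exercise (Epley's exact wording may differ):

```python
# The classic six-f sentence used in this perception exercise.
sentence = ("Finished files are the result of years of scientific "
            "study combined with the experience of years.")

# Case-insensitive count: no "expert ears" to hear 'of' as 'ov'.
count = sentence.lower().count("f")
print(count)  # -> 6
```

Three of the six hide in "of," which is exactly where expert readers stop seeing them.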

This example illustrates what psychologists refer to as the curse of knowledge, another textbook example of the lens problem. Knowledge is a curse because once you have it, you can’t imagine what it’s like not to possess it. You’ve seen other people cursed many times. For instance, while on vacation, have you ever tried to get driving directions from a local? Or talked to an IT person who can’t explain how to operate your computer without using impenetrable computer science jargon? In one experiment, expert cell phone users predicted it would take a novice, on average, only thirteen minutes to learn how to use a new cell phone. It actually took novices, on average, thirty-two minutes.

The expert’s problem is assuming that what’s so clear in his or her own mind is more obvious to others than it actually is. Each of us has unique areas of expertise, but we are all experts on one issue very near and dear to us: ourselves. We live, work, and sleep with ourselves every day. We know what we looked like in the morning, how we felt yesterday, what we were doing five years ago.

The problem of expertise is one of many examples of mistakes that come from projecting our own minds onto others: assuming that others know, think, believe, or feel as we do ourselves. Of course, we do not project ourselves onto others completely. We do so in some situations more than in others, and we project more onto some minds than others. The less we know about the mind of another, the more we use our own to fill in the blanks.

Every student in a negotiation class learns that the secret to solving disputes is recognizing that the other side may not have completely opposing interests, and may have more overlap in interests than you would guess. Solving disputes therefore requires openly discussing each others’ actual interests, identifying similarities, and then identifying integrative solutions that maximize the benefits for both sides…

Sadly, negotiations over our differences rarely end so sensibly. When groups are defined by their differences, conflict is fought over the differences we imagine, suppose, or expect from others rather than over the genuine, multifaceted, and often more moderate differences that actually exist. When groups are defined by their differences, people think they have less in common with people of other races or faiths or genders than they actually do and, as a result, avoid even talking with them. When groups are defined by their differences, the minds we imagine in others may be more extreme than the minds we actually encounter.

If there’s anything surprising left to learn about stereotypes, however, it’s how quickly we drop them as soon as we go from reasoning about the mind of a group to the mind of an individual whose behavior we observe directly...

...a common sense that a person’s mind corresponds directly to that person’s actions, a systematic sense that psychologists refer to as the correspondence bias...

Failing to calibrate our sixth sense to recognize the power of the broader context can create considerable misunderstanding, from assuming that accidents were intentional to crediting people for successes beyond their control… Common sense suggests targeting people’s minds to change their actions, but many of these solutions are useless because they misunderstand the cause of the problems.

Egocentrism exaggerates the extent to which others’ minds match one’s own. Stereotypes can highlight differences at the expense of similarity. And others’ actions can prompt oversimplified assumptions about the minds behind them. These heuristics provide simple shortcuts for understanding the minds of others, but they come at the cost of oversimplifying them. Others’ minds are more complicated than your sixth sense often suggests.


In How to Win Friends and Influence People, one of the best-selling books of all time, Dale Carnegie lists a series of principles for how to do what his title promises. Principle 8, he writes, is a “formula that will work wonders for you.” The formula? “Try honestly to see things from the other person’s point of view.”...

The weakness of perspective taking is also obvious: it relies on your ability to imagine, or take, the other person’s perspective accurately. If you don’t really know what it’s like to be poor, in pain, suicidally depressed, at the bottom of your corporate ladder, on the receiving end of waterboarding, in the throes of solitary confinement, or to have your source of income soaked in oil, then the mental gymnastics of putting yourself in someone else’s shoes isn’t going to make you any more accurate. In fact, it might even decrease your accuracy… Overthinking someone’s emotional expression or inner intentions when there is little else to go on might introduce more error than insight. What’s more problematic is that if your belief about the other side’s perspective is mistaken, then carefully considering that person’s perspective will only magnify the mistake’s consequences. This is particularly likely in conflict, where members of opposing sides tend to have inaccurate views about each other...

We’ve now looked many times for evidence that perspective taking— actively trying to imagine being in another person’s circumstances— systematically increases mind reading and have yet to find any supportive evidence… The main issue is that carefully considering another’s perspective is no guarantee that you’ll be able to do it accurately...


Recognizing the limits of your sixth sense suggests a different approach to understanding the minds of others: trying harder to get another person’s perspective instead of trying to take it. As the old reminder to doctors trying to understand their patients goes, “The patient is trying to tell you what’s wrong with him. You have to shut up and listen.”

Consider an example of how perspective getting might work. In 1993, the U.S. government signed the “don’t ask, don’t tell” policy into law, banning gays and lesbians from serving openly in the military. By 2010, the Obama administration was considering the consequences of repealing the law. Moral implications aside, knowing how current soldiers felt about this repeal was essential for assessing its practical consequences. This is a textbook mind-reading problem, with at least two approaches to solving it.

One is exemplified by the 1,167 retired military officers who used their perspective-taking ability to imagine the consequences for current soldiers of repealing the law. In an open letter to President Obama and members of Congress, they expressed their strong opposition. “Our past experience as military leaders,” they wrote, “leads us to be greatly concerned about the impact of repeal on morale, discipline, unit cohesion, and overall military readiness. We believe that imposing this burden on our men and women in uniform would … eventually break the All-Volunteer Force.” Elaine Donnelly, president of the Center for Military Readiness, argued that this opposition must be taken very seriously. “They have a lot of military experience,” she said, “and they know what they’re talking about.”

The Pentagon took a second approach to this mind-reading problem. Its officials asked the soldiers their opinions directly by surveying 115,052 soldiers and 44,266 of their spouses in one of the largest studies in military history. The soldiers themselves expressed relatively few concerns. In fact, 70 percent believed that the repeal would have no effect or a positive effect on the military. More telling, roughly the same number (69 percent) said that they had worked with a gay service member already. Among those, 92 percent said it had no effect or a positive effect on the unit’s ability to work together. From these responses, Defense Secretary Robert Gates concluded that the repeal “would not be the wrenching dramatic change that many have feared and predicted.” Gates pushed for its repeal.

Who was right? In 2012, one year after the actual repeal of “don’t ask, don’t tell,” the military released a study of its consequences. The answer was clear: soldiers could speak their minds when asked directly, but the retired officers who’d imagined the soldiers’ reactions were wrong. The title of the press release says it best: “First Study of Openly Gay Military Service Finds ‘Non-Event’ at One-Year Mark.” Getting the soldiers’ perspective by asking them for it enabled understanding.

We communicate the contents of our minds primarily through language. As Daniel Gilbert writes in Stumbling on Happiness, “If you were to write down everything you know and then go back through the list and make a check mark next to the things you know only because someone told you, you’d develop a repetitive-motion disorder because nearly everything you know is secondhand.” This is why William Ickes, an expert on empathic accuracy, finds that “the best predictor [so far] of empathic accuracy appears to be verbal intelligence.” Knowing others’ minds requires asking and listening, not just reading and guessing...

The gains that come from getting perspective directly instead of guessing about someone’s perspective can be big...

Months before an explosion on the Deepwater Horizon oil rig caused the largest spill in history, a confidential survey of rig workers uncovered serious safety concerns but strong fears of reprisals for reporting those concerns. According to a report in the New York Times, “only about half of the workers interviewed said they felt they could report actions leading to a potentially ‘risky’ situation without reprisal.” One worker surveyed said, “The company is always using fear tactics. All these games and your mind gets tired.” To hold on to their jobs and avoid punishment, workers routinely kept quiet about obvious risks, even faking data in the company’s safety system to make the rig appear more stable than they knew it was. Under these conditions, the rig passed an internal safety inspection just one month before the disaster. Would this disaster have been averted if the company’s executives had been willing to hear what their employees knew? I’d bet on it.

Doctors have even discovered that opening up their minds and admitting their mistakes can actually reduce one of their biggest fears: litigation. In 2001, the University of Michigan Hospitals began a medical-error-disclosure program in which doctors openly admit their medical mistakes in meetings with patients, explain what led to the mistake, and then offer fair compensation. Compared to the no-disclosure policy practiced in the six preceding years, this open-apology system cut malpractice lawsuits in half (from 39 per year to 17 per year) and reduced the time to resolution by roughly 30 percent (from 1.36 years to 0.95 years). According to the lead doctor reporting these results, “Everybody worries that disclosure will lead to liability going through the roof, but here’s one institution that set up their disclosure program privately and independently, helped their patients avoid using the courts and tort system, and did not sustain the skyrocketing claims and costs that others might have predicted.” In fact, this program actually reduced overall liability costs by roughly 60 percent. The bigger problem had been requiring patients to imagine what their doctors were thinking, or having to sue to find out, rather than just allowing doctors to explain how a mistake happened.
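The reported figures check out arithmetically. A quick sketch using the numbers from the passage above:

```python
# Figures reported for the University of Michigan disclosure program.
lawsuits_before, lawsuits_after = 39, 17   # malpractice suits per year
years_before, years_after = 1.36, 0.95     # mean time to resolution (years)

suit_cut = 1 - lawsuits_after / lawsuits_before
time_cut = 1 - years_after / years_before

print(f"lawsuits down {suit_cut:.0%}")         # -> lawsuits down 56%
print(f"resolution time down {time_cut:.0%}")  # -> resolution time down 30%
```

A 56 percent drop is "roughly half," and the resolution-time figure matches Epley's "roughly 30 percent" exactly.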

Reducing litigation is good, but what the disclosure program really does, according to the medical center’s chief risk officer, Richard Boothman, “is give permission to doctors and other caregivers to do what’s important and what they want to do— take care of the patients and make sure the same error doesn’t ever happen again in the future.… When you break that paradigm of litigation and give patients the chance to understand the human element of the other side— of the doctor and what they are struggling with— you find that people are far more forgiving and understanding than has been typically assumed.” Now, that’s something your sixth sense might never have imagined...

Managers know what their employees think when they are open to the answers and employees feel safe from retaliation, not when managers use their intuition… If we want to understand what’s on the mind of another, the best our mortal senses can do may be to rely on our ears more than our inferences… Knowing the limits of our brain’s social sense does not always mean that we can overcome them to understand others better. Sometimes a sense of humility is the best our wise minds can offer, recognizing that there’s more to the mind of another person than we may ever imagine...

But even our greatest abilities are far from perfect, and our sixth sense’s mistakes also cause some of our life’s greatest pains. Broken relationships, failed organizations, stalled careers, and needless conflicts are common casualties… When we’re indifferent to others, it’s easy to overlook their minds altogether, treating such people more as relatively mindless animals or objects than as fully mindful persons… If understanding is your goal, then you know how to do much better.

Only by recognizing the limits of our brain’s greatest sense will we have the humility to understand others as they actually are instead of as we imagine them to be.
Indeed. Note from the foregoing the nuanced take on "empathy." You cannot simply intuit or infer others' states of mind, motives, thoughts, feelings, and needs. You mostly have to ask to get it right. Which goes to Humble Inquiry (Schein) and "Help Them Grow..." (Kaye & Winkle Giulioni). Openness and beneficent motives, while certainly necessary, are not enough. You need a Leadership-nurtured Just Culture for a thriving, high-engagement healthcare workforce. The high-cognitive-burden workflow processes (including those of health IT) and the demands of complex, rapid dx and px/tx decision-making are unlikely to get any simpler or easier.

In sum, I repeat: The stool needs three strong legs. One of them being a psychosocially healthy workforce.

A psychosocially healthy workplace is a significant profitability and sustainability differentiator.


Well, the full article is paywalled. You can get at it via registration with The JAMA Network Reader, but the text there is not copyable. Suffice it to say that this is one small, weak, equivocal study. It was conducted across a three-month period in 2012 at Brigham and Women's and their affiliated ambulatory clinics.

Of course, the red meat headlines will blare that "MU does not improve quality and wastes taxpayer dollars." This "study" does nothing whatsoever to buttress that claim.

More to come...


  1. Listen more than ask: I think that's at the core of what you express in the post and those 4 words are my motto, borrowed from the research director for Procter & Gamble. This huge consumer products company no longer conducts surveys or focus groups but instead listens to the naturally occurring conversations that consumers are having online. They stay competitive by listening. I love that and mention it every chance I get -- to policymakers, entrepreneurs, corporate leaders -- everyone.
