
Monday, November 21, 2016

Clinical workflow, clinical cognition, and The Distracted Mind

Finished the Gazzaley-Rosen book. It is excellent. I've put a link in my permanent right-hand links column. You should get The Distracted Mind and read it.


I find The Distracted Mind directly analogous to Dan Lieberman's also excellent book The Story of the Human Body, which I cited earlier in the year.


"Evolutionary mismatch" ailments, both physical and neurological/cognitive.

Below, another book I've begun that may have some relevant "Distracted Mind" implications.

The neuroscience of attention, despite having greatly advanced over the past few decades, remains too primitive to explain comprehensively the large-scale harvesting of attention. At most it can shed light on aspects of individual attention. But there is one thing scientists have grasped that is absolutely essential to understand about the human brain before we go any further: our incredible, magnificent power to ignore...

Wu, Tim (2016-10-18). The Attention Merchants: The Epic Scramble to Get Inside Our Heads (p. 19). Knopf Doubleday Publishing Group. Kindle Edition.
__

Among other things with respect to The Distracted Mind, I've been looking for any useful implications for clinical workflow. Apropos of that topic, the latest from Dr. Jerome Carter of EHRscience.com:
Workarounds, Disruptions, and Electronic Health Records

HITECH EHR incentives have been successful in driving EHR adoption. However, as more hospitals and practices have embraced HIT, the number of complaints of poor usability, workflow disruptions, and decreased productivity has grown. As a result, EHR systems have been one of the most important factors in bringing discussions of clinical workflow to the forefront. Of course, this does not mean that inefficient workflows did not exist prior to EHR systems, only that EHR systems provided sufficient contrast with known processes so that the differences became obvious.

Every clinical organization has policies and procedures that guide work activities. Strict adherence is rarely enforced, which gives those charged with carrying out said policies/procedures significant leeway in determining how they are done. If the form is filled out correctly, no one sweats the details. Electronic systems change this dynamic as they are supposed to “help.” As it turns out, the degree to which they help or hurt varies considerably. Some processes are more efficient, some less. Published studies on workarounds provide valuable information on how processes are affected by the presence of electronic systems…

While the research on workarounds reported in these papers is in reaction to EHR systems, its value goes beyond understanding EHR-related changes. Ultimately, the attention paid to common processes may prove to be more valuable. Why? EHR systems are designed to be patient information repositories, not clinical care assistants. As a result, supporting clinical work is seen as a data availability problem, not a process support problem. The underlying assumption is that providing data is the same thing as supporting processes. Workaround research demonstrates just how wrong this assumption is. Workarounds are workflow issues. Every workaround is an alternate path to the same goal.

Workflows consist of a series of steps and each step consumes or produces information, uses resources, and is performed by someone or something. If both groups of authors had rendered their findings in a formal process language (e.g., YAWL, BPMN, Colored Petri Nets) using acknowledged workflow patterns, their findings would have been easier to compare and possibly apply (say for software design).

As with workarounds, usability research has also exploded with EHR adoption. The most obvious usability issue with EHRs—they are designed to provide data, not support processes—remains underappreciated. Thankfully, this is slowly changing...
That's from the sample chapter of Dr. Carter's forthcoming book.
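
Carter's framing is concrete enough to render directly in code: a workflow is a series of steps, each of which consumes or produces information, uses resources, and has a performer. Below, a minimal Python sketch of that framing; the class names and the sample refill process are my own illustrative inventions, not Carter's notation or a formal process language like YAWL or BPMN.

```python
# A minimal rendering of Carter's "workflow step" framing: each step
# consumes/produces information, uses resources, and is performed by
# someone (a role) or something (a system). Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    performer: str                                 # someone or something
    consumes: list = field(default_factory=list)   # information inputs
    produces: list = field(default_factory=list)   # information outputs
    resources: list = field(default_factory=list)  # systems, time, supplies

@dataclass
class Workflow:
    goal: str
    steps: list

    def handoffs(self):
        """Step boundaries where the performer changes -- the seams
        where workarounds and dropped handoffs tend to show up."""
        return [(a.name, b.name)
                for a, b in zip(self.steps, self.steps[1:])
                if a.performer != b.performer]

# A hypothetical medication-refill process, expressed as data.
refill = Workflow(
    goal="medication refill completed",
    steps=[
        Step("triage request", "nurse",
             consumes=["patient message"], produces=["refill candidate"],
             resources=["EHR inbox"]),
        Step("approve refill", "physician",
             consumes=["refill candidate", "med list"],
             produces=["signed order"], resources=["EHR order entry"]),
        Step("transmit to pharmacy", "EHR system",
             consumes=["signed order"], produces=["e-prescription"],
             resources=["e-Rx interface"]),
    ],
)
print(refill.handoffs())
```

Render two studies' findings in a structure like this (or, better, in YAWL or BPMN proper, per Carter) and comparing them becomes a matter of diffing models rather than re-reading prose.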


What of "workflow"? I've cited Dr. Carter's work numerous times here. See, e.g., "Announcing the launch of Dr. Jerome Carter's Clinical Workflow Center." See also "Clinical workflow: "YAWL," y'all?"

Uber-geek Chuck Webster is never at a loss for answers. From a comment he left at EHR Science last month:
Chuck Webster, MD, MSIE, MSIS @wareflo October 10, 2016 at 10:07 AM
I like my definitions of workflow and workflow technology:

“I’ve looked at literally hundreds of definitions of workflow, all the way from a “series of tasks” to definitions that’d sprawl across several presentation slides. The one I’ve settled on is this:

“Workflow is a series of tasks, consuming resources, achieving goals.”

Short enough to tweet, which is why I like it, but long enough to address two important concepts: resources (costs) and goals (benefits).

So what is workflow technology? Workflow technology uses models of work to automate processes and support human workflows. These models can be understood, edited, improved, and even created, by humans who are not, themselves, programmers. These models can be executed, monitored, and even systematically improved by computer programs, variously called workflow management systems, business process management suites, and, for ad hoc workflows, case management systems.

Workflow tech, like health IT itself, is a vast and varied continent. As an industry, worldwide, it’s probably less than a tenth the size of health IT, but it’s also growing at two or three times the rate. And, as both industries grow, they increasingly overlap. Health IT increasingly represents workflows and executes them with workflow engines. Workflow tech vendors increasingly aim at healthcare to sell a wide variety of workflow solutions, from embeddable workflow engines to sprawling business process management suites. Workflow vendors strenuously compete and debate on finer points of philosophy about how best to automate and support work. Many of these finer points are directly relevant to workflow problems plaguing healthcare and health IT.

Why is workflow tech important to health IT? Because it can do what is missing, but sorely needed, in traditional health IT, including electronic health records (EHRs). Most EHRs and health IT systems essentially hard-code workflow. By “hard code” I mean that any series of tasks is implicitly represented by Java and C# and MUMPS if-then and case statements. Changes to workflow require changes to underlying code. This requires programmers who understand Java and C# and MUMPS. Changes cause errors...

Process-aware tech, in comparison to hardcoded workflows, is an architectural paradigm shift for health IT. It has far-reaching implications for interoperability, usability, safety, and population health.

BPM systems are ideal candidates to tie together disparate systems and technologies. Users experience more usable workflows because workflows are represented so humans can understand and change them. Process-aware information systems are safer for many reasons, but particularly because they can represent and compensate for the interruptions that cause so many medical errors. Finally, BPM platforms are the right platforms to tie together accountable care organization IT systems and to drive specific, appropriate, timely action to provider and patient point-of-care.”
“Workflow is a series of tasks, consuming resources, achieving goals.”

Ummm... a "blinding glimpse of the obvious"?

One of my old summary WKFL takes here, from my MU days with HealthInsight: "Workflow Demystified" (pdf).
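
As for Webster's "hard code" complaint, a toy Python sketch makes the contrast concrete (my own invented example, not any vendor's actual code): in the first version the task sequence is baked into if/else source, so changing the workflow means changing and re-testing code; in the second, the same sequence is editable data executed by a generic engine.

```python
# Toy contrast: hard-coded workflow vs. a process-aware (model-driven) one.
# All names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Patient:
    is_new: bool
    has_med_list: bool

def collect_demographics(p): print("collect demographics")
def take_vitals(p): print("take vitals")
def reconcile_meds(p): print("reconcile meds")
def document_encounter(p): print("document encounter")

# 1. Hard-coded: the sequence lives in if/else logic. Reordering steps
#    requires a programmer and a new software release.
def hardcoded_visit(p: Patient):
    if p.is_new:
        collect_demographics(p)
    take_vitals(p)
    if p.has_med_list:
        reconcile_meds(p)
    document_encounter(p)

# 2. Process-aware: the sequence is *data* that a trained non-programmer
#    could inspect and reorder; a generic engine dispatches the tasks.
VISIT_MODEL = [
    ("collect_demographics", lambda p: p.is_new),
    ("take_vitals",          lambda p: True),
    ("reconcile_meds",       lambda p: p.has_med_list),
    ("document_encounter",   lambda p: True),
]
TASKS = {f.__name__: f for f in
         (collect_demographics, take_vitals, reconcile_meds, document_encounter)}

def run_model(model, p: Patient):
    for name, applies in model:
        if applies(p):
            TASKS[name](p)

run_model(VISIT_MODEL, Patient(is_new=True, has_med_list=False))
```

Real BPM suites elaborate that second pattern with graphical editors, audit logs, and workflow engines, but the architectural distinction is the same.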

BTW, I have to take a bit of issue with this Chuck Webster assertion from above:
Most EHRs and health IT systems essentially hard-code workflow. By “hard code” I mean that any series of tasks is implicitly represented by Java and C# and MUMPS if-then and case statements. Changes to workflow require changes to underlying code.
That may be true for some really old "legacy" EHRs, but it's a bit of an overstatement in general. Yes, technically, if you want to change the layout order (or other aesthetic attributes) of various data cells on any given EHR tab or template, you might have to go "under the hood" and rewrite the source code (such was the case in my ancien lab apps coding days in Oak Ridge in the '80s).


Ugh. Wrote about that here (pdf), published in the EPA Conference Proceedings and the journal Radioactivity and Radiochemistry.

You"might have to" (alter source code). Newer software systems of all types now typically have built-in click/drag-and-drop/cut & paste functionality wherein the user authoring/editing interface serves as a "source code generator" (like, well, this very blogger.com authoring app I use for this and other blog postings. And, yeah. sometimes I have to go in and edit the html code here when something doesn't present as intended, given the bugs in the platform. But, rarely).

Beyond that, the actual workflows, in terms of getting a user to the intended data entry/review/update targets, are independent of the exact screen placement of individual data cells on a given screen.
To this point, once the end-user is adequately trained up past the "learning curve" on a given system, the precise screen location of individual data cells becomes a fairly trivial facet of workflow relative to the resource consumption of getting there and then entering data and moving on.
When I was working in Meaningful Use at my REC, every ONC-certified EHR system (of the 15 or so in my personal MU client caseload, anyway) had multiple keystroke/mouse-click paths via which to direct the user to each MU documentation compliance criterion. The workflow modification task, then, was that of minimizing path travel, greasing the workflow skids.
And, I used to grouse that all repetitive MU data destination paths (there aren't that many) could and should be "one-stroke," i.e., accomplished via a set of "macros" (common to virtually all major modern commercial software). In fact, I'd have made one-stroke/click MU target functionality a condition of EHR MU Certification.

Anyone recall the old Windows "recorder.exe" utility? Microsoft killed it; nowadays you have to buy (relatively inexpensive) 3rd-party macro utility apps. Still, given the workflow enhancement functionality, they're worth it in any "productivity treadmill" environment, particularly the irreducibly high-cognitive-burden settings of the medical clinic and hospital.
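
For what it's worth, the "one-stroke" idea sketches easily in Python using the third-party pyautogui keystroke-automation library; the "find field" hotkey and field name below are hypothetical, invented for illustration, not any real EHR's key sequence.

```python
# Sketch of a "one-stroke" MU documentation macro using pyautogui
# (pip install pyautogui), one of many inexpensive macro tools.
# The key sequence is hypothetical; a real macro would replay whatever
# path a given certified EHR actually requires.
import pyautogui

def macro_smoking_status():
    """Navigate to the (hypothetical) smoking-status field with one call,
    instead of hand-clicking the same multi-step path every visit."""
    pyautogui.hotkey("ctrl", "shift", "f")           # open a 'find field' box
    pyautogui.write("smoking status", interval=0.02)  # type the target field
    pyautogui.press("enter")                          # jump to it

macro_smoking_status()
```

Bind something like that to a single hotkey in a macro utility and a repetitive multi-click MU documentation path becomes the one-stroke action argued for above.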
__

BACK TO THE DISTRACTED MIND


The Gazzaley-Rosen book comprises three thematic sections spanning its eleven chapters: [1] what neuroscience has learned about the evolution and functionality of the brain, including the central adaptive elements of cognition; [2] the increasingly adverse impacts of our 24/7 (mostly now digital) information overload ("information foraging") first-world culture; and [3] what, if anything, can be done to mitigate or otherwise counter those adverse impacts.
The Distracted Mind is not a pseudo-science book that offers colorful brain scans and questionable neuroscience as a way of making a topic appear more credible. In this book, we apply our complementary scientific lenses to present timely and practical insights. Dr. Adam Gazzaley is a cognitive neuroscientist and a trailblazer in the study of how the brain manages distractions and interruptions. Dr. Larry Rosen is a psychologist who has studied the “psychology of technology” as a pioneer in this field for more than thirty years. Our complementary perspectives focus on demonstrating why we fail to successfully navigate our modern technological ecosystem and how that has detrimentally affected our safety, cognition, education, workplace, and our relationships with family and friends. We enrich this discussion with our own research and scientific hypotheses, as well as views of other scholars in the field, to explain why our brains struggle to keep up with demands of communication and information. 

We present our perspectives in three parts. In Part I, we will take you on a tour through new insights into why our “interference dilemma” exists in the first place and why it has become so relevant to us now. We describe how the very essence of what has evolved furthest in our brains to make us human—our ability to set high-level goals for ourselves—collides headfirst with our brain’s fundamental limitations in cognitive control: attention, working memory, and goal management. This collision results in our extreme sensitivity to goal interference from both distractions by irrelevant information and interruptions by attempted multitasking. This noise degrades our perceptions, influences our language, hinders effective decision making, and derails our ability to capture and recall detailed memories of life events. The negative impact is even greater for those of us with undeveloped or impaired cognitive control, such as children, teens, and older adults as well as many clinical populations. We further discuss why we engage in high-interference-inducing behaviors from an evolutionary perspective, such that we are merely acting in an optimal manner to satisfy our innate drive as information-seeking creatures.

In Part II, we will share a careful assessment of our real-world behaviors and demonstrate how the collision described in Part I has been intensified by our constant immersion with the rich landscape of modern information technology. People do not sit and enjoy a meal with friends and family without checking their phones constantly. We no longer stand idle in waiting lines, immersed in thought or interacting with those next to us. Instead, we stare face down into virtual worlds beckoning us through our smartphones. We find ourselves dividing our limited attention across complex demands that often deserve sustained, singular focus and deeper thought. We will share our views of why we behave in such a manner, even if we are aware of its detrimental effects. Building a new model inspired by optimal foraging theory, we explain how our high-tech world perpetuates this behavior by offering us greater accessibility to feed our instinctive drive for information as well as influencing powerful internal factors, such as boredom and anxiety. We are most certainly ancient brains in a high-tech world. 

Finally, in Part III we offer our perspectives on how we can change our brains to make them more resilient, as well as how we can change our behavior via strategies to allow us to thrive in all areas of our lives. We first explore the full landscape of potential approaches available to us—from the low-tech to the high-tech—that harness our brain’s plasticity to strengthen our Distracted Mind. This in-depth examination includes traditional education, cognitive training, video games, pharmaceuticals, physical exercise, meditation, nature exposure, neurofeedback, and brain stimulation, illustrating how in these fascinating times the same technologies that aggravate the Distracted Mind can be flipped around to offer remediation. We then share advice on what we can do from a strategic perspective to modify our behavior, without abandoning modern technology, such that we minimize the negative consequences of having a Distracted Mind. Using the optimal foraging model introduced earlier in the book as a framework to approach behavioral change, all of the strategies we offer are practical and backed by solid science.
The Distracted Mind will enlighten you as to how and why our brains struggle to manage a constantly surging river of information in a world of unending interruptions and enticements to switch our focus. We extend this perspective to explore the consequences of this overload on our ability to function successfully in our personal lives, on the road, in classrooms, and in the workplace, and address why it is that we behave in this way. Critically, we provide solid, down-to-earth advice on what we need to do in order to survive and thrive in the information age.

Gazzaley, Adam; Rosen, Larry D. (2016-09-16). The Distracted Mind: Ancient Brains in a High-Tech World (Kindle locations 117-151). The MIT Press. Kindle Edition.
They seriously deliver. Actual neuroscience, in lieu of "neurobabble neurotainment."

Distracted Mind stuff that goes specifically to the workplace and "workflow":
WORKPLACE INTERFERENCE 
For those of us who work with technology and are surrounded by other employees working with their technologies, interference has become the norm. We are constantly interrupted by others dropping by our desk to chat or attempting to connect with us through a variety of technological communication modalities, including the most popular workplace tool—email. A study by Judy Wajcman, a sociology professor at the London School of Economics, highlighted this phenomenon by shadowing eighteen employees of an Australian telecommunications company during their entire workday. Wajcman selected this company because it was designed to facilitate interactions between workers with open-plan offices and other external distractors, including many large television screens mounted around the office. The employees in this study spent only half their workday on actual “work episodes,” which included any and all work-related activities. Strikingly, most of these work episodes lasted ten minutes or less, with an average of just three minutes per work episode. And even more interesting, nearly two-thirds of the work episode interruptions were self-generated, and most of those involved some form of mediated communication using a technological device. In fact, of the approximately eighty-six daily changes in an employee’s work activity, the workers themselves generated sixty-five of them internally, with the vast majority involving “checking in” with no obvious external alert or notification. Even without the “You’ve Got Mail” notification, these workers checked their email anyway and continued to check other sources of electronic communication and information without being externally directed to do so. 

Whether directed externally via an alert or notification or internally by an unseen process, it appears that in the work environment email and other communication modalities bear a major responsibility for interruptions. One field study that followed workers for two weeks discovered that they were interrupted 4.28 times per hour by email and an additional 3.21 times by instant message communications. And these communications appeared to have a strong draw for the employees, since 41 percent of them responded to the email immediately and 71 percent responded to an instant message immediately. On average, the workers spent ten minutes dealing with the alerts and then took an additional ten to fifteen minutes to return to their appointed task, often visiting several other applications in the interim. Another study by the research group ClearContext indicated that more than half of the 250 workers they queried spent over two hours a day reading and responding to email. A study out of Loughborough University in England found that after dealing with an email, which itself took an average of just under two minutes, it took the studied workers an average of 68 seconds—more than half of the time required to read and respond to that email—to return to their work and remember what they were doing. This study also found that people are responding like Pavlov’s dogs to incoming email communication, waiting only an average of one minute and forty-four seconds to open that message. Strikingly, 70 percent of those alerts were attended to within six seconds, which is about the time it takes a phone to ring three times. And yet another study found that even without an alert, while one in three people claimed to check their email every fifteen minutes, they actually checked it about every five minutes. We are self-interrupting and not even aware of how often we are diverting our attention from our main task—in this case, our job—to another task that may be completely unrelated to work... [ibid, pp. 113-115]
Most of us harbor the naive conceit that we are adroit "multitaskers." The science says otherwise; reading this book should change our minds.


All very interesting, all of it. I would make this book required reading for Health IT "UCD/UX" designers (the "User Centered Design" / "User Experience" peeps). I would also add clinical management to the "required readers" list, given that the implications go squarely to policy, not just InfoTech UX design. A progressive, workflow-friendly EHR that comes out of the development oven only to get deployed in a high-interruption/distraction work environment may well have its slick UX negated anyway.

Another aspect to bear in mind: it has become widely fashionable these days to wax rhapsodic over our putatively increasingly "post-EHR" world, a world dominated by "mHealth" (mobile health apps), with patients busily burying their docs in user-generated metrics (of wildly variant SOAP utility and data QA pedigrees) and random queries.

More interruptions and distractions (beyond the intractable "interoperability" issues). Just what clinicians need and want.

From the closing chapter of The Distracted Mind:
11 MODIFYING BEHAVIOR 
IT SHOULD NOW be abundantly clear that we live amidst a level of high-tech interference that in the past decade or so has dramatically changed the world, and along with it our thoughts, feelings, and behaviors. In Part II we explored the many ways in which modern technology has aggravated our Distracted Mind; from awakening in the morning until trying to fall asleep at night, we are tempted by technological distractions and interruptions. As we have shown, three main game changers—the Internet, smartphones, and social media—have forever altered our mental landscape. We have painted a detailed picture based on solid research from a variety of fields showing that we are spending our days switching from task to task and affording each only our divided attention. 

Recall the cognitive control limitations that we presented in chapter 5 in the domains of attention (selectivity, distribution, sustainability, processing speed), working memory (capacity, fidelity), and goal management (multitasking, task switching). As described, high-tech influences stress these limitations in just about every possible way: they challenge our attention abilities via frequent distractions, fragment our working memory and diminish its fidelity through interruptions, and drive us to excessive multitasking and task switching, all of which introduce performance costs. In terms of the MVT model, introduced in Part I and elucidated in chapter 9, modern technology has caused this by diminishing the time in which we remain engaged with an information source, causing us to shift to another patch before we have exhausted the information in our current source. We are like a squirrel with an attention disorder, constantly jumping from tree to tree, sampling a few tasty morsels and leaving many more behind as he jumps to the next tree, and the next and the next. It sounds exhausting, and, as we have shown, it is negatively affecting our safety, relationships, school and job performance, and mental health... [ibid, pp. 213-214]
CODA

From the final passages of "The Attention Merchants."
The past half century has been an age of unprecedented individualism, allowing us to live in all sorts of ways that were not possible before. The power we have been given to construct our attentional lives is an underappreciated example. Even while waiting for the dentist, we have the world at our fingertips: we can check mail, browse our favorite sites, play games, and watch movies, where once we had to content ourselves with a stack of old magazines. But with the new horizon of possibilities has also come the erosion of private life’s perimeter. And so it is a bit of a paradox that in having so thoroughly individualized our attentional lives we should wind up being less ourselves and more in thrall to our various media and devices. Without express consent, most of us have passively opened ourselves up to the commercial exploitation of our attention just about anywhere and anytime. If there is to be some scheme of zoning to stem this sprawl, it will need to be mostly an act of will on the part of the individual. 

What is called for might be termed a human reclamation project... [Wu, Tim op cit, location 6471]
____________

More to come...

1 comment:

  1. “Workflow is a series of inputs, tasks, decisions, and outcomes consuming resources, that might achieve goals.”
