NOW REPORTING FROM BALTIMORE. An eclectic, iconoclastic, independent, private, non-commercial blog begun in 2010 in support of the federal Meaningful Use REC initiative, and Health IT and Healthcare improvement more broadly. Moving now toward important broader STEM and societal/ethics topics. Formerly known as "The REC Blog." Best viewed with Safari, Firefox, or Chrome.
NOTES: The Adobe Flash plugin is no longer supported. Comments are moderated, thanks to trolls.
You really owe it to yourself to listen carefully to this entire 76-minute podcast.
"In this episode of the Waking Up podcast, Sam Harris speaks with computer scientist Stuart Russell about the challenge of building artificial intelligence that is compatible with human well-being."
I've posted here on "AI vs IA" topics before. At about 61 minutes into this podcast interview, they discuss issues that remind me of the postcapitalism "Four Futures" speculative alternative scenarios articulated by Peter Frase. BTW: this excellent essay has finally been released in expanded book format.
"Peter Frase argues that increasing automation and a growing scarcity of resources, thanks to climate change, will bring it all tumbling down. In Four Futures, Frase imagines how this post-capitalist world might look, deploying the tools of both social science and speculative fiction to explore what communism, rentism, socialism and exterminism might actually entail.
Could the current rise of real-life robocops usher in a world that resembles Ender’s Game? And sure, communism will bring an end to material scarcities and inequalities of wealth—but there’s no guarantee that social hierarchies, governed by an economy of “likes,” wouldn’t rise to take their place. A whirlwind tour through science fiction, social theory and the new technologies already shaping our lives, Four Futures is a balance sheet of the socialisms we may reach if a resurgent Left is successful, and the barbarisms we may be consigned to if those movements fail."
Again, listen to Sam's podcast carefully, and at least read the online Peter Frase essay. In the Health IT world these days we're all infatuated with stuff like Watson "curing cancer" and other AI/IA "personalized medicine" proffers. But, the very real and very serious sociological issues pertaining to AI extend far beyond. The Harris-Russell podcast interview above certainly touched on every concern that has crossed my mind as I've thought about AI/IA, and then some.
Below, Sam Harris doing a TED Talk on AI.
__
UPDATE: I finished the "Four Futures" book. It is excellent. A fairly quick read. Peter Frase's concluding thoughts:
The transition to a world of abundance and equality, then, is likely to be a tumultuous and conflict-ridden one. If the rich won’t relinquish their privileges voluntarily, they would have to be expropriated by force, and such struggles can have dire consequences for both sides. For as Friedrich Nietzsche said in a famous aphorism, “Beware that, when fighting monsters, you yourself do not become a monster … for when you gaze long into the abyss, the abyss gazes also into you.” Or as the Communist poet Bertolt Brecht wrote in “To Posterity,” a revolution against a brutal system could itself brutalize those who participated in it.
Even anger against injustice
Makes the voice grow harsh. Alas, we
Who wished to lay the foundations of kindness
Could not ourselves be kind.
Or as Mao put it in his characteristic blunt style, “a revolution is not a dinner party."
In other words, even the most successful and justified revolution has losers and victims. In a 1962 letter to the economist Paul Baran, the critical theorist Herbert Marcuse remarks that “nobody ever gave a damn about the victims of history.” The remark was directed at the hypocrisy of liberals who were eager to moralize about the victims of Soviet Communism but were silent about the massive human cost of capitalism. It’s a harsh, perhaps a cruel judgment, and Marcuse himself suggests the need to move beyond it. But it provides an important perspective on the exercise I’ve undertaken here, by allowing us to see that society’s four futures don’t fit into neat moral boxes. That is one danger, that we underestimate the difficulty of the path we must traverse, or that we allow the beauty of our endpoint to license unlimited brutality along the way. But another possibility is that, at journey’s end, we forget how arduous the journey was and who we left behind.

Walter Benjamin, in his essay “On the Concept of History,” talks about the way that historical accounts necessarily tend to empathize with the victors, who are generally the ones who get to write the history. “Those who currently rule are however the heirs of all those who have ever been victorious. Empathy with the victors thus comes to benefit the current rulers every time.” But we can also say that even in a society without clear rulers, history will tend to empathize with the survivors; they are, after all, literally the only ones around to write it.

Let’s revisit, on that note, the residents of our first, communist future. Perhaps they’re not at the end of the capitalist road to communism after all, but of a much longer and darker journey through the horrors of exterminism. Remember exterminism’s central problematic: abundance and freedom from work are possible for a minority, but material limits make it impossible to extend that same way of life to everyone. At the same time, automation has rendered masses of workers superfluous. The result is a society of surveillance, repression, and incarceration, always threatening to tip over into one of outright genocide.

But suppose we stare into that abyss? What’s left when the “excess” bodies have been disposed of and the rich are finally left alone with their robots and their walled compounds? The combat drones and robot executioners could be decommissioned, the apparatus of surveillance gradually dismantled, and the remaining population could evolve past its brutal and dehumanizing war morality and settle into a life of equality and abundance—in other words, into communism.
As a descendant of Europeans in the United States, I have an idea of what that might be like. After all, I’m the beneficiary of a genocide. My society was founded on the systematic extermination of the North American continent’s original inhabitants. Today, the surviving descendants of those earliest Americans are sufficiently impoverished, small in number, and geographically isolated that most Americans can easily ignore them as they go about their lives. Occasionally the survivors force themselves onto our attention. But mostly, while we may lament the brutality of our ancestors, we don’t contemplate giving up our prosperous lives or our land. Just as Marcuse said, nobody ever gave a damn about the victims of history.
Zooming out a bit farther, then, the point is that we don’t necessarily pick one of the four futures: we could get them all, and there are paths that lead from each one to all of the others.
We have seen how exterminism becomes communism. Communism, in turn, is always subject to counterrevolution, if someone can find a way to reintroduce artificial scarcity and create a new rentist elite. Socialism is subject to this pressure even more severely, since the greater level of shared material hardship increases the impetus for some group to set itself up as the privileged elite and turn the system into an exterminist one.
But short of a civilizational collapse so complete that it cuts us off from our accumulated knowledge and plunges us into a new dark ages, it’s hard to see a road that leads back to industrial capitalism as we have known it. That is the other important point of this book. We can’t go back to the past, and we can’t even hold on to what we have now. Something new is coming— and indeed, in some way, all four futures are already here, “unevenly distributed,” in William Gibson’s phrase. It’s up to us to build the collective power to fight for the futures we want.
Again, you can begin with his antecedent Jacobin online essay, but the book puts some very nice additional meat on the bones. Some good discourse on the implications of AI/Robotics included.
My quickie Photoshop matrix rendering.
With the November 2016 election of the belligerent, vainglorious Donald Trump to the U.S. Presidency, I have to be concerned that we might soon be teetering toward Quadrant IV. I don't think there can be much doubt that the U.S. socioeconomic culture is principally in a Quadrant II mode these days. Whether we slip toward a fully predatory, tooth-and-claw Quadrant IV remains to be seen.
NOTE: also highly recommended apropos of this vein of thought, Paul Mason's bracing book Postcapitalism.
Apropos of employment projections, just in to my inbox, from The New Yorker:
...Otto, a Bay Area startup that was recently acquired by Uber, wants to automate trucking—and recently wrapped up a hundred-and-twenty-mile driverless delivery of fifty thousand cans of beer between Fort Collins and Colorado Springs. From a technological standpoint it was a jaw-dropping achievement, accompanied by predictions of improved highway safety. From the point of view of a truck driver with a mortgage and a kid in college, it was a devastating “oh, shit” moment. That one technical breakthrough puts nearly two million long-haul trucking jobs at risk. Truck driving is one of the few decent-paying jobs that doesn’t require a college diploma. Eliminating the need for truck drivers doesn’t just affect those millions of drivers; it has a ripple effect on ancillary services like gas stations, motels, and retail outlets; an entire economic ecosystem could break down.
Whether self-driving cars and trucks, drones, privatization of civic services like transportation, or dynamic pricing, all these developments embrace automation and efficiency, and abhor friction and waste. As Erik Brynjolfsson, a professor at the M.I.T. Sloan School of Management, told MIT Technology Review, “Productivity is at record levels, innovation has never been faster, and yet at the same time, we have a falling median income and we have fewer jobs. People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.” It is, he said, “the great paradox of our era.”...
Finished the Gazzaley-Rosen book. It is excellent. Put a link in my permanent right-hand links column. You should get The Distracted Mind and read it.
I find The Distracted Mind directly analogous to Dan Lieberman's also excellent book The Story of the Human Body, which I cited earlier in the year.
"Evolutionary mismatch" ailments, both physical and neurological/cognitive.
Below, another book I've begun that may have some relevant "Distracted Mind" implications.
The neuroscience of attention, despite having greatly advanced over the past few decades, remains too primitive to explain comprehensively the large-scale harvesting of attention. At most it can shed light on aspects of individual attention. But there is one thing scientists have grasped that is absolutely essential to understand about the human brain before we go any further: our incredible, magnificent power to ignore...
Among other things with respect to The Distracted Mind, I've been looking for any useful implications for clinical workflow. Apropos of that topic, the latest from Dr. Jerome Carter of EHRscience.com.
Workarounds, Disruptions, and Electronic Health Records
HITECH EHR incentives have been successful in driving EHR adoption. However, as more hospitals and practices have embraced HIT, the number of complaints of poor usability, workflow disruptions, and decreased productivity has grown. As a result, EHR systems have been one of the most important factors in bringing discussions of clinical workflow to the forefront. Of course, this does not mean that inefficient workflows did not exist prior to EHR systems, only that EHR systems provided sufficient contrast with known processes so that the differences became obvious.
Every clinical organization has policies and procedures that guide work activities. Strict adherence is rarely enforced, which gives those charged with carrying out said policies/procedures significant leeway in determining how they are done. If the form is filled out correctly, no one sweats the details. Electronic systems change this dynamic as they are supposed to “help.” As it turns out, the degree to which they help or hurt varies considerably. Some processes are more efficient, some less. Published studies on workarounds provide valuable information on how processes are affected by the presence of electronic systems…
While the research on workarounds reported in these papers is in reaction to EHR systems, its value goes beyond understanding EHR-related changes. Ultimately, the attention paid to common processes may prove to be more valuable. Why? EHR systems are designed to be patient information repositories, not clinical care assistants. As a result, supporting clinical work is seen as a data availability problem, not a process support problem. The underlying assumption is that providing data is the same thing as supporting processes. Workaround research demonstrates just how wrong this assumption is. Workarounds are workflow issues. Every workaround is an alternate path to the same goal.
Workflows consist of a series of steps and each step consumes or produces information, uses resources, and is performed by someone or something. If both groups of authors had rendered their findings in a formal process language (e.g., YAWL, BPMN, Colored Petri Nets) using acknowledged workflow patterns, their findings would have been easier to compare and possibly apply (say for software design).
As with workarounds, usability research has also exploded with EHR adoption. The most obvious usability issue with EHRs—they are designed to provide data, not support processes — remains underappreciated. Thankfully, this is slowly changing...
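Dr. Carter's framing is concrete enough to render literally: every step consumes or produces information, uses resources, and has a performer, and a workaround is simply a different step sequence that terminates at the same goal. Here's a toy sketch of that idea in Python (mine, not Dr. Carter's; the step names are hypothetical, not drawn from any published workaround study):

```python
# Carter's framing, rendered literally: each workflow step consumes or
# produces information, uses resources, and is performed by someone or
# something. A workaround is then just an alternate path to the same goal.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    performer: str                                  # someone or something
    consumes: list = field(default_factory=list)    # information/resources in
    produces: list = field(default_factory=list)    # information/resources out

# The sanctioned medication-administration path (hypothetical step names).
intended = [
    Step("scan_wristband",   "nurse", consumes=["wristband"],         produces=["patient_id"]),
    Step("scan_med_barcode", "nurse", consumes=["patient_id", "med"], produces=["match"]),
    Step("document_admin",   "EHR",   consumes=["match"],             produces=["record"]),
]

# The workaround observed when the scanner is down: same goal, alternate path.
workaround = [
    Step("type_patient_id",  "nurse", consumes=["paper_chart"],       produces=["patient_id"]),
    Step("scan_med_barcode", "nurse", consumes=["patient_id", "med"], produces=["match"]),
    Step("document_admin",   "EHR",   consumes=["match"],             produces=["record"]),
]

def goal(path):
    """The terminal artifact a workflow path produces."""
    return path[-1].produces

assert goal(intended) == goal(workaround)   # alternate paths, same goal
```

Rendered in a formal process language (YAWL, BPMN, or the like, per Carter's point), the two paths would be directly comparable artifacts rather than prose anecdotes.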
Uber-geek Chuck Webster is never at a loss for answers. From a comment he left at EHR Science last month:
Chuck Webster, MD, MSIE, MSIS (@wareflo)
October 10, 2016 at 10:07 AM

I like my definitions of workflow and workflow technology:
“I’ve looked at literally hundreds of definitions of workflow, all the way from a “series of tasks” to definitions that’d sprawl across several presentation slides. The one I’ve settled on is this:
“Workflow is a series of tasks, consuming resources, achieving goals.”
Short enough to tweet, which is why I like it, but long enough to address two important concepts: resources (costs) and goals (benefits).
So what is workflow technology? Workflow technology uses models of work to automate processes and support human workflows. These models can be understood, edited, improved, and even created, by humans who are not, themselves, programmers. These models can be executed, monitored, and even systematically improved by computer programs, variously called workflow management systems, business process management suites, and, for ad hoc workflows, case management systems.
Workflow tech, like health IT itself, is a vast and varied continent. As an industry, worldwide, it’s probably less than a tenth the size of health IT, but it’s also growing at two or three times the rate. And, as both industries grow, they increasingly overlap. Health IT increasingly represents workflows and executes them with workflow engines. Workflow tech vendors increasingly aim at healthcare to sell a wide variety of workflow solutions, from embeddable workflow engines to sprawling business process management suites. Workflow vendors strenuously compete and debate on finer points of philosophy about how best to automate and support work. Many of these finer points are directly relevant to workflow problems plaguing healthcare and health IT.
Why is workflow tech important to health IT? Because it can do what is missing, but sorely needed, in traditional health IT, including electronic health records (EHRs). Most EHRs and health IT systems essentially hard-code workflow. By “hard code” I mean that any series of tasks is implicitly represented by Java and C# and MUMPS if-then and case statements. Changes to workflow require changes to underlying code. This requires programmers who understand Java and C# and MUMPS. Changes cause errors...
Process-aware tech, in comparison to hardcoded workflows, is an architectural paradigm shift for health IT. It has far reaching implications for interoperability, usability, safety, and population health.
BPM systems are ideal candidates to tie together disparate systems and technologies. Users experience more usable workflows because workflows are represented so humans can understand and change them. Process-aware information systems are safer for many reasons, but particularly because they can represent and compensate for the interruptions that cause so many medical errors. Finally, BPM platforms are the right platforms to tie together accountable care organization IT systems and to drive specific, appropriate, timely action to provider and patient point-of-care.”
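An aside, to make Chuck's "hard-coded vs. process-aware" distinction concrete. Below is a deliberately toy sketch (mine, not his, and not any actual EHR's code): the first version buries the task sequence in if/then control flow, so changing the workflow means changing and redeploying source code; the second expresses the same workflow as a data model that a generic engine executes, and the model is something a trained non-programmer could inspect and edit.

```python
# 1. "Hard-coded" workflow: the task sequence lives in if/then statements.
#    Changing the workflow requires a programmer and a redeploy.
def discharge_hardcoded(patient):
    if patient["meds_reconciled"]:
        print("print discharge summary")
    else:
        print("route to pharmacist")

# 2. "Process-aware" workflow: the same sequence lives in an editable model
#    that a generic engine walks. (Toy model; real BPM suites use BPMN etc.)
DISCHARGE_MODEL = {
    "start":      {"task": "check meds reconciled",
                   "yes": "summary", "no": "pharmacist"},
    "summary":    {"task": "print discharge summary"},
    "pharmacist": {"task": "route to pharmacist"},
}

def run(model, patient, node="start"):
    step = model[node]
    print(step["task"])
    if "yes" in step:   # branch on patient state, as the model directs
        run(model, patient, step["yes"] if patient["meds_reconciled"] else step["no"])

run(DISCHARGE_MODEL, {"meds_reconciled": False})
```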
“Workflow is a series of tasks, consuming resources, achieving goals.”
Ummm... "blinding glimpse of the obvious?"
One of my old summary WKFL takes here, from my MU days with HealthInsight: "Workflow Demystified" (pdf).
BTW, I have to take a bit of issue with this Chuck Webster assertion from above:
Most EHRs and health IT systems essentially hard-code workflow. By “hard code” I mean that any series of tasks is implicitly represented by Java and C# and MUMPS if-then and case statements. Changes to workflow require changes to underlying code.
That may be true for some really old "legacy" EHRs, but it's a bit of an overstatement in general. Yes, technically, if you want to change the layout order of appearance (or other aesthetic attributes) of various data cells on any given EHR tab or template, you might have to go "under the hood" and re-write the source code (such was the case in my ancient lab apps coding days in Oak Ridge in the '80s).
Ugh. Wrote about that here (pdf), published in the EPA Conference Proceedings and the journal Radioactivity and Radiochemistry.
You"might have to" (alter source code). Newer software systems of all types now typically have built-in click/drag-and-drop/cut & paste functionality wherein the user authoring/editing interface serves as a "source code generator" (like, well, this very blogger.com authoring app I use for this and other blog postings. And, yeah. sometimes I have to go in and edit the html code here when something doesn't present as intended, given the bugs in the platform. But, rarely).
Beyond that, the actual workflows, in terms of getting a user to the intended data entry/review/update targets, are largely independent of the exact placement of individual data cells on any given screen.
To this point, once the end-user is adequately trained up past the "learning curve" on a given system, the precise screen location of individual data cells becomes a fairly trivial facet of workflow relative to the resource consumption of getting there, entering data, and moving on.
When I was working in Meaningful Use at my REC, every ONC-certified EHR system (of the 15 or so in my personal MU client caseload, anyway) had multiple keystroke/mouse-click paths via which to direct the user to each MU documentation compliance criterion. The workflow modification task, then, was that of minimizing path travel, greasing the workflow skids.
And, I used to grouse that all repetitive MU data destination paths (there aren't that many) could/should be "one-stroke" -- i.e., via a set of "macros" (common to virtually all major modern commercial software). In fact, I'd have made one-stroke/click MU target functionality a condition of EHR MU Certification.
Anyone recall the old Windows "recorder.exe" utility? Microsoft killed it; nowadays you have to buy (relatively inexpensive) 3rd party macro utility apps. Still, given the workflow enhancement functionality, they're worth it in any "productivity treadmill" environment, particularly those of the irreducibly high cognitive burden medical clinic and hospital.
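For what it's worth, rolling your own one-stroke macro is nearly trivial these days. A hypothetical sketch using the (real) cross-platform Python automation library pyautogui; the keystroke path below is invented for illustration and corresponds to no actual EHR:

```python
# A toy "one-stroke macro" in the spirit of the old Windows recorder.exe.
# pyautogui is a real GUI-automation library; the recorded "path" to a
# hypothetical MU documentation target below is entirely made up.
import pyautogui

SMOKING_STATUS_MACRO = [
    ("hotkey", ("alt", "p")),        # open the (hypothetical) Patient menu
    ("press", "s"),                  # jump to Social History
    ("typewrite", "never smoker"),   # enter the structured value
    ("press", "enter"),              # commit the entry
]

def play(macro, pause=0.25):
    """Replay a recorded action sequence, pausing between steps."""
    pyautogui.PAUSE = pause          # global delay between pyautogui calls
    for action, arg in macro:
        if action == "hotkey":
            pyautogui.hotkey(*arg)
        elif action == "press":
            pyautogui.press(arg)
        elif action == "typewrite":
            pyautogui.typewrite(arg)

play(SMOKING_STATUS_MACRO)           # one call = one "stroke" to the target
```

Bind that to a single hotkey with any of the aforementioned third-party macro utilities, and there's your "one-stroke" MU path.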
__
BACK TO THE DISTRACTED MIND
The Gazzaley-Rosen book comprises three thematic sections spanning its eleven chapters: [1] what neuroscience has learned about the evolution and functionality of the brain, including the central adaptive elements of cognition; [2] the increasingly adverse impacts of our 24/7 (mostly now digital) information-overload ("information foraging") first-world culture; and [3] what, if anything, can be done to mitigate or otherwise counter those adverse impacts.
The Distracted Mind is not a pseudo-science book that offers colorful brain scans and questionable neuroscience as a way of making a topic appear more credible. In this book, we apply our complementary scientific lenses to present timely and practical insights. Dr. Adam Gazzaley is a cognitive neuroscientist and a trailblazer in the study of how the brain manages distractions and interruptions. Dr. Larry Rosen is a psychologist who has studied the “psychology of technology” as a pioneer in this field for more than thirty years. Our complementary perspectives focus on demonstrating why we fail to successfully navigate our modern technological ecosystem and how that has detrimentally affected our safety, cognition, education, workplace, and our relationships with family and friends. We enrich this discussion with our own research and scientific hypotheses, as well as views of other scholars in the field, to explain why our brains struggle to keep up with demands of communication and information.
We present our perspectives in three parts. In Part I, we will take you on a tour through new insights into why our “interference dilemma” exists in the first place and why it has become so relevant to us now. We describe how the very essence of what has evolved furthest in our brains to make us human—our ability to set high-level goals for ourselves—collides headfirst with our brain’s fundamental limitations in cognitive control: attention, working memory, and goal management. This collision results in our extreme sensitivity to goal interference from both distractions by irrelevant information and interruptions by attempted multitasking. This noise degrades our perceptions, influences our language, hinders effective decision making, and derails our ability to capture and recall detailed memories of life events. The negative impact is even greater for those of us with undeveloped or impaired cognitive control, such as children, teens, and older adults as well as many clinical populations. We further discuss why we engage in high-interference-inducing behaviors from an evolutionary perspective, such that we are merely acting in an optimal manner to satisfy our innate drive as information-seeking creatures.
In Part II, we will share a careful assessment of our real-world behaviors and demonstrate how the collision described in Part I has been intensified by our constant immersion with the rich landscape of modern information technology. People do not sit and enjoy a meal with friends and family without checking their phones constantly. We no longer stand idle in waiting lines, immersed in thought or interacting with those next to us. Instead, we stare face down into virtual worlds beckoning us through our smartphones. We find ourselves dividing our limited attention across complex demands that often deserve sustained, singular focus and deeper thought. We will share our views of why we behave in such a manner, even if we are aware of its detrimental effects. Building a new model inspired by optimal foraging theory we explain how our high-tech world perpetuates this behavior by offering us greater accessibility to feed our instinctive drive for information as well as influencing powerful internal factors, such as boredom and anxiety. We are most certainly ancient brains in a high-tech world.
Finally, in Part III we offer our perspectives on how we can change our brains to make them more resilient, as well as how we can change our behavior via strategies to allow us to thrive in all areas of our lives. We first explore the full landscape of potential approaches available to us—from the low-tech to the high-tech—that harness our brain’s plasticity to strengthen our Distracted Mind. This in-depth examination includes traditional education, cognitive training, video games, pharmaceuticals, physical exercise, meditation, nature exposure, neurofeedback, and brain stimulation, illustrating how in these fascinating times the same technologies that aggravate the Distracted Mind can be flipped around to offer remediation. We then share advice on what we can do from a strategic perspective to modify our behavior, without abandoning modern technology, such that we minimize the negative consequences of having a Distracted Mind. Using the optimal foraging model introduced earlier in the book as a framework to approach behavioral change, all of the strategies we offer are practical and backed by solid science.
The Distracted Mind will enlighten you as to how and why our brains struggle to manage a constantly surging river of information in a world of unending interruptions and enticements to switch our focus. We extend this perspective to explore the consequences of this overload on our ability to function successfully in our personal lives, on the road, in classrooms, and in the workplace, and address why it is that we behave in this way. Critically, we provide solid, down-to-earth advice on what we need to do in order to survive and thrive in the information age.
Gazzaley, Adam; Rosen, Larry D. (2016-09-16). The Distracted Mind: Ancient Brains in a High-Tech World, locations 117-151 (MIT Press). The MIT Press. Kindle Edition.
Distracted Mind stuff that goes specifically to the workplace and "workflow" -
WORKPLACE INTERFERENCE

For those of us who work with technology and are surrounded by other employees working with their technologies, interference has become the norm. We are constantly interrupted by others dropping by our desk to chat or attempting to connect with us through a variety of technological communication modalities, including the most popular workplace tool—email. A study by Judy Wajcman, a sociology professor at the London School of Economics, highlighted this phenomenon by shadowing eighteen employees of an Australian telecommunications company during their entire workday. Wajcman selected this company because it was designed to facilitate interactions between workers with open-plan offices and other external distractors, including many large television screens mounted around the office. The employees in this study spent only half their workday on actual “work episodes,” which included any and all work-related activities. Strikingly, most of these work episodes lasted ten minutes or less, with an average of just three minutes per work episode. And even more interesting, nearly two-thirds of the work episode interruptions were self-generated, and most of those involved some form of mediated communication using a technological device. In fact, of the approximately eighty-six daily changes in an employee’s work activity, the workers themselves generated sixty-five of them internally, with the vast majority involving “checking in” with no obvious external alert or notification. Even without the “You’ve Got Mail” notification, these workers checked their email anyway and continued to check other sources of electronic communication and information without being externally directed to do so.
Whether directed externally via an alert or notification or internally by an unseen process, it appears that in the work environment email and other communication modalities bear a major responsibility for interruptions. One field study that followed workers for two weeks discovered that they were interrupted 4.28 times per hour by email and an additional 3.21 times by instant message communications. And these communications appeared to have a strong draw for the employees, since 41 percent of them responded to the email immediately and 71 percent responded to an instant message immediately. On average, the workers spent ten minutes dealing with the alerts and then took an additional ten to fifteen minutes to return to their appointed task, often visiting several other applications in the interim. Another study by the research group ClearContext indicated that more than half of the 250 workers they queried spent over two hours a day reading and responding to email. A study out of Loughborough University in England found that after dealing with an email, which itself took an average of just under two minutes, it took the studied workers an average of 68 seconds—more than half of the time required to read and respond to that email—to return to their work and remember what they were doing. This study also found that people are responding like Pavlov’s dogs to incoming email communication, waiting only an average of one minute and forty-four seconds to open that message. Strikingly, 70 percent of those alerts were attended to within six seconds, which is about the time it takes a phone to ring three times. And yet another study found that even without an alert, while one in three people claimed to check their email every fifteen minutes, they actually checked it about every five minutes. We are self-interrupting and not even aware of how often we are diverting our attention from our main task—in this case, our job—to another task that may be completely unrelated to work... [ibid, pp. 113-115]
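Just to put a crude number on that, some back-of-envelope arithmetic of my own, using only the figures quoted above, counting nothing but the 68-second "re-find your place" cost and (loosely) applying that email refocus figure to IM interruptions as well:

```python
# Back-of-envelope cost of refocusing alone, from the figures quoted above.
email_interruptions_per_hr = 4.28   # field study: email interruptions/hour
im_interruptions_per_hr = 3.21      # instant-message interruptions/hour
refocus_seconds = 68                # Loughborough: time to re-engage after an email
workday_hours = 8

per_day = (email_interruptions_per_hr + im_interruptions_per_hr) * workday_hours
minutes_lost = per_day * refocus_seconds / 60

print(f"{per_day:.0f} interruptions per day")                              # ~60
print(f"~{minutes_lost:.0f} minutes/day lost just re-finding your place")  # ~68
```

Roughly an hour a day evaporates before you even count the ten-plus minutes of actual "dealing with it" time the other studies report.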
We all mostly harbor a naive conceit that we are adroit "multitaskers." The science says otherwise; reading this book should change our minds.
All very interesting, all of it. I would make this book required reading for Health IT "UCD/UX" designers (the "User Centered Design" / "User Experience" peeps). I would also add clinical management to the "required readers" list, given that the implications go squarely to policy, not just InfoTech UX design. A progressive, workflow-friendly EHR that comes out of the development oven only to get deployed in a high-interruption/distraction work environment may well have its slick UX negated anyway.
Another aspect to bear in mind. It has become widely fashionable these days to wax rhapsodic over our putatively increasingly "post-EHR" world, a world dominated by "mHealth" (mobile health apps), with patients busily burying their docs with user-generated metrics (of wildly variant SOAP utility and data QA pedigrees) and random patient queries.
More interruptions and distractions (beyond the intractable "interoperability" issues). Just what clinicians need and want.
From the closing chapter of The Distracted Mind:
11 MODIFYING BEHAVIOR

IT SHOULD NOW be abundantly clear that we live amidst a level of high-tech interference that in the past decade or so has dramatically changed the world, and along with it our thoughts, feelings, and behaviors. In Part II we explored the many ways in which modern technology has aggravated our Distracted Mind; from awakening in the morning until trying to fall asleep at night, we are tempted by technological distractions and interruptions. As we have shown, three main game changers—the Internet, smartphones, and social media—have forever altered our mental landscape. We have painted a detailed picture based on solid research from a variety of fields showing that we are spending our days switching from task to task and affording each only our divided attention.
Recall the cognitive control limitations that we presented in chapter 5 in the domains of attention (selectivity, distribution, sustainability, processing speed), working memory (capacity, fidelity), and goal management (multitasking, task switching). As described, high-tech influences stress these limitations in just about every possible way: they challenge our attention abilities via frequent distractions, fragment our working memory and diminish its fidelity through interruptions, and drive us to excessive multitasking and task switching, all of which introduce performance costs. In terms of the MVT model, introduced in Part I and elucidated in chapter 9, modern technology has caused this by diminishing the time in which we remain engaged with an information source, causing us to shift to another patch before we have exhausted the information in our current source. We are like a squirrel with an attention disorder, constantly jumping from tree to tree, sampling a few tasty morsels and leaving many more behind as he jumps to the next tree, and the next and the next. It sounds exhausting, and, as we have shown, it is negatively affecting our safety, relationships, school and job performance, and mental health... [ibid, pp. 213-214]
CODA
From the final passages of "The Attention Merchants."
The past half century has been an age of unprecedented individualism, allowing us to live in all sorts of ways that were not possible before. The power we have been given to construct our attentional lives is an underappreciated example. Even while waiting for the dentist, we have the world at our fingertips: we can check mail, browse our favorite sites, play games, and watch movies, where once we had to content ourselves with a stack of old magazines. But with the new horizon of possibilities has also come the erosion of private life’s perimeter. And so it is a bit of a paradox that in having so thoroughly individualized our attentional lives we should wind up being less ourselves and more in thrall to our various media and devices. Without express consent, most of us have passively opened ourselves up to the commercial exploitation of our attention just about anywhere and anytime. If there is to be some scheme of zoning to stem this sprawl, it will need to be mostly an act of will on the part of the individual.
What is called for might be termed a human reclamation project... [Wu, Tim op cit, location 6471]
This Election Had Medical Consequences—And I Gave Them a Name
I’m a sleep disorder specialist, and many of my patients couldn’t sleep because of the election—and the anxieties that underlay it.
Qanta Ahmed

For months now, I have left the exam room after a patient visit feeling unsettled, not by my patient’s clinical challenges but by the suffering of my fellow American. Dozens of my patients report fear, sleeplessness, dread, worry and dejection over “America,” “the future,” “what’s coming next?” or, in these past days, what happens “after Election Day.” Some are felled by the shock of the Trump presidency that will soon be a reality, and many have spent long moments sobbing, heartbroken in my office. Countless are torn at the state of the country and at having no choice. Noticing this so often these past months, I came up with a name for these symptoms: Election Dysthymia...
Interesting. She cites this book:
Anthropologists Hugh Gusterson and Catherine Besteman edited a fascinating volume of essays, The Insecure American: How We Got Here and What We Should Do About It. The book, with an eloquent foreword by Barbara Ehrenreich, describes how Americans once believed themselves intrepid. But defeat in Vietnam, the energy crisis, our undeniable economic decline and wage stagnation changed all that. Reading the book rings frighteningly true to me; my patients embody many of these essays. Every day I meet patients working sixteen-hour days or longer, married couples holding down four, sometimes five, poorly paid jobs between them, struggling to make ends meet. And, after almost twenty-five years treating Americans, I am struck by their escalating poverty, both financial and in quality of life, despite their extraordinary work ethic, an ethic which all too often costs them their health...
"The book, with an eloquent forward by Barbara Ehrenreich..."
to wit,
Fifty or sixty years ago, the word insecurity most commonly referred to a psychological condition. Some people suffered from “insecurities”; otherwise, though, Americans were self-confident to the point of cockiness. Public intellectuals worried over the “problem” of affluence, which was believed to be making us too soft and contented. They held forums to consider the growing challenge of leisure, never imagining that their own children and grandchildren would become accustomed to ten-hour workdays. Yes, there remained a few “social problems” for sociologists to study—poverty, which was “discovered” by the nonpoor in the early sixties, and racial inequality—but it was believed that these would yield easily to enlightened policies. We were so self-confident that Earth itself no longer seemed to offer sufficient outlets for our energy and ambition. We embarked on the exploration of space.
It was at some point in the late 1960s or early 1970s that Americans began their decline from intrepid to insecure. The year 1969 brought the revelation of the massacre at My Lai and the certainty that the Vietnam War would end in disgrace as well as defeat. At the same time, the war was draining federal funds from Lyndon Johnson’s Great Society programs, vitiating health services and hundreds of community development projects. Then 1970 saw the first national observance of Earth Day and the dawning awareness that our environmental problems went beyond scattered cases of “pollution.” For the first time since Malthus, the possibility was raised that we might someday exhaust the resources required to maintain America’s profligate consumer culture.
American business, beginning with the auto industry, woke up, in the 1970s, to the threat of international competition and initiated its long campaign to reduce both wages and the number of American workers. By the 1980s, big business had started the dismantling of American manufacturing—sending the factories overseas and destroying millions of unionized blue-collar jobs. The white-collar workforce discovered that even they were no longer safe from the corporate winnowing process. In the old version of the American dream, a college graduate was more or less guaranteed a middle-class lifestyle.
In the emerging version, there were no guarantees at all. People were encouraged to abandon the idea of job security and take on the project of “reinventing” themselves over and over, as the fickle job market required—to see themselves as perpetual salespeople, marketing “the brand called you.”
Meanwhile, under both Ronald Reagan and Bill Clinton, the old confidence that we could mobilize collectively to solve social problems like poverty and racial exclusion was replaced by a growing mean-spiritedness toward the unlucky, the underpaid, and the unwanted. The war on poverty gave way to a war on crime, and when there were not enough crimes to justify this massive punitive enterprise, the authorities invented new ones—like the “crime” of drug possession and use. America achieved the embarrassing distinction of having the highest proportion of its citizenry incarcerated, surpassing both Russia and South Africa under apartheid.
Even into the new millennium, which brought the threat of terrorism and the certainty of global warming, we held our insecurities at bay with a combination of scapegoating, distraction, and delusion. Gays and illegal immigrants became our designated scapegoats, regularly excoriated by evangelists and cable news anchormen. War was at least a temporary distraction, even though it was the greatest non sequitur in military history: attacked by a group consisting largely of Saudi Arabians, the United States invaded Iraq. And then, at the personal level, there was the illusion of affluence offered by easy credit. If our jobs no longer paid enough to finance anything resembling the American dream of home ownership and college for the children, we could always borrow—take on a dodgy mortgage, refinance the house, sign up for more credit cards.
But distraction and delusion are not long-term cures for underlying anxiety. This book comes out at a time when more and more Americans are tumbling from insecurity into insolvency—bankrupted by medical debts, made homeless by foreclosure, ousted from their jobs by layoffs. The credit crisis that began in 2007, combined with stunning increases in the cost of fuel and ever-growing economic inequality, has created challenges not seen since the eve of the Great Depression. As I write this, the overwhelming majority of Americans believe that the country is “headed in the wrong direction” and fear that they will be the first generation to see their children live in more straitened circumstances than they have known.
The Insecure American would have been essential reading at any time in the last few years, but today it is indispensable. For the most part, we confront problems and issues only as they arise in the news cycle, taking them from sources usually short on facts and devoid of analysis. In contrast, the contributors to this book have been researching and thinking about their subjects—from militarism to health care, from foreign policy to poverty—for years. Many are academics who teach as well as write, and here they offer a powerful overarching lesson in clear and down-to-earth prose: that we can understand the forces that have robbed us of security, and—through understanding, combined with a renewed commitment to collective action—overcome them.
Barbara Ehrenreich
I extracted that from the Amazon "Look Inside" preview, which I also captured in full as a PDF. A fairly generous sample. One well worth your time. Notwithstanding that the book has a 2010 copyright date, it rings true to this election year and month. The takeaway is that "it's not like we couldn't see this coming."
"...more and more Americans are tumbling from insecurity into insolvency—bankrupted by medical debts, made homeless by foreclosure, ousted from their jobs by layoffs. The credit crisis that began in 2007, combined with stunning increases in the cost of fuel and ever-growing economic inequality, has created challenges not seen since the eve of the Great Depression..."
Yeah. I had a nano-role in the run-up to the FIRE sector crash of 2008, owing to my 2000-2005 tenure working in subprime risk analytics. Wrote about those experiences on another of my blogs. See "Tranche Warfare" and "The Dukes of Moral Hazard."
More Qanta Ahmed:
I am a sleep disorders specialist. People come to see me when they have trouble sleeping. Many of my patients have mental health issues that improve greatly when I treat their sleep disorders. While all my patients see me because they have sleep disorders, the intensity, the depravity and the relentlessness of the 2016 election cycle have resulted in an additionally corrosive assault on my patients...
I am getting close to finishing this book, which also delves into sleep dysfunction (mostly in the context of the adverse impact of 24/7 digital InfoTech obsessions).
Most of us will freely admit that we are obsessed with our devices. We pride ourselves on our ability to multitask -- read work email, reply to a text, check Facebook, watch a video clip. Talk on the phone, send a text, drive a car. Enjoy family dinner with a glowing smartphone next to our plates. We can do it all, 24/7! Never mind the errors in the email, the near-miss on the road, and the unheard conversation at the table. In The Distracted Mind, Adam Gazzaley and Larry Rosen -- a neuroscientist and a psychologist -- explain why our brains aren't built for multitasking, and suggest better ways to live in a high-tech world without giving up our modern technology. The authors explain that our brains are limited in their ability to pay attention. We don't really multitask but rather switch rapidly between tasks. Distractions and interruptions, often technology-related -- referred to by the authors as "interference" -- collide with our goal-setting abilities. We want to finish this paper/spreadsheet/sentence, but our phone signals an incoming message and we drop everything. Even without an alert, we decide that we "must" check in on social media immediately... - from the Amazon blurb
Will be reviewing it soon. For now a little snippet on sleep:
Research has also demonstrated that getting too little sleep can disturb memory in important ways. Of course, memory is a complex process that involves multiple areas of the brain, but one key component to memory is a solid cognitive control system without which the information would never get transmitted effectively or completely to memory centers such as the hippocampus. One study found that adults who routinely slept less than five hours a night were more likely to incorporate misinformation in their morning report of either photos or videos that they had observed before bedtime; some even reported that they had seen video footage of an event that never happened.
Gazzaley, Adam; Rosen, Larry D. (2016-09-16). The Distracted Mind: Ancient Brains in a High-Tech World (MIT Press) (p. 140). The MIT Press. Kindle Edition.
"...reported that they had seen video footage of an event that never happened."
Anyone come to mind here? Say, a President-elect who prides himself on his manly 4 hours of nightly sleep?
BTW: There are implications for Health IT UX in The Distracted Mind. Stay tuned.
UPDATE
Excellent new post up on THCB:
The Age of Trumpian Uncertainty

The new Chief Executive Officer of the United States of America Inc. will take office January 20th and likely make good on his promise to repeal the Patient Protection and Affordable Care Act. It only requires a majority in both houses of Congress to pass and that’s assured based on the election results last week...
Should healthcare in the United States be approached as a fundamental right or a privilege? In the constitution, life, liberty and the pursuit of happiness are guaranteed. Is access to a healthcare system a fundamental right for all in this country or is it reserved for those who can afford its services? In our system of government, we’ve concluded that education is a fundamental right for all. Is healthcare akin or different? And what’s included in that assurance: basic services for all, or exactly what?...
Nearly seventy per cent of working-age Americans lack a bachelor’s degree. Many of them saw an establishment of politicians, professors, and corporations that has failed to offer, or even to seem very interested in, a vision of the modern world that provides them with a meaningful place of respect and worth... As the new Administration turns to governing, the mismatch between its proffered solutions and our aspirations and ideals must be made apparent. Take health care. Eliminating Obamacare isn’t going to stop the unnerving rise in families’ health-care costs; it will worsen it. There are only two ways to assure people that if they get cancer or diabetes (or pregnant) they can afford the care they need: a single-payer system or a heavily regulated private one, with the kind of mandates, exchanges, and subsidies that Obama signed into law. The governor of Kentucky, Matt Bevin, was elected last year on a promise to dismantle Obamacare—only to stall when he found out that doing so would harm many of those who elected him. Republicans have talked of creating high-risk insurance pools and loosening state regulations, but neither tactic would do much to help the people who have been left out, like Jim Young’s family. If the G.O.P. sticks to its “repeal and replace” pledge, it will probably end Obama’s exchanges and subsidies, and embrace large Medicaid grants to the states—laying the groundwork, ironically, for single-payer government coverage...
Those of us who opposed and voted against Donald Trump are now members of the "loyal opposition." But, it is critical to continue to focus on that to which we are properly to be "loyal." The Constitution, yes. But more importantly, the ideals of equal justice for which it ostensibly stands.
Loyalty to the Constitution presupposes loyalty to the office of President it sets forth in Article II. But, that is a separate consideration from unreflective, unquestioning deference to the person who holds that office at any given time irrespective of how he (and, still a "he") behaves once in power.
To the extent that the new President Trump comports himself within the established confines of the Constitution, fine. We will continue to have the First Amendment right to peacefully disagree vocally with specific policy proposals and actions (which always happens, and is to be expected in a free society), but Mr. Trump should never be permitted to forget that loyalty is a two-way street.
It may become necessary for someone to read the Declaration of Independence to him, given the voluminous record of autocratic, belligerent statements he has left us with thus far. That perhaps he doesn't actually believe all that crass, incendiary campaign shit is no less of a cause for concern.
Going to be an interesting year, my friends.
Also, Trump has made opaque pander-to-the-Fundies allusions regarding having Roe v. Wade overturned.
With respect to human reproduction (in particular the "life begins at conception" canard), the contribution of the male begins and ends with the sperm's successful delivery of the polymer molecules comprising ONE copy of the male's DNA that constitute his 23 chromosomes. Everything that takes place thereafter, starting with the ensuing division process begun by the single-cell, sperm-fertilized ovum, is a function of the female's gestational biology: all of the subsequent biological reproductive effort, and all of the medical risk.
The assertions that [1] a fertilized ovum is instantly a "person," and [2] men should be able to declare for themselves "equal reproductive rights" tantamount to a veto over what a woman does with her body once pregnant, are among the most ignorant and arrogant things I've ever heard. It has nothing to do with "reverence for the sanctity of life"; it has everything to do with reverence for dominant male power. Period.
Moreover, take men out of the picture, just to advance the argument a step further. The notion that some women should be able to use the force of law to deny other women the right to control their biology is equally specious. For starters, it violates the Equal Protection clause of the 14th Amendment.
Don't like abortion? Don't have one.
Then there's the issue of "dismantling regulations."
Consistent with his broad ignorance, Trump has no clue regarding "regulations." He claims he will just get rid of 75 to 80% of all federal regulations. First of all (beyond the fact that doing so is the purview of Congress, not the President), an apt analogy to law and regulation is the corporate "policies and procedures" we all likewise hate.
Laws tell us "what" and "why," and subsequent regulations tell us "who," "how," and "when" (the operational details). Regulations at the federal level are published in draft as Federal Register proposals, which undergo a prolonged period of public review and comment before they are finalized. Moreover, they cannot exceed the scope of the parent law. To the extent they do they are very quickly challenged in court and either overturned or scaled back. The various stakeholder groups are vigilant with respect to proposed regulations. We may not like the volume and complexity of them, but that's simply a function of the way legislation works in a nation of 330 million people.
I have worked in highly regulated environments for 30 years: EPA, NRC, OSHA, OCC, FDIC, and HHS. In my last job, I was the team lead on our staff for writing policies and procedures compliant with the HIPAA medical privacy and security regulations covering our statewide Nevada Health Information Exchange. I know a thing or two about how this stuff works.
It's maddeningly tedious and complicated and imperfect. But, human affairs get regulated one way or another.
You want polluted water and air, toxic drugs, unsafe vehicles, poisoned foods, and routine airplane crashes?
Fine, do away with regulations.
POST-ELECTION PHOTOJOURNALISM UPDATE
The first new job created by the incoming Trump® Administration.
"We
have to come up, and we can come up with many different plans. In fact,
plans you don't even know about will be devised because we're going to
come up with plans, -- health care plans -- that will be so good. And so
much less expensive both for the country and for the people. And so
much better.”
Well, the Republicans will have control of the Presidency, Senate, and House beginning in January. Will they actually repeal "ObamaCare"? And, if so, what, if anything, will replace it?
Trump upset will force healthcare leaders to rethink the future
By Harris Meyer, November 9, 2016

Republican Donald Trump's shocking victory Tuesday will force a major shift in the healthcare industry's thinking about its future. Combined with the GOP's retention of control of the Senate and the House, a Trump presidency enables conservatives to repeal or roll back the Affordable Care Act and implement at least some of the proposals outlined in the GOP party platform and the recent House Republican leadership white paper on healthcare.
Addressing supporters just before 3 a.m. ET, Trump struck a conciliatory tone and did not specifically mention the ACA. “It is time for us to come together as one united people,” he said. “It's time.”
But the assumption of Republican control over both the White House and Congress most likely means an end to the expansion of Medicaid to the 19 states that have not yet implemented it, and puts the expansion in the other 31 states in serious jeopardy.
But there are divisions even among conservatives over issues such as Medicare restructuring and how to help Americans afford health insurance. And Senate Democrats almost certainly would try to use their filibuster power to block major ACA changes...
The election results are in and Donald Trump will be the 45th president of the United States. His appeal to “Make America Great Again” resonated across the heartland, sparking an unprecedented political upset that surprised even the most astute prognosticators and pundits.
When he takes office in January, he’ll face enormous challenges domestically and globally. Healthcare will be at the top of the list: he promised to “Repeal and Replace” the Affordable Care Act, and he pledged changes that strengthen the system in his campaign’s seven-point plan. In this effort, his team will face harsh realities...
What Will Republicans Do about Obamacare Now?
by CHRIS JACOBS, November 9, 2016

They’ve campaigned against the health-care law for years. Now that they’ve won the presidency and Congress, the path to repeal and replace is still far from clear.
For the past six years, Republicans — across Washington, and across the country — have virtually to a person run against Obamacare. Their campaign has helped them win numerous House and Senate seats, a majority of governorships, and now has given them unified control of Washington for the first time in 15 years. Like the dog that finally caught the proverbial car, Republicans will wake up Wednesday morning asking themselves — on Obamacare, as on many other issues — “What now?”
The answer might be less obvious than it first appears. Democrats used the decade and a half between the defeat of Hillary Clinton’s health plan in 1993–94 and the 2008 election to develop a consensus architecture about what their ideal health-care plan would look like. In the Democratic primaries that year, Senators Clinton and Obama disagreed strongly on the necessity of an individual mandate to purchase coverage — a difference they litigated very publicly, and at great length, during the primary campaign — but agreed on virtually everything else.
By contrast, Republicans spent comparatively little time debating the finer points of an Obamacare alternative during the presidential cycle just concluded. Donald Trump promised “something terrific” that would tear down “the lines around the states,” but details were few and far between (and occasionally self-contradictory). Speaker Ryan’s House Republican task force produced a plan, but one with few fiscal details attached, and one that few in Washington — whether media analysts or policy-makers themselves — spent time dissecting or debating...
Some thoughts the day after the election
November 9, 2016, Aaron Carroll

This is going to be long, but I hope you’ll bear with me. I have a lot on my mind, and I have some thoughts I’d like to share with you. Clearly last night was a shock for many, many people. I’m being bombarded with questions from family, friends, and people I don’t know about what will happen, both with health care and America at large. Rather than try and handle those queries piecemeal, I’d rather make use of the blog – something that’s been hard to do during the election.
I have written more than once on this blog that “we do not strut” when things go our way. We don’t “celebrate”. We acknowledge the policy gain and then we get back to work trying to make the health care system of the United States better than it is now.
The same still holds when things don’t go our way. We don’t sulk. We get back to work trying to make the health care system better...
Five disgruntled Republican physicians and one nurse warmed up the conference room at a Hilton DoubleTree hotel in Pennsylvania yesterday. All were members of the U.S. Congress, gathered for a rally that Donald Trump would call “a meeting talking about health care.”
On the day that health-insurance enrollment began for 2017, the Trump campaign chose to keep the focus on medicine. Health-insurance premiums will be increasing in exchanges for some two to five percent of Americans this year. The U.S. has the most expensive health-care system in the world and needs a great deal of improvement. The candidate has been hostile toward the current health-care system and toward Hillary Clinton’s proposals, but has offered almost nothing in the way of solutions. Perhaps today was the day for a plan.
First to the mic stepped John Barrasso, an orthopedic surgeon from Wyoming. “People need a better way, and that is why we are here,” he said, setting the stage. “There is a better way than Obamacare.” He said that he believes that “government is the problem” and that Hillary Clinton is proposing “more Obamacare,” though it was unclear what part of the system he was referring to. “Republicans are here today to offer solutions,” he said...
And, from STATnews:
What does Donald Trump’s win mean for science and medicine?
By DAMIAN GARDE (@damiangarde), November 9, 2016

It’s hard to escape the din of the nation’s prognostication industry coming to terms with its wrongness today, with so many teeth to gnash and garments to rend.
But here in the world of science and medicine, the election of Donald Trump has left many trying to make sense of the vagaries, reversals, and red herrings that have marked his rhetoric on key issues from research funding to drug pricing...
I hate to interrupt the festivities, but I have a few questions. There are one or two little unknowns here. The answers to these questions are matters of life and death to many in the industry, literal life and death to many thousands of patients, organizational life and death to thousands of companies, hospitals and systems.
Tuesday’s extraordinary events obviously present an enormous challenge for anyone who wants to think about the future of healthcare. The challenge is far more than simply trying to imagine the healthcare industry without Obamacare, or under whatever Trumpcare will turn out to be. A much more powerful effect will come into play far earlier: the uncertainty over that future will reshape the industry before we even get to the actual “repeal and replace” part...
"The tech industry’s relationship with government — not to mention the public — looks bound to shift in a fundamental way." During the Obama years, Silicon Valley came to see itself as the economic and social engine of a new digital century. Smartphones and social networks became as important to world business as oil and the automobile, and Amazon, Apple, Facebook, Google and Microsoft rose to become some of the most prosperous and valuable companies on the planet.
Mr. Obama, who rode many of these digital tools to the presidency, was accommodative of their rise; his administration broadly deferred to the tech industry in a way that bordered on coziness, and many of his former lieutenants have decamped to positions in tech.
Mr. Trump’s win promises to rip apart that relationship. The incoming president had few kind words for tech giants during the interminable campaign that led to his victory. Mr. Trump promised to initiate antitrust actions against Amazon, repeatedly vowed to force Apple to make its products in the United States, and then called for a boycott of the company when it challenged the government’s order to unlock a terrorist’s iPhone. Mr. Trump’s immigration plans are anathema to just about every company in tech...