NOW REPORTING FROM BALTIMORE. An eclectic, iconoclastic, independent, private, non-commercial blog begun in 2010 in support of the federal Meaningful Use REC initiative, and Health IT and Healthcare improvement more broadly. Moving now toward important broader STEM and societal/ethics topics. Formerly known as "The REC Blog." Best viewed with Safari, FireFox, or Chrome. NOTE: the Adobe Flash plugin is no longer supported. Comments are moderated, thanks to trolls.
Sunday, May 31, 2015
R.I.P. Beau Biden
My heart is with the Biden family, which has had more than its share of tragedy. As a father who lost a daughter to cancer (it was Sissy's brain met tumor that finally did her in), I have a bit of an idea of what they're enduring right now. This is very, very sad. Beau was only 46.
Tuesday, May 26, 2015
Cumulative Meaningful Use attestations update
These comprise accrued data through March 2015. Note that 2014 attestation incentive payments were only about half those of 2013 (51.4%). $30 billion has been paid out in total.
Pretty much all that remains now is reimbursement penalties.
UPDATE
Attestations to date by EHR vendors, via Healthcare IT News.
No surprises here, eh?
My newest read.
Just as we are getting to grips with the idea of sequencing millions of genomes, evidence is suggesting that even one per person might not be enough. The dogma that each of us has one genome to sequence is crumbling under the weight of evidence. It seems that we might be genomic mosaics and the new paradigm could be ‘one human, multiple genomes’.
The most common source of our multiple human genomes is cancer. Genetic disease is conventionally thought to arise from inherited genetic lesions found in the germ line— the sperm and eggs that combine to form the first human cells from which we all grow. In contrast, cancer is a disease that can arise from genetic mutations occurring within cells in the body— somatic cells (for soma, meaning body). Cancerous cells are aggressive in their attempts to grow and spread to places they are not meant without permission.
We all possess precancerous or slow-growing cancerous cells. In an autopsy study of six individuals, high rates of cellular mosaicism were found across different tissues. Mosaics were classified as having one or more large insertions, deletions, or duplications of DNA compared to the original ‘parent genome’ created at conception.
Mosaicism goes far beyond cancer. An increasing number of somatic mutations are being linked to other genetic diseases. These include neurodevelopmental diseases that can arise in prenatal brain formation and cause recognizable symptoms even when present at low levels. Brain malformations associated with these changes are linked to epilepsy and intellectual disability.
Humans can also be mosaics of ‘foreign’ genomes. Rare cases of confounded identities brought to light the first examples. In one case, a woman needing a kidney transplant did not genetically match her children; her kidney grew from the cells of her lost twin brother. In another case, the identity of a criminal was masked because cells from his bone marrow transplant had migrated into the lining of his cheek. Cheek swabs were taken for his DNA test. Even more remarkable, observations suggest that many women who have been pregnant might be genomic chimeras. In samples from brain autopsies of 59 women, for example, 63 per cent of neurons contained Y chromosomes originating from their male offspring (actually from the fathers).
Doctors and geneticists are just starting to explore what having a multiplicity of genomes means for human health. At this point they are busy mapping the extent of the phenomenon but the message is already loud and clear: genomics continues to astonish us and genomic diversity is appearing everywhere we imagine to look, including inside our own bodies.
Beyond genomics, epigenomics is perhaps an even higher mountain of diversity to scale. Genomes might be relatively static entities at the level of their nucleotides A, C, G, and T, but the double helix can be decorated in numerous ways that change how genes are turned on and off, and in which combinations.
In essence, exactly the same genome sequence can have very different effects depending on its history and context. Gene expression patterns can change frequently, and in some cases the modifications are even passed on to the next generation. It never ends. Human genetic variation continues to blindside us with its enormity and complexity.
Field, Dawn; Davies, Neil (2015-01-31). Biocode: The New Age of Genomics (pp. 33-34). Oxford University Press. Kindle Edition.
I have a couple of concerns. Docs often don’t have enough time TODAY to get through an electronic SOAP note effectively, given workflow constraints. Adding in torrents of “omics” data may be problematic, both in terms of the sheer number of additional potential dx/tx variables to be considered in a short amount of time, and questions of “omics” analytic competency. To that latter point, what will even constitute dx "competency" in the individual patient context, given the relative infancy of the research domain? (Not to mention issues of genomic lab QC/QA -- a particular focus that I will have, in light of my '80s lab QC background.)
President Obama’s current infatuation with “Precision Medicine” notwithstanding, just dumping a bunch of “omics” data into EHRs (insufficiently vetted for accuracy and utility, and inadequately understood by the diagnosing clinician) is likely to set us up for our latest HIT disappointment -- and perhaps injure patients in the process.
___
More to come...
Friday, May 22, 2015
The Robot will see you now -- assuming you can pay
If artificial intelligence and robotics replace most jobs, as looks increasingly likely in light of the evidence proffered below, how will you pay for any of life's necessities (including health care)? Those who crow moralistically about "the right to life" are rather silent when it comes to also asserting an equitable "right to life's requisite resources."
From The New Yorker, TOMORROW’S ADVANCE MAN: Marc Andreessen’s plan to win the future.
In “Why Software Is Eating the World,” a widely invoked 2011 op-ed in the Wall Street Journal, Andreessen put the most optimistic spin on Silicon Valley’s tendencies. The article proclaimed that tech companies are consuming vast swaths of the economy, from books and movies to financial services to agriculture to national defense—which Andreessen saw as the healthful scavenging of a carrion way of life. On Twitter, he pursued the theme: “Posit a world in which all material needs are provided free, by robots and material synthesizers. . . . Imagine six, or 10, billion people doing nothing but arts and sciences, culture and exploring and learning. What a world that would be,” particularly as “technological progress is precisely what makes a strong, rigorous social safety net affordable.”

I, for one, would love to be doing nothing but "arts and sciences, culture and exploring and learning" while living amid the continuing first-world comforts that have thus far characterized my life. The developed world, however, may be in for some very rude awakenings.
I have reported at some length (much of it skeptical) on the gushing enthusiasm over VC funding of the Health IT space. Apropos of the topic, the book I just finished.
Highly recommended. A thoughtful, disturbing read. My Amazon review.
I read a ton of books, and cite/excerpt/link them on my main blog (Blog.KHIT.org). Rarely have I encountered one as oddly enjoyable as this one. Amiably well-written, amply documented, throwing down clear logic and plausible scenario conjectures (based on his -- always guarded -- interpretations of his broad and deep documentation).

I was initially mainly interested in his healthcare chapter, but the entire book is a major wake-up call. I will let the author speak for himself, excerpting his Introduction.
Humans seem to be sleepwalking into a future for which they are largely unprepared. Will we slide into a dystopian techno-neo-feudalism when the jobs largely evaporate across nearly all sectors, including the cognitive "knowledge worker" domains thought to be relatively bulletproof? We are all in this together -- much as our narcissistic one-tenth-of-one-percenters like to deny it. "Wealth" is a relative thing, much as they prefer to deny it.
My initial interest in this book (based on an NPR interview I heard) was his take on health care (my area -- Health IT), which I will be writing up forthwith. But the larger problem is equally if not more compelling. How will we all "earn our keep" if reliable employment diminishes by half or more, regardless of our skills and initiative? Are we headed for "Elysium"?
...[The] “golden age” of the American economy was characterized by a seemingly perfect symbiosis between rapid technological progress and the welfare of the American workforce. As the machines used in production improved, the productivity of the workers operating those machines likewise increased, making them more valuable and allowing them to demand higher wages. Throughout the postwar period, advancing technology deposited money directly into the pockets of average workers as their wages rose in tandem with soaring productivity. Those workers, in turn, went out and spent their ever-increasing incomes, further driving demand for the products and services they were producing.
As that virtuous feedback loop powered the American economy forward, the profession of economics was enjoying its own golden age. It was during the same period that towering figures like Paul Samuelson worked to transform economics into a science with a strong mathematical foundation. Economics gradually came to be almost completely dominated by sophisticated quantitative and statistical techniques, and economists began to build the complex mathematical models that still constitute the field’s intellectual basis. As the postwar economists did their work, it would have been natural for them to look at the thriving economy around them and assume that it was normal: that it was the way an economy was supposed to work— and would always work…
There are good reasons to believe that America’s economic Goldilocks period has likewise come to an end. That symbiotic relationship between increasing productivity and rising wages began to dissolve in the 1970s. As of 2013, a typical production or nonsupervisory worker earned about 13 percent less than in 1973 (after adjusting for inflation), even as productivity rose by 107 percent and the costs of big-ticket items like housing, education, and health care have soared…
On January 2, 2010, the Washington Post reported that the first decade of the twenty-first century resulted in the creation of no new jobs. Zero. This hasn’t been true of any decade since the Great Depression; indeed, there has never been a postwar decade that produced less than a 20 percent increase in the number of available jobs. Even the 1970s, a decade associated with stagflation and an energy crisis, generated a 27 percent increase in jobs. The lost decade of the 2000s is especially astonishing when you consider that the US economy needs to create roughly a million jobs per year just to keep up with growth in the size of the workforce. In other words, during those first ten years there were about 10 million missing jobs that should have been created— but never showed up.
Income inequality has since soared to levels not seen since 1929, and it has become clear that the productivity increases that went into workers’ pockets back in the 1950s are now being retained almost entirely by business owners and investors. The share of overall national income going to labor, as opposed to capital, has fallen precipitously and appears to be in continuing free fall. Our Goldilocks period has reached its end, and the American economy is moving into a new era.
It is an era that will be defined by a fundamental shift in the relationship between workers and machines. That shift will ultimately challenge one of our most basic assumptions about technology: that machines are tools that increase the productivity of workers. Instead, machines themselves are turning into workers, and the line between the capability of labor and capital is blurring as never before.
All this progress is, of course, being driven by the relentless acceleration in computer technology. While most people are by now familiar with Moore’s Law— the well-established rule of thumb that says computing power roughly doubles every eighteen to twenty-four months— not everyone has fully assimilated the implications of this extraordinary exponential progress…
It’s a good bet that nearly all of us will be surprised by the progress that occurs in the coming years and decades. Those surprises won’t be confined to the nature of the technical advances themselves: the impact that accelerating progress has on the job market and the overall economy is poised to defy much of the conventional wisdom about how technology and economics intertwine.
One widely held belief that is certain to be challenged is the assumption that automation is primarily a threat to workers who have little education and lower-skill levels. That assumption emerges from the fact that such jobs tend to be routine and repetitive. Before you get too comfortable with that idea, however, consider just how fast the frontier is moving. At one time, a “routine” occupation would probably have implied standing on an assembly line. The reality today is far different. While lower-skill occupations will no doubt continue to be affected, a great many college-educated, white-collar workers are going to discover that their jobs, too, are squarely in the sights as software automation and predictive algorithms advance rapidly in capability.
The fact is that “routine” may not be the best word to describe the jobs most likely to be threatened by technology. A more accurate term might be “predictable.” Could another person learn to do your job by studying a detailed record of everything you’ve done in the past? Or could someone become proficient by repeating the tasks you’ve already completed, in the way that a student might take practice tests to prepare for an exam? If so, then there’s a good chance that an algorithm may someday be able to learn to do much, or all, of your job. That’s made especially likely as the “big data” phenomenon continues to unfold: organizations are collecting incomprehensible amounts of information about nearly every aspect of their operations, and a great many jobs and tasks are likely to be encapsulated in that data— waiting for the day when a smart machine learning algorithm comes along and begins schooling itself by delving into the record left by its human predecessors.
The upshot of all this is that acquiring more education and skills will not necessarily offer effective protection against job automation in the future. As an example, consider radiologists, medical doctors who specialize in the interpretation of medical images. Radiologists require a tremendous amount of training, typically a minimum of thirteen years beyond high school. Yet, computers are rapidly getting better at analyzing images. It’s quite easy to imagine that someday, in the not too distant future, radiology will be a job performed almost exclusively by machines.
In general, computers are becoming very proficient at acquiring skills, especially when a large amount of training data is available. Entry-level jobs, in particular, are likely to be heavily affected, and there is evidence that this may already be occurring. Wages for new college graduates have actually been declining over the past decade, while up to 50 percent of new graduates are forced to take jobs that do not require a college degree. Indeed, as I’ll demonstrate in this book, employment for many skilled professionals— including lawyers, journalists, scientists, and pharmacists— is already being significantly eroded by advancing information technology. They are not alone: most jobs are, on some level, fundamentally routine and predictable, with relatively few people paid primarily to engage in truly creative work or “blue-sky” thinking.
As machines take on that routine, predictable work, workers will face an unprecedented challenge as they attempt to adapt. In the past, automation technology has tended to be relatively specialized and to disrupt one employment sector at a time, with workers then switching to a new emerging industry. The situation today is quite different. Information technology is a truly general-purpose technology, and its impact will occur across the board. Virtually every industry in existence is likely to become less labor-intensive as new technology is assimilated into business models— and that transition could happen quite rapidly…
All of this suggests that we are headed toward a transition that will put enormous stress on both the economy and society. Much of the conventional advice offered to workers and to students who are preparing to enter the workforce is likely to be ineffective. The unfortunate reality is that a great many people will do everything right— at least in terms of pursuing higher education and acquiring skills— and yet will still fail to find a solid foothold in the new economy.
Beyond the potentially devastating impact of long-term unemployment and underemployment on individual lives and on the fabric of society, there will also be a significant economic price. The virtuous feedback loop between productivity, rising wages, and increasing consumer spending will collapse. That positive feedback effect is already seriously diminished: we face soaring inequality not just in income but also in consumption. The top 5 percent of households are currently responsible for nearly 40 percent of spending, and that trend toward increased concentration at the top seems almost certain to continue. Jobs remain the primary mechanism by which purchasing power gets into the hands of consumers. If that mechanism continues to erode, we will face the prospect of having too few viable consumers to continue driving economic growth in our mass-market economy.
As this book will make clear, advancing information technology is pushing us toward a tipping point that is poised to ultimately make the entire economy less labor-intensive. However, that transition won’t necessarily unfold in a uniform or predictable way. Two sectors in particular— higher education and health care— have, so far, been highly resistant to the kind of disruption that is already becoming evident in the broader economy. The irony is that the failure of technology to transform these sectors could amplify its negative consequences elsewhere, as the costs of health care and education become ever more burdensome…
Ford, Martin (2015-05-05). Rise of the Robots: Technology and the Threat of a Jobless Future. Basic Books. Kindle Edition. [Locations 55 - 152]
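An aside on the Moore's Law passage above: the arithmetic of repeated doubling is worth making concrete. A minimal back-of-the-envelope sketch, noting that the 30-year horizon is my own arbitrary choice for illustration, not a figure from Ford's book:

```python
# Back-of-the-envelope only: the cumulative effect of the 18-24 month
# doubling period cited above. The 30-year horizon is an arbitrary
# illustration, not a figure from Ford's book.

def doublings(years, months_per_doubling):
    """Return (number of doublings, resulting multiple) over the span."""
    n = years * 12 / months_per_doubling
    return n, 2 ** n

for months in (18, 24):
    n, multiple = doublings(30, months)
    print(f"Doubling every {months} months for 30 years: "
          f"~{n:.0f} doublings, roughly {multiple:,.0f}-fold")
```

Twenty doublings is roughly a millionfold increase. That is the sense in which, as Ford puts it, not everyone has fully assimilated the implications.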
A great review of Martin Ford's book at Fast Company:

Yes, Robots Really Are Going To Take Your Job And End The American Dream
Now that machines can diagnose cancer, trade stocks, and write symphonies, they're not just going to make humans more efficient as they have in the past—they are replacing them entirely and wrecking the economy along the way.
There's nothing new about fears of technological unemployment. The idea goes back to the Luddites in 19th-century England and John Maynard Keynes in the 1930s. Union bosses have long railed against factory automation, and governments have even resisted technology to maintain higher job levels. Yet predictions that machines would put humans out of work on a significant societal scale have never quite materialized.
However, there's reason to believe that, unlike those previous times, we really are entering an age when people will work less. As author Martin Ford puts it in his recent book Rise of the Robots, "this time is different." New artificially intelligent machines, he says, are not so much tools to improve the efficiency of workers but really are tools to replace workers themselves...
Surveying all the fields now being affected by automation, Ford makes a compelling case that this is an historic disruption—a fundamental shift from most tasks being performed by humans to one where most tasks are done by machines. That includes obvious things like moving boxes around a warehouse, but also many "higher skill" jobs as well, such as radiology and stock trading. And don't kid yourself about your own importance: that list almost certainly includes your job.
We really could be headed for an economy with many fewer jobs in it and a severely eroded middle class, he argues. Together with other important trends like wealth inequality and globalization, new technology threatens to produce more unemployment and slow the main motor of the U.S. economy—consumer demand...
Of course, those people doing the automating may stand to do well financially—but perhaps not for long. Ultimately, Ford argues, complete automation will be bad for the economy because machines don't consume goods and services the way human beings do. The "powerful symbiosis between rising incomes and robust, broad-based consumer demand is now in the process of unwinding," he says...
The standard response to automation among economists has been to call for more education, so low-paid workers can move up the food chain. But Ford doesn't think that will help ultimately. Many people are already over-educated for what they do—just look at all the college graduates serving coffee in Starbucks.
Ford says cramming everyone into jobs requiring more skills is "analogous to believing that, in the wake of the mechanization of agriculture, the majority of displaced farm workers would be able to find jobs driving tractors." Nor can we hope to stop the automation wave, he says. There's an inevitability to these technologies, and it's inevitable that businesses will take advantage. Whatever employers might say publicly, they don't really want to hire more people than they need...
See my post of last year, "Aye, Robot."
Will doctors and nurses (along with the broader allied health workforce) become casualties of increasingly widespread deployment of robotization and artificial intelligence? Read again my April 22nd post "Nurses and doctors in the trenches." Scroll down and re-read in particular the section on the Weeds' "Medicine in Denial."
Back to the New Yorker Andreessen article:
Last year, a programmer named Alex Payne wrote an open letter to Andreessen in which he observed, “People are scared of so much wealth and control being in so few hands. Consequently, wherever you and other gatekeepers of capital direct your attention—towards robots, 3D printers, biotech, whatever—you’re going to detect a fearful response as people scramble to determine the impact of your decisions and whims,” which only compound “lingering structural unemployment and an accumulation of capital at the top of the economic pyramid.”

Easy for him to say. He won't be missing any meals.
Payne addressed his thoughts to Andreessen because Andreessen represents the Valley—both in its soaring vision and in its tendency to treat people as a fungible mass. But Andreessen waved away the criticisms as the ravings of “a self-hating software engineer.”...
Global unemployment is rising, too—this seems to be the first industrial revolution that wipes out more jobs than it creates. One 2013 paper argues that forty-seven per cent of all American jobs are destined to be automated. Andreessen argues that his firm’s entire portfolio is creating jobs, and that such companies as Udacity (which offers low-cost, online “nanodegrees” in programming) and Honor (which aims to provide better and better-paid in-home care for the elderly) bring us closer to a future in which everyone will either be doing more interesting work or be kicking back and painting sunsets. But when I brought up the raft of data suggesting that intra-country inequality is in fact increasing, even as it decreases when averaged across the globe—America’s wealth gap is the widest it’s been since the government began measuring it—Andreessen rerouted the conversation, saying that such gaps were “a skills problem,” and that as robots ate the old, boring jobs humanity should simply retool...
Chapter 6 of The Rise of the Robots ("The Health Care Challenge") is so good, I could easily end up citing all of it, which would put me at risk of Fair Use excess. Buy it and read it. You won't be disappointed.
I'll just skip over to the next chapter.
Chapter 7
TECHNOLOGIES AND INDUSTRIES OF THE FUTURE
YouTube was founded in 2005 by three people. Less than two years later, the company was purchased by Google for about $1.65 billion. At the time of its acquisition, YouTube employed a mere sixty-five people, the majority of them highly skilled engineers. That works out to a valuation of over $25 million per employee. In April 2012, Facebook acquired photo-sharing start-up Instagram for $1 billion. The company employed thirteen people. That’s roughly $77 million per worker. Fast-forward another two years to February 2014 and Facebook once again stepped up to the plate, this time purchasing mobile messaging company WhatsApp for $19 billion. WhatsApp had a workforce of fifty-five— giving it a valuation of a staggering $345 million per employee.
Soaring per-employee valuations are a vivid demonstration of the way accelerating information and communications technology can leverage the efforts of a tiny workforce into enormous investment value and revenue. What’s more, they offer compelling evidence for how the relationship between technology and employment has changed. There is a widely held belief— based on historical evidence stretching back at least as far as the industrial revolution— that while technology may certainly destroy jobs, businesses, and even entire industries, it will also create entirely new occupations, and the ongoing process of “creative destruction” will result in the emergence of new industries and employment sectors— often in areas that we can’t yet imagine. A classic example is the rise of the automotive industry in the early twentieth century, and the corresponding demise of businesses engaged in manufacturing horse-drawn carriages.
As we saw in Chapter 3, however, information technology has now reached the point where it can be considered a true utility, much like electricity. It seems nearly inconceivable that successful new industries will emerge that do not take full advantage of that powerful new utility, as well as the distributed machine intelligence that accompanies it. As a result, emerging industries will rarely, if ever, be highly labor-intensive. The threat to overall employment is that as creative destruction unfolds, the “destruction” will fall primarily on labor-intensive businesses in traditional areas like retail and food preparation, while the “creation” will generate new businesses and industries that simply don’t hire many people. In other words, the economy is likely on a path toward a tipping point where job creation will begin to fall consistently short of what is required to fully employ the workforce. [op cit pp. 175-176].

Consider General Motors by way of comparison: current market cap ~$57.18 billion, 202,000 employees -- a valuation of roughly $283,000 per employee.
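Ford's per-employee comparisons are easy to reproduce. A quick sketch, using only the market caps and headcounts quoted above (Ford's excerpt plus my GM aside); nothing else is assumed:

```python
# Acquisition price or market cap per employee, using only the figures
# quoted above (Ford's excerpt plus my General Motors comparison).

companies = {
    "General Motors (market cap)":  (57.18e9, 202_000),
    "YouTube (2006 acquisition)":   (1.65e9, 65),
    "Instagram (2012 acquisition)": (1.0e9, 13),
    "WhatsApp (2014 acquisition)":  (19.0e9, 55),
}

for name, (valuation, headcount) in companies.items():
    print(f"{name}: ~${valuation / headcount:,.0f} per employee")
```

Roughly $283,000 per employee for GM, versus $25 million, $77 million, and $345 million for the three acquisitions. The difference in labor-intensiveness could hardly be starker.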
The Robot will see you now -- assuming you can pay.
More Martin Ford:
Is Economic Growth Sustainable as Inequality Soars?

The author concludes:
As we’ve seen, overall consumer spending in the United States has so far continued to grow even as it has become ever more concentrated, with the top 5 percent of households now responsible for nearly 40 percent of total consumption. The real question is whether that trend is likely to be sustainable in the coming years and decades, as information technology continues its relentless acceleration.
While the top 5 percent have relatively high incomes, the vast majority of these people are heavily dependent on jobs. Even within these top-tier households, income is concentrated to a staggering degree; the number of genuinely wealthy households— those that can survive and continue spending entirely on the basis of their accumulated wealth— is far smaller. During the first year of recovery from the Great Recession, 95 percent of income growth went to just the top 1 percent.
The top 5 percent is largely made up of professionals and knowledge workers with at least a college degree. As we saw in Chapter 4, however, many of these skilled occupations are squarely in the crosshairs as technology advances. Software automation may eliminate some jobs entirely. In other cases, the jobs may end up being deskilled, so that wages are driven down. Offshoring and the transition to big data– driven management approaches that often require fewer analysts and middle managers loom as other potential threats for many of these workers. In addition to directly impacting households that are already in the top tier, these same trends will also make it harder for younger workers to eventually move up into positions with comparable income and spending levels.
The bottom line is that the top 5 percent is poised to increasingly look like a microcosm of the entire job market: it is at risk of itself being hollowed out. As technology progresses, the number of American households with sufficient discretionary income and confidence in the future to engage in robust spending could well continue to contract. The risk is further increased by the fact that many of these top-tier households are probably more financially fragile than their incomes might suggest. [ibid, pp. 212-213].
In the worst-case scenario, a combination of widespread economic insecurity, drought, and rising food prices could eventually lead to social and political instability.

Yeah, "drought." I've been all over that as well lately.
The greatest risk is that we could face a “perfect storm”— a situation where technological unemployment and environmental impact unfold roughly in parallel, reinforcing and perhaps even amplifying each other. If, however, we can fully leverage advancing technology as a solution— while recognizing and adapting to its implications for employment and the distribution of income— then the outcome is likely to be far more optimistic. Negotiating a path through these entangled forces and crafting a future that offers broad-based security and prosperity may prove to be the greatest challenge for our time. [ibid, pp. 283-284].
See what you think about Rise of the Robots. Listen to the NPR Fresh Air interview with the author that I put in my prior post if you've not already done so.
__
AFTERTHOUGHTS
Among the thinkers/writers that Martin Ford has cited in Rise of the Robots that I have cited and excerpted on this blog are Peter Thiel (Zero to One) and Nicholas Carr (The Glass Cage) (scroll down). Some useful triangulations therein, as you might also tangentially find in my March 9th, 2015 post.
In his Fast Company article cited above, Ben Schiller notes that
[Martin Ford closes by making] the case for a basic income guarantee—a government payment to all citizens so they can live to a reasonable level. His version would be tied to educational accomplishment. People who get at least a high school diploma would get slightly more money, on the thinking that not having at least a diploma in the future economy will make people even less employable than they are today. He suggests $10,000 per person (which is lower than many other proposals), which would cost about $1 trillion overall, provided the payment was means-tested at the top-end.

Yeah. Try selling that idea to the GOP currently in control of Congress. And, should the GOP take the White House in 2016 while keeping control of The Hill, we'll be headed oligarchically (suicidally?) in the other direction. We seem firmly in the grip of the "Persecutor" phase of the sociopolitical Rescuer-Victim-Persecutor Triangle.
This might become an economic necessity, he says, if work is no longer an option for large numbers of people. "If we look into the future and assume that machines will eventually replace human labor to a substantial degree, then I think some form of direct redistribution of purchasing power becomes essential if economic growth is to continue."
Ford used the word "plutonomy" in his book. It fits.
ouroboros
The words "conundrum," "dilemma," "quandary," and "paradox" come to mind. The short-term "First Mover Advantage" accruing to those implementing full-bore tech automation, so nicely set forth in Martin Ford's book, can only prove unsustainable. An unrestrained "free market uber alles" ethos inescapably bears the DNA of its own demise. Our Calvinist heritage may now be past its sell-by date.
UPDATE
Pretty much sums up Martin Ford's book.
Another take, from the PBS Newshour:
Good topical resource on the Wiki, btw: "Technological Unemployment"
Technological unemployment is unemployment primarily caused by technological change. Early concern about technological unemployment was exemplified by the Luddites, textile workers who feared that automated looms would allow more productivity with fewer workers, leading to mass unemployment. But while automation did lead to textile workers being laid off, new jobs in other industries developed. Due to this shift of labor from automated industries to non-automated industries, technological unemployment has been called the Luddite fallacy.
Modern proponents of the technological unemployment concept argue that productivity has been decoupling from employment throughout the 21st century, as increasing numbers of industries are automating simultaneously. They point to studies showing that the job losses are most concentrated in occupations involving routine physical and mental labor, the jobs that are easiest to automate. The chief insight is that even though it is true that new types of jobs can always develop, the skills of most people may not be adequate to fill many of them. Various ideas on how to bypass this problem exist...

Speaking of "work," see "WHAT MOTIVATES US AT WORK? MORE THAN MONEY."
__
From an L.A. Times review article:
[W]hat's most interesting about "Rise of the Robots" is that it isn't actually a narrative about the imminent triumph of soulless automatons. Robots aren't the enemy. The real villain here is capitalism: a stupid form of capitalism that seems dead set on destroying itself. It would be ironic if it weren't our own future that was in peril. The relentless drive by capital to cut costs and boost profits is threatening to destroy the wellspring of economic growth that capitalism requires. Because when there are no jobs for humans, there will be no consumers with the disposable income to buy the products being produced so efficiently by robots. Henry Ford understood this when he paid his workers high enough wages to buy his cars. Today's titans of the economy appear to have forgotten the lesson...

"Winner-take-all" means winner loses all in the end, fevered, sophomoric, narcissistic Libertarian fantasies notwithstanding.
To that point (and our broader theme here), from "Four Futures":
...Much of the literature on post-capitalist economies is preoccupied with the problem of managing labor in the absence of capitalist bosses. However, I will begin by assuming that problem away, in order to better illuminate other aspects of the issue. This can be done simply by extrapolating capitalism’s tendency toward ever-increasing automation, which makes production ever-more efficient while simultaneously challenging the system’s ability to create jobs, and therefore to sustain demand for what is produced. This theme has been resurgent of late in bourgeois thought: in September 2011, Slate’s Farhad Manjoo wrote a long series on “The Robot Invasion,” and shortly thereafter two MIT economists published Race Against the Machine, an e-book in which they argued that automation was rapidly overtaking many of the areas that until recently served as the capitalist economy’s biggest motors of job creation. From fully automatic car factories to computers that can diagnose medical conditions, robotization is overtaking not only manufacturing, but much of the service sector as well.
Taken to its logical extreme, this dynamic brings us to the point where the economy does not require human labor at all. This does not automatically bring about the end of work or of wage labor, as has been falsely predicted over and over in response to new technological developments. But it does mean that human societies will increasingly face the possibility of freeing people from involuntary labor. Whether we take that opportunity, and how we do so, will depend on two major factors, one material and one social. The first question is resource scarcity: the ability to find cheap sources of energy, to extract or recycle raw materials, and generally to depend on the Earth’s capacity to provide a high material standard of living to all. A society that has both labor-replacing technology and abundant resources can overcome scarcity in a thoroughgoing way that a society with only the first element cannot. The second question is political: what kind of society will we be? One in which all people are treated as free and equal beings, with an equal right to share in society’s wealth? Or a hierarchical order in which an elite dominates and controls the masses and their access to social resources?...
Interesting heads-up, thanks to Margalit Gur-Arie
Turns out that Martin Ford's concerns were aired 63 years ago by Kurt Vonnegut.
Player Piano (1952), Vonnegut’s first novel, embeds and foreshadows themes which are to be parsed and dramatized by academians for centuries to come. His future society--a marginal extrapolation, Vonnegut wrote, of the situation he observed as an employee of General Electric in which machines were replacing people increasingly and without any regard for their fate--is mechanistic and cruel, indifferent to human consequence, almost in a state of merriment as human wreckage accumulates. Paul Proteus, the novel’s protagonist, is an engineer at Ilium Works and first observes with horror and then struggles to reverse the displacement of human labor by machines.

ON "TECH" - I LOVE MY NEW YORKER
___
More to come...
Wednesday, May 20, 2015
Belated Happy Birthday to Me: The KHIT blog turns 5
Forgot to mention this. I started this blog on May 10, 2010, shortly after coming back to work at HealthInsight after they'd won the Meaningful Use REC contract for Nevada and Utah.
Notwithstanding that I "retired" two years ago, sold the house in Vegas, and moved back to the Bay Area where my wife works, I keep plugging away. No sponsors, no monetization of any sort. I continue to find the aggregate subject matter important, and, as the blog approaches 400,000 hits, I guess others do as well.
Thanks for reading.
I just got back from a quick vacation back to Las Vegas, and will soon have another book to report on.
The Health IT implications are obvious.
The total amount of information that could potentially be useful to a physician attempting to diagnose a particular patient’s condition or design an optimal treatment strategy is staggering. Physicians are faced with a continuous torrent of new discoveries, innovative treatments, and clinical study evaluations published in medical and scientific journals throughout the world.
Ford, Martin (2015-05-05). Rise of the Robots: Technology and the Threat of a Jobless Future (p. 147). Basic Books. Kindle Edition.

A bracing read. Stay tuned.
The author interviewed on NPR's "Fresh Air"
TERRY GROSS, HOST:
This is FRESH AIR. I'm Terry Gross.
Remember when you first saw a self-checkout aisle at a grocery store? We use them all the time now without giving much thought to the fact that they're doing work real people used to do. Our guest, Martin Ford, says that's something we'll have to think about more as advances in computer software dramatically expand the roles robots may soon play in our lives and the economy. In his new book, Ford says robots are beginning to perform white-collar jobs, like writing business stories and doing legal analysis. And, he says, robots are also being developed to take the place of the low-wage, low-skill service occupations that workers who've lost manufacturing jobs have been forced to take, like preparing fast food. Ford says accelerating robot technology could have far-reaching social and economic impacts, as even educated workers can't find employment, and their declining incomes lead to a collapse of consumer demand. Martin Ford is the founder of a Silicon Valley-based software development firm and the author of an earlier book about automation and the economy. FRESH AIR contributor Dave Davies spoke to Ford about his new book, "Rise Of The Robots: Technology And The Threat Of A Jobless Future."

Great stuff. Full transcript here.
And, oh, yeah, check out my new drought page.
___
More to come...
Thursday, May 14, 2015
"Clinical software requires models of work, and not simply models of data"
"Electronic records, in concept and rendition, address most directly those aspects of clinical care that are addressed by paper charts. Attempting to force EHR systems to exceed their original design conceptualizations will only continue to disrupt the work of busy clinicians while delaying serious investigation into better metaphors for clinical software designs."I closely follow the posts of Jerome Carter, MD over at "EHRscience.com." It is an excellent resource. The foregoing comes from his May 11th, 2015 post. While the ARRA/HITECH Meaningful Use is not getting any love these days (particularly in Congress), Dr. Carter has a differing, more nuanced view.
"By encouraging adoption of HIT, the EHR incentive programs have forced everyone to deal with the realities of current EHR system designs. In fact, this might be one of the most significant outcomes of the entire program. Now that there is a general acceptance that HIT needs to evolve in order to better support clinical work, more research is becoming available that details how clinical professionals work and the nature of their information management needs.One of my principal jobs while with my REC was that of assisting my Meaningful Use participant clients with workflow adjustments sufficient to assimilate the criteria into their normal workflows without costing them additional labor expense just to chase the incentive money (along with trying to help them improve the efficiency and effectiveness of their clinical processes more broadly in pursuit of the longer-term/bigger picture).
Prior to the incentive programs, there was a mostly unchallenged assumption that EHR systems were capable of supporting clinical work needs. The national experiment in EHR adoption has proven this assumption to be false — but not in a bad way (I have no interest in bashing current products). Rather, in disproving the initial assumption, the incentive programs have engendered a level of research into clinical work, interoperability, safety, and clinical software design that I doubt would have ever happened absent the ONC’s efforts..."

One of my principal jobs while with my REC was assisting my Meaningful Use participant clients with workflow adjustments sufficient to assimilate the criteria into their normal workflows without adding labor expense just to chase the incentive money (along with trying to help them improve the efficiency and effectiveness of their clinical processes more broadly, in pursuit of the longer-term, bigger picture).
I wish I could say I was routinely effective in this regard, but, candidly, my wins were relatively few and far between. Most of my MU caseload comprised economically-stressed small-shop (1-3 docs) outpatient primary care (including a bunch of Medicaid). They were all mostly content to just get through the day, and had little time or inclination to devote to QI initiatives.
I suspect that's still largely the case.
Click-through path-shortening to get at the MU criteria more quickly was about the only thing staff and docs were interested in (and I used to gripe to ONC and vendors that all MU numerator criteria data fields could and should be one-click macro-accessible), and, given that our REC "milestone" funding was tied to getting clients to Attestation, our short-term incentive was to play ball. Lofty talk about significant and sustainable clinical workflow and outcomes improvements remained just that -- high-minded QIO and REC talk.
The rigidities of the "certified" EHRs didn't help matters, nor did the fact that, as a "vendor-neutral" REC, we had to support about 40 platforms -- all replete with vendors' non-disclosure agreements (I had about 8 of them in my book, mostly eCW and e-MDs, along with a smattering of onesies and twosies).
More Jerome Carter:
HIT vendors could roll their own workflow tech or build static workflow capability into systems; however, the day when either made sense has surely passed. EHR systems have static workflows and adding new code to current EHR systems every time a new work support need arises makes no sense. Workflow technology is the only currently available approach that makes the problem tractable. The downside of this realization is that current systems would need substantial rewrites to make use of workflow tech; the upside is that the future is coming — no matter what.

I hope Dr. Carter gets some national traction with his work. I am a bit dismayed that the research into "clinical work support" has yet to become the priority it should be. And, I have to take seriously Margalit Gur-Arie's recent assertion that our out-of-focus "interoperability" obsession may be The Tail Wagging The Dog.
On the interop topic,
Stiff interoperability penalties in new 21st Century Cures Act
By Darius Tahir | May 13, 2015
WASHINGTON—Electronic health-record vendors would have until Jan. 1, 2018, before facing stiff penalties for a lack of data-sharing in the newest version of the 21st Century Cures Act.
The addition is the latest to the smorgasbord of incentives, appropriations, deregulations and regulatory streamlining that make up the 21st Century Cures Act, a legislative package developed by the House Energy & Commerce Committee and intended to kick-start biomedical innovation.
The revised bill—released Wednesday morning—is difficult to digest in one sitting, and touches on many areas of controversy. For example, consumer advocates may be concerned by the restoration of exclusivity periods for drugs treating rare or pediatric diseases, which had drawn criticism in previous iterations of the bill.
But the most significant addition may be the tough interoperability provisions for EHR vendors, intended to ensure that there are no technical or economic barriers obstructing providers from sharing clinical data...

Interesting that the JAMIA research paper Dr. Carter cites in his post also mentions "interoperability."
Interoperability
Participants emphasized the need to exchange clinical data with providers within and outside of their own PCMH networks. All PCMH participants expressed frustration with and concern about their ability to appropriately coordinate care and meet PCMH reporting requirements using current HIE. This was particularly challenging for each of the two PCMH organizations (A and B) in which practice sites could use different EHRs. Although both PCMHs were sending clinical data to partnering RHIOs, participants from PCMHs and RHIOs each discussed having to dedicate resources to clean and harmonize CCD-based data. The challenges at these two organizations led the PCMH participants and non-vendor stakeholders to conclude that their quality and cost-containment efforts would have been more effective had each PCMH mandated use of the same EHRs. PCMH C noted that its health system’s providers had difficulty exchanging data with specialists outside the system. EHR vendor participants also recognized there were barriers to interoperability using today’s CCD standard, which could limit care coordination among PCMHs.
PCMH participants expressed a need for low-cost solutions for HIE interfaces to achieve effective interoperability. Given that PCMHs A and B maintained multiple EHRs, they noted that the costs were too high for them to pay vendors to develop and maintain HIE interfaces between those EHRs. Still, the reported need for lower-cost HIE interfaces extended to PCMH C, which used a single EHR system: “The maintenance of all those interfaces – the costs are ridiculous …”
...As discussed previously, participants described a growing need to support data sharing efforts among PCMHs both within and outside of their health systems, for collaborative efforts. However, challenges with interoperability – including uniform agreement on the types of data to collect, how the data would be collected, and how the data would be stored for eventual reporting requirements – impede such data sharing. This issue speaks to acknowledged needs for increased standardization in data representations...
CONCLUSIONS
We conducted interviews with PCMHs, health IT vendors, and associated stakeholders to assess, based on their experiences, how health IT could better support future efforts towards care coordination. We discovered important needs for monitoring tools that enable PCMHs to analyze patient populations; notification tools that enable PCMHs and care managers to detect or be alerted to patterns in their patient populations; collaboration tools that bring together providers and non-providers within as well as across care settings; data extraction tools that enable more efficient reporting; and robust HIE that enables these activities to be achieved efficiently and economically. We encourage further efforts between PCMHs and health IT vendors to focus on these areas of needs in order to improve ongoing and future care coordination efforts.

A good paper, but I found myself chafing at the lack of any detailed discussion as to what precisely constitutes a "clinical support model," one, for purposes here, going to the increasing imperative of "care coordination." Many industrial/manufacturing workflows, and even many service-sector workflows, are relatively easy to model and map: task variation requiring extensive logic "conditionals" -- branching and looping amid sequential and parallel tasks -- comprises a relatively small exception to the aggregate start-to-finish process flow rather than the norm, whereas in health care, some would argue, such variation is the prevalent, irreducible circumstance. It is that apparent latter reality that leads, IMO, to frustration with the "hard-coding rigidity" of the current generation of EHRs, which are neither modular at the task level nor endlessly and effortlessly "Lego block" re-configurable.
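To make my complaint a bit more concrete: what I mean by "modular at the task level" is workflow expressed as declarative, editable definitions that a small engine interprets, rather than logic hard-coded into the application. A toy sketch of that idea follows -- entirely my own illustration, not Dr. Carter's design nor any vendor's product, and the intake/screening steps are hypothetical:

```python
# Toy sketch only -- my illustration, not Dr. Carter's design or any
# vendor's product. The point: workflow expressed as *data* (steps plus
# branching rules) that a small engine interprets, rather than screens
# and logic hard-coded into the EHR application.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Step:
    do: Callable[[dict], None]                  # the work-support task itself
    next_step: Callable[[dict], Optional[str]]  # the branching/looping logic

def run(workflow, start, ctx):
    """Walk the declarative workflow definition from the start step."""
    name = start
    while name is not None:
        step = workflow[name]
        step.do(ctx)
        name = step.next_step(ctx)

# A hypothetical intake-and-screening flow, purely illustrative:
workflow = {
    "intake": Step(
        do=lambda ctx: print("Collect vitals and history"),
        next_step=lambda ctx: "screen" if ctx.get("new_patient") else "visit",
    ),
    "screen": Step(
        do=lambda ctx: print("Administer standard screening questionnaires"),
        next_step=lambda ctx: "visit",
    ),
    "visit": Step(
        do=lambda ctx: print("Clinician encounter / SOAP note"),
        next_step=lambda ctx: None,
    ),
}

run(workflow, "intake", {"new_patient": True})
```

The point of the toy: adding, removing, or re-ordering the "Lego blocks" means editing the workflow definition, not rewriting (and re-certifying) the EHR's code.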
Another concern is an old one: the "productivity treadmill" reimbursement system that continues to stymie the best efforts at rational and effective "clinical support model" health IT.
Finally, if you're going to derive progressive "models" leading to revised, putatively more efficient and effective "clinical care" policies and procedures devolving down to staff in the trenches, you'd better be able to sell them to staff in a language they can grasp and willingly, productively implement. And, ideally, under the "Lean" philosophy to which I subscribe, in-the-trenches staff should be the ones identifying the care support models in the first place.
__
ERRATA
My side project blog concerning the California drought continues.
ICD-10 WATCH
Lordy. Seriously, dude?
___
More to come...
Monday, May 11, 2015
So-called CQMs: They're "Process Indicators," not "Clinical Quality Measures"
Last week I attended an online webinar conducted by Dr. John Toussaint. The content went to material in his forthcoming new book, "Management on the Mend."
Five years after his debut book, On the Mend, showed how a large, cradle-to-grave health system revolutionized the way care is delivered, Dr. John Toussaint returns with news for healthcare leaders. There is a right way to go about such a transformation. And senior leaders need to be far more intimately involved.
While studying and assisting hundreds of organizations transitioning to lean healthcare, Dr. Toussaint witnessed many flaws and triumphs. Those organizations that win – creating better value for patients while removing waste and cost in the system – have senior managers that lead by example at the frontline of care. The best health systems have also discovered ways to engage everyone in solving problems and embracing change.
Management on the Mend is the result of years of investigations by Dr. Toussaint and dozens of healthcare organizations around the world. Using their collective experiences, he has built a model for lean transformations that work. This book describes the model, step by step, through people in 11 organizations who are doing the work. It is the story of many journeys and one conclusion: lean healthcare is not only possible, it is necessary.
As senior leaders look ahead to a future that includes radical changes in patient populations, the economics of healthcare, and patient expectations, everyone knows that health systems must be agile to survive. In order to thrive, they must be able to continuously improve. Here is the roadmap for that future.

During the post-presentation Q and A, someone asked about the much-unloved CQM compliance requirements. Dr. Toussaint responded by calling them what they really are -- "process indicators," e.g., what percentage of the time did a provider or clinical organization do x, y, or z. (A multitude of x's, y's, and z's, actually.)
Whether doing so contributed to improved individual patient outcomes at the point of care or not.
The CMS web page spiel on CQMs:
Clinical Quality Measures Basics
Clinical quality measures, or CQMs, are tools that help measure and track the quality of health care services provided by eligible professionals, eligible hospitals and critical access hospitals (CAHs) within our health care system. These measures use data associated with providers’ ability to deliver high-quality care or relate to long term goals for quality health care. CQMs measure many aspects of patient care including:
- health outcomes
- clinical processes
- patient safety
- efficient use of health care resources
- care coordination
- patient engagements
- population and public health
- adherence to clinical guidelines

Measuring and reporting CQMs helps to ensure that our health care system is delivering effective, safe, efficient, patient-centered, equitable, and timely care.
To participate in the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs and receive an incentive payment, providers are required to submit CQM data from certified EHR technology.

Gotta love how they position "health outcomes" at the top of their bullet list. In the actual current CQM tables documents for both EPs (Eligible Professionals - pdf) and EHs (Eligible Hospitals - pdf), however, you only see these:
- Patient and Family Engagement
- Patient Safety
- Care Coordination
- Population/Public Health
- Efficient Use of Healthcare Resources
- Clinical/Process Effectiveness
Like Dr. Toussaint said, they're "process indicators," and the "effectiveness" part goes mostly to "what % of the time did you do these?" "How 'effective' is your compliance?" Calling them "quality measures" is a stretch (as they pertain to individual health outcomes).
Just like "interoperability" is a misnomer (my "interoperababble" rant). Just like "Meaningful Use" is a misnomer -- what we really need is "Effective Use," from the POV of the patient.
The obvious intent and big-picture expectation is that by doing x, y, and z (and the voluminous additional "a" through "v" of CQMs), aggregate/statistical patient outcomes will eventually improve.
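For the record, the arithmetic of a "process indicator" measure is just numerator-over-denominator compliance math. A minimal sketch of the shape of it -- the patient records and the measure logic here are entirely hypothetical, not any actual CMS eCQM specification:

```python
# Hypothetical illustration of a "process indicator" style measure:
# what percent of the eligible population got intervention x?
# Records and logic are made up; no actual CMS eCQM spec is implied.

patients = [
    {"id": 1, "diabetic": True,  "a1c_test_ordered": True},
    {"id": 2, "diabetic": True,  "a1c_test_ordered": False},
    {"id": 3, "diabetic": False, "a1c_test_ordered": False},
    {"id": 4, "diabetic": True,  "a1c_test_ordered": True},
]

denominator = [p for p in patients if p["diabetic"]]           # eligible population
numerator = [p for p in denominator if p["a1c_test_ordered"]]  # "did you do x?"

rate = 100 * len(numerator) / len(denominator)
print(f"Process compliance: {len(numerator)}/{len(denominator)} = {rate:.0f}%")

# Note what is NOT measured: whether any individual patient's
# glycemic control -- the outcome -- actually improved.
```

That is the sense in which the "effectiveness" being measured is effectiveness of compliance, not effectiveness of care.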
Critics of CQMs (including critics of those process indicator cousins in the equally unloved PQRS initiative) complain that these compliance measures are actually detrimental to individual patient care, given the additional workflow burdens they impose -- that the paltry "incentive" funds are a net loser. Time and money and staff are finite; resources spent obsessively fiddling with process indicators compliance are resources not accorded the patient on the exam table or in the acute care bed.
I remain conflicted about these things. A lot of smart, experienced people had their learned and lengthy say in the promulgation of such initiatives. And, while I sometimes irascibly call things like CQMs and other compliance measures "Quadrant Three" activities ("urgent, but not important"), even that is simplistic. ("All models are wrong; some models are useful.") Urgency and importance are continuous-scale, dynamic, contextual phenomena (that pesky "variation" that dogs all QI efforts). Nonetheless, we must always ask "urgent for whom?" "important to whom?" "meaningful to whom?"
Moreover, inexorably, "he who pays the piper calls the tune." You don't want to do this stuff? Go Direct-Pay / Concierge (not that doing so will totally absolve you of all regulatory compliance, all your griping about "autonomy" notwithstanding).
Dr. Toussaint was unfazed by the webinar participant's question. Implicit in his response was more or less "All the more reason to go Lean. You'll free up time for process compliance measures that aren't going away anytime soon."
I'm looking forward to getting and reading (and reporting on) the new book. I was all over his first book "On the Mend" five years ago on this blog in my second post.
I've closely studied everything he's put out ever since. Enter "Toussaint" in the upper left search cell atop this blog. You'll be scrolling for quite some time.
I'll be covering the 2015 Lean Healthcare Summit in Dallas in early June, where Dr. Toussaint will be keynoting.
From last week's webinar deck.
"Courage to speak the truth." "Lead with humility." Music to these ears. "Talking Stick," anyone? "Just Culture," anyone? See also, specifically "Physician, Health Thy System."
And, "Doctors and nurses in the trenches." to wit:
Their hard-won, sophisticated, indispensable clinical skills aside, nurses and physicians are just people like the rest of us, people more or less beset by all of the frailties, foibles, insecurities, and neuroses that typically dog us all across the breadth of our lives. The fractious, high-stakes, irreducibly high cognitive burden organizational environments within which they must function are neither of their design nor under their control, and can (and unhappily do) exacerbate interpersonal difficulties that are counterproductive to optimal patient care. I call the syndrome "psychosocial toxicity," and have blogged about it at some length in prior posts.

It's hardly confined to healthcare, to be sure, but organizational cultural dysfunction in healthcare is ultimately a patient safety issue. To the extent that we continue to view clinical co-workers through "transactional/instrumental," "superior/subordinate" lenses, our improvement efforts will be significantly hampered...

__
ERRATUM
Started a new blog, wherein I've collated for convenience a bunch of disparate posts on a different topic. See "Western U.S. Drought." I'll be adding resource links as I have time.
___
More to come...