Latest HIT news, on the fly...


Friday, May 22, 2015

The Robot will see you now -- assuming you can pay

From The New Yorker, TOMORROW’S ADVANCE MAN: Marc Andreessen’s plan to win the future.
In “Why Software Is Eating the World,” a widely invoked 2011 op-ed in the Wall Street Journal, Andreessen put the most optimistic spin on Silicon Valley’s tendencies. The article proclaimed that tech companies are consuming vast swaths of the economy, from books and movies to financial services to agriculture to national defense—which Andreessen saw as the healthful scavenging of a carrion way of life. On Twitter, he pursued the theme: “Posit a world in which all material needs are provided free, by robots and material synthesizers. . . . Imagine six, or 10, billion people doing nothing but arts and sciences, culture and exploring and learning. What a world that would be,” particularly as “technological progress is precisely what makes a strong, rigorous social safety net affordable.”
I, for one, would love to be doing nothing but "arts and sciences, culture and exploring and learning" while living amid the continuing first-world comforts that have thus far characterized my life. The developed world, however, may be in for some very large rude awakenings.

I have reported at some length (much of it skeptical) on the gushing enthusiasm over VC funding of the Health IT space. Apropos of that topic: the book I just finished.

Highly recommended. A thoughtful, disturbing read. My Amazon review.
I read a ton of books, and cite/excerpt/link them on my main blog. Rarely have I encountered one as oddly enjoyable as this one. Amiably well-written, amply documented, throwing down clear logic and plausible scenario conjectures (based on his -- always guarded -- interpretations of his broad and deep documentation).

Humans seem to be sleepwalking into a future for which they are largely unprepared. Will we slide into a dystopian techno-neo-feudalism when the jobs largely evaporate across nearly all sectors, including the cognitive "knowledge worker" domains thought to be relatively bulletproof? We are all in this together -- much as our narcissistic one-tenth-of-one-percenters like to deny it. "Wealth" is a relative thing, much as they prefer to deny that as well.

My initial interest in this book (based on an NPR interview I heard) was his take on health care (my area -- Health IT), which I will be writing up forthwith. But the larger problem is equally if not more compelling. How will we all "earn our keep" if reliable employment diminishes by half or more, regardless of our skills and initiative? Are we headed for "Elysium"?
I came for the healthcare chapter, but the entire book is a major wake-up call. I will let the author speak for himself, excerpting his Introduction.
...[The] “golden age” of the American economy was characterized by a seemingly perfect symbiosis between rapid technological progress and the welfare of the American workforce. As the machines used in production improved, the productivity of the workers operating those machines likewise increased, making them more valuable and allowing them to demand higher wages. Throughout the postwar period, advancing technology deposited money directly into the pockets of average workers as their wages rose in tandem with soaring productivity. Those workers, in turn, went out and spent their ever-increasing incomes, further driving demand for the products and services they were producing.

As that virtuous feedback loop powered the American economy forward, the profession of economics was enjoying its own golden age. It was during the same period that towering figures like Paul Samuelson worked to transform economics into a science with a strong mathematical foundation. Economics gradually came to be almost completely dominated by sophisticated quantitative and statistical techniques, and economists began to build the complex mathematical models that still constitute the field’s intellectual basis. As the postwar economists did their work, it would have been natural for them to look at the thriving economy around them and assume that it was normal: that it was the way an economy was supposed to work— and would always work…

There are good reasons to believe that America’s economic Goldilocks period has likewise come to an end. That symbiotic relationship between increasing productivity and rising wages began to dissolve in the 1970s. As of 2013, a typical production or nonsupervisory worker earned about 13 percent less than in 1973 (after adjusting for inflation), even as productivity rose by 107 percent and the costs of big-ticket items like housing, education, and health care have soared…

On January 2, 2010, the Washington Post reported that the first decade of the twenty-first century resulted in the creation of no new jobs. Zero. This hasn’t been true of any decade since the Great Depression; indeed, there has never been a postwar decade that produced less than a 20 percent increase in the number of available jobs. Even the 1970s, a decade associated with stagflation and an energy crisis, generated a 27 percent increase in jobs. The lost decade of the 2000s is especially astonishing when you consider that the US economy needs to create roughly a million jobs per year just to keep up with growth in the size of the workforce. In other words, during those first ten years there were about 10 million missing jobs that should have been created— but never showed up.

Income inequality has since soared to levels not seen since 1929, and it has become clear that the productivity increases that went into workers’ pockets back in the 1950s are now being retained almost entirely by business owners and investors. The share of overall national income going to labor, as opposed to capital, has fallen precipitously and appears to be in continuing free fall. Our Goldilocks period has reached its end, and the American economy is moving into a new era.

It is an era that will be defined by a fundamental shift in the relationship between workers and machines. That shift will ultimately challenge one of our most basic assumptions about technology: that machines are tools that increase the productivity of workers. Instead, machines themselves are turning into workers, and the line between the capability of labor and capital is blurring as never before.

All this progress is, of course, being driven by the relentless acceleration in computer technology. While most people are by now familiar with Moore’s Law— the well-established rule of thumb that says computing power roughly doubles every eighteen to twenty-four months— not everyone has fully assimilated the implications of this extraordinary exponential progress…

It’s a good bet that nearly all of us will be surprised by the progress that occurs in the coming years and decades. Those surprises won’t be confined to the nature of the technical advances themselves: the impact that accelerating progress has on the job market and the overall economy is poised to defy much of the conventional wisdom about how technology and economics intertwine.

One widely held belief that is certain to be challenged is the assumption that automation is primarily a threat to workers who have little education and lower-skill levels. That assumption emerges from the fact that such jobs tend to be routine and repetitive. Before you get too comfortable with that idea, however, consider just how fast the frontier is moving. At one time, a “routine” occupation would probably have implied standing on an assembly line. The reality today is far different. While lower-skill occupations will no doubt continue to be affected, a great many college-educated, white-collar workers are going to discover that their jobs, too, are squarely in the sights as software automation and predictive algorithms advance rapidly in capability.

The fact is that “routine” may not be the best word to describe the jobs most likely to be threatened by technology. A more accurate term might be “predictable.” Could another person learn to do your job by studying a detailed record of everything you’ve done in the past? Or could someone become proficient by repeating the tasks you’ve already completed, in the way that a student might take practice tests to prepare for an exam? If so, then there’s a good chance that an algorithm may someday be able to learn to do much, or all, of your job. That’s made especially likely as the “big data” phenomenon continues to unfold: organizations are collecting incomprehensible amounts of information about nearly every aspect of their operations, and a great many jobs and tasks are likely to be encapsulated in that data— waiting for the day when a smart machine learning algorithm comes along and begins schooling itself by delving into the record left by its human predecessors.

The upshot of all this is that acquiring more education and skills will not necessarily offer effective protection against job automation in the future. As an example, consider radiologists, medical doctors who specialize in the interpretation of medical images. Radiologists require a tremendous amount of training, typically a minimum of thirteen years beyond high school. Yet, computers are rapidly getting better at analyzing images. It’s quite easy to imagine that someday, in the not too distant future, radiology will be a job performed almost exclusively by machines.

In general, computers are becoming very proficient at acquiring skills, especially when a large amount of training data is available. Entry-level jobs, in particular, are likely to be heavily affected, and there is evidence that this may already be occurring. Wages for new college graduates have actually been declining over the past decade, while up to 50 percent of new graduates are forced to take jobs that do not require a college degree. Indeed, as I’ll demonstrate in this book, employment for many skilled professionals— including lawyers, journalists, scientists, and pharmacists— is already being significantly eroded by advancing information technology. They are not alone: most jobs are, on some level, fundamentally routine and predictable, with relatively few people paid primarily to engage in truly creative work or “blue-sky” thinking.

As machines take on that routine, predictable work, workers will face an unprecedented challenge as they attempt to adapt. In the past, automation technology has tended to be relatively specialized and to disrupt one employment sector at a time, with workers then switching to a new emerging industry. The situation today is quite different. Information technology is a truly general-purpose technology, and its impact will occur across the board. Virtually every industry in existence is likely to become less labor-intensive as new technology is assimilated into business models— and that transition could happen quite rapidly…

All of this suggests that we are headed toward a transition that will put enormous stress on both the economy and society. Much of the conventional advice offered to workers and to students who are preparing to enter the workforce is likely to be ineffective. The unfortunate reality is that a great many people will do everything right— at least in terms of pursuing higher education and acquiring skills— and yet will still fail to find a solid foothold in the new economy.

Beyond the potentially devastating impact of long-term unemployment and underemployment on individual lives and on the fabric of society, there will also be a significant economic price. The virtuous feedback loop between productivity, rising wages, and increasing consumer spending will collapse. That positive feedback effect is already seriously diminished: we face soaring inequality not just in income but also in consumption. The top 5 percent of households are currently responsible for nearly 40 percent of spending, and that trend toward increased concentration at the top seems almost certain to continue. Jobs remain the primary mechanism by which purchasing power gets into the hands of consumers. If that mechanism continues to erode, we will face the prospect of having too few viable consumers to continue driving economic growth in our mass-market economy.

As this book will make clear, advancing information technology is pushing us toward a tipping point that is poised to ultimately make the entire economy less labor-intensive. However, that transition won’t necessarily unfold in a uniform or predictable way. Two sectors in particular— higher education and health care— have, so far, been highly resistant to the kind of disruption that is already becoming evident in the broader economy. The irony is that the failure of technology to transform these sectors could amplify its negative consequences elsewhere, as the costs of health care and education become ever more burdensome…

Ford, Martin (2015-05-05). Rise of the Robots: Technology and the Threat of a Jobless Future. Basic Books. Kindle Edition. [Locations 55 - 152]
A great review of Martin Ford's book at Fast Company:
Yes, Robots Really Are Going To Take Your Job And End The American Dream
Now that machines can diagnose cancer, trade stocks, and write symphonies, they're not just going to make humans more efficient as they have in the past—they are replacing them entirely and wrecking the economy along the way.

There's nothing new about fears of technological unemployment. The idea goes back to the Luddites in 18th-century England and John Maynard Keynes in the 1930s. Union bosses have long railed against factory automation, and governments have even resisted technology to maintain higher job levels. Yet predictions that machines would put humans out of work on a significant societal scale have never quite materialized.

However, there's reason to believe that, unlike those previous times, we really are entering an age when people will work less. As author Martin Ford puts it in his recent book Rise of the Robots, "this time is different." New artificially intelligent machines, he says, are not so much tools to improve the efficiency of workers but really are tools to replace workers themselves...

Surveying all the fields now being affected by automation, Ford makes a compelling case that this is an historic disruption—a fundamental shift from most tasks being performed by humans to one where most tasks are done by machines. That includes obvious things like moving boxes around a warehouse, but also many "higher skill" jobs as well, such as radiology and stock trading. And don't kid yourself about your own importance: that list almost certainly includes your job.

We really could be headed for an economy with many fewer jobs in it and a severely eroded middle class, he argues. Together with other important trends like wealth inequality and globalization, new technology threatens to produce more unemployment and slow the main motor of the U.S. economy—consumer demand...

Of course, those people doing the automating may stand to do well financially—but perhaps not for long. Ultimately, Ford argues, complete automation will be bad for the economy because machines don't consume goods and services the way human beings do. The "powerful symbiosis between rising incomes and robust, broad-based consumer demand is now in the process of unwinding," he says...

The standard response to automation among economists has been to call for more education, so low-paid workers can move up the food chain. But Ford doesn't think that will help ultimately. Many people are already over-educated for what they do—just look at all the college graduates serving coffee in Starbucks.

Ford says cramming everyone into jobs requiring more skills is "analogous to believing that, in the wake of the mechanization of agriculture, the majority of displaced farm workers would be able to find jobs driving tractors." Nor can we hope to stop the automation wave, he says. There's an inevitability to these technologies, and it's inevitable that businesses will take advantage. Whatever employers might say publicly, they don't really want to hire more people than they need...

See my post of last year, "Aye, Robot."

Will doctors and nurses (along with the broader allied health workforce) become casualties of increasingly widespread deployment of robotization and artificial intelligence? Read again my April 22nd post "Nurses and doctors in the trenches." Scroll down and re-read in particular the section on the Weeds' "Medicine in Denial."

Back to the New Yorker Andreessen article:
Last year, a programmer named Alex Payne wrote an open letter to Andreessen in which he observed, “People are scared of so much wealth and control being in so few hands. Consequently, wherever you and other gatekeepers of capital direct your attention—towards robots, 3D printers, biotech, whatever—you’re going to detect a fearful response as people scramble to determine the impact of your decisions and whims,” which only compound “lingering structural unemployment and an accumulation of capital at the top of the economic pyramid.”

Payne addressed his thoughts to Andreessen because Andreessen represents the Valley—both in its soaring vision and in its tendency to treat people as a fungible mass. But Andreessen waved away the criticisms as the ravings of “a self-hating software engineer.”...

Global unemployment is rising, too—this seems to be the first industrial revolution that wipes out more jobs than it creates. One 2013 paper argues that forty-seven per cent of all American jobs are destined to be automated. Andreessen argues that his firm’s entire portfolio is creating jobs, and that such companies as Udacity (which offers low-cost, online “nanodegrees” in programming) and Honor (which aims to provide better and better-paid in-home care for the elderly) bring us closer to a future in which everyone will either be doing more interesting work or be kicking back and painting sunsets. But when I brought up the raft of data suggesting that intra-country inequality is in fact increasing, even as it decreases when averaged across the globe—America’s wealth gap is the widest it’s been since the government began measuring it—Andreessen rerouted the conversation, saying that such gaps were “a skills problem,” and that as robots ate the old, boring jobs humanity should simply retool...
Easy for him to say. He won't be missing any meals.

Chapter 6 of Rise of the Robots ("The Health Care Challenge") is so good, I could easily end up citing all of it, which would put me at risk of Fair Use excess. Buy it and read it. You won't be disappointed.

I'll just skip over to the next chapter.
Chapter 7

YouTube was founded in 2005 by three people. Less than two years later, the company was purchased by Google for about $1.65 billion. At the time of its acquisition, YouTube employed a mere sixty-five people, the majority of them highly skilled engineers. That works out to a valuation of over $25 million per employee. In April 2012, Facebook acquired photo-sharing start-up Instagram for $1 billion. The company employed thirteen people. That’s roughly $77 million per worker. Fast-forward another two years to February 2014 and Facebook once again stepped up to the plate, this time purchasing mobile messaging company WhatsApp for $19 billion. WhatsApp had a workforce of fifty-five— giving it a valuation of a staggering $345 million per employee.

Soaring per-employee valuations are a vivid demonstration of the way accelerating information and communications technology can leverage the efforts of a tiny workforce into enormous investment value and revenue. What’s more, they offer compelling evidence for how the relationship between technology and employment has changed. There is a widely held belief— based on historical evidence stretching back at least as far as the industrial revolution— that while technology may certainly destroy jobs, businesses, and even entire industries, it will also create entirely new occupations, and the ongoing process of “creative destruction” will result in the emergence of new industries and employment sectors— often in areas that we can’t yet imagine. A classic example is the rise of the automotive industry in the early twentieth century, and the corresponding demise of businesses engaged in manufacturing horse-drawn carriages. 

As we saw in Chapter 3, however, information technology has now reached the point where it can be considered a true utility, much like electricity. It seems nearly inconceivable that successful new industries will emerge that do not take full advantage of that powerful new utility, as well as the distributed machine intelligence that accompanies it. As a result, emerging industries will rarely, if ever, be highly labor-intensive. The threat to overall employment is that as creative destruction unfolds, the “destruction” will fall primarily on labor-intensive businesses in traditional areas like retail and food preparation, while the “creation” will generate new businesses and industries that simply don’t hire many people. In other words, the economy is likely on a path toward a tipping point where job creation will begin to fall consistently short of what is required to fully employ the workforce. [op cit pp. 175-176].
Consider General Motors by way of comparison. Current market cap, ~$57.18 billion, 202,000 employees. A valuation of roughly $283,000 per employee.
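The per-employee arithmetic here is simple enough to check for yourself. A quick sketch, using only the valuation and headcount figures cited in this post:

```python
# Per-employee valuation: acquisition price (or market cap) divided by headcount.
# All figures are the ones cited in the post above.
deals = {
    "YouTube (2006 acquisition)": (1.65e9, 65),
    "Instagram (2012 acquisition)": (1e9, 13),
    "WhatsApp (2014 acquisition)": (19e9, 55),
    "General Motors (2015 market cap)": (57.18e9, 202_000),
}

for name, (valuation, headcount) in deals.items():
    print(f"{name}: ${valuation / headcount:,.0f} per employee")
```

Run as-is, this reproduces the roughly $25 million, $77 million, $345 million, and $283,000 figures quoted above.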

The Robot will see you now -- assuming you can pay.

More Martin Ford:
Is Economic Growth Sustainable as Inequality Soars? 

As we’ve seen, overall consumer spending in the United States has so far continued to grow even as it has become ever more concentrated, with the top 5 percent of households now responsible for nearly 40 percent of total consumption. The real question is whether that trend is likely to be sustainable in the coming years and decades, as information technology continues its relentless acceleration.

While the top 5 percent have relatively high incomes, the vast majority of these people are heavily dependent on jobs. Even within these top-tier households, income is concentrated to a staggering degree; the number of genuinely wealthy households— those that can survive and continue spending entirely on the basis of their accumulated wealth— is far smaller. During the first year of recovery from the Great Recession, 95 percent of income growth went to just the top 1 percent.

The top 5 percent is largely made up of professionals and knowledge workers with at least a college degree. As we saw in Chapter 4, however, many of these skilled occupations are squarely in the crosshairs as technology advances. Software automation may eliminate some jobs entirely. In other cases, the jobs may end up being deskilled, so that wages are driven down. Offshoring and the transition to big data–driven management approaches that often require fewer analysts and middle managers loom as other potential threats for many of these workers. In addition to directly impacting households that are already in the top tier, these same trends will also make it harder for younger workers to eventually move up into positions with comparable income and spending levels.

The bottom line is that the top 5 percent is poised to increasingly look like a microcosm of the entire job market: it is at risk of itself being hollowed out. As technology progresses, the number of American households with sufficient discretionary income and confidence in the future to engage in robust spending could well continue to contract. The risk is further increased by the fact that many of these top-tier households are probably more financially fragile than their incomes might suggest. [ibid, pp. 212-213].
The author concludes:
In the worst-case scenario, a combination of widespread economic insecurity, drought, and rising food prices could eventually lead to social and political instability. 

The greatest risk is that we could face a “perfect storm”— a situation where technological unemployment and environmental impact unfold roughly in parallel, reinforcing and perhaps even amplifying each other. If, however, we can fully leverage advancing technology as a solution— while recognizing and adapting to its implications for employment and the distribution of income— then the outcome is likely to be far more optimistic. Negotiating a path through these entangled forces and crafting a future that offers broad-based security and prosperity may prove to be the greatest challenge for our time. [ibid, pp. 283-284].
Yeah, "drought." I've been all over that as well lately.

See what you think about Rise of the Robots. Listen to the NPR Fresh Air interview with the author that I put in my prior post if you've not already done so.



Among the thinkers/writers that Martin Ford has cited in Rise of the Robots that I have cited and excerpted on this blog are Peter Thiel (Zero to One) and Nicholas Carr (The Glass Cage) (scroll down). Some useful triangulations therein, as you might also tangentially find in my March 9th, 2015 post.

In his Fast Company article cited above, Ben Schiller notes that
[Martin Ford closes by making] the case for a basic income guarantee—a government payment to all citizens so they can live to a reasonable level. His version would be tied to educational accomplishment. People who get at least a high school diploma would get slightly more money, on the thinking that not having at least a diploma in the future economy will make people even less employable than they are today. He suggests $10,000 per person (which is lower than many other proposals), which would cost about $1 trillion overall, provided the payment was means-tested at the top-end.

This might become an economic necessity, he says, if work is no longer an option for large numbers of people. "If we look into the future and assume that machines will eventually replace human labor to a substantial degree, then I think some form of direct redistribution of purchasing power becomes essential if economic growth is to continue."
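Schiller's cited numbers invite a quick back-of-envelope check. In the sketch below, the $10,000 grant and the roughly $1 trillion net cost come from the passage above; the adult-population figure, and the framing of the means test as the share of adults who would receive the full payment, are my own illustrative assumptions, not Ford's:

```python
# Back-of-envelope on the basic income guarantee figures quoted above.
grant = 10_000   # annual payment per person, per the passage above
adults = 240e6   # rough 2015 US adult population (illustrative assumption)

gross_cost = grant * adults               # cost if every adult got the full grant
target_net = 1e12                         # the ~$1 trillion figure cited above
covered_share = target_net / gross_cost   # implied share receiving the full grant

print(f"Gross cost, no means test: ${gross_cost / 1e12:.1f} trillion")
print(f"Implied covered share at ~$1T net: {covered_share:.0%}")
```

Under these assumptions the untested gross cost is about $2.4 trillion, so landing near $1 trillion implies the top-end means test phases the payment out for well over half of adults. The price tag is very sensitive to where that cutoff sits.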
Yeah. Try selling that idea to the GOP currently in control of Congress. And, should the GOP take the White House in 2016 while keeping control of The Hill, we'll be headed oligarchically (suicidally?) in the other direction. We seem firmly in the grip of the "Persecutor" phase of the sociopolitical Rescuer-Victim-Persecutor Triangle.

Ford used the word "plutonomy" in his book. It fits.


The words "conundrum," "dilemma," "quandary," and "paradox" come to mind. The short-term "First Mover Advantage" accruing to those implementing full-bore tech automation, so nicely set forth in Martin Ford's book, cannot but be unsustainable. An unrestrained "free market über alles" ethos inescapably bears the DNA of its own demise. Our Calvinist heritage may now be past its sell-by date.


Pretty much sums up Martin Ford's book.

Good topical resource on the Wiki, btw: "Technological Unemployment"
Technological unemployment is unemployment primarily caused by technological change. Early concern about technological unemployment was exemplified by the Luddites, textile workers who feared that automated looms would allow more productivity with fewer workers, leading to mass unemployment. But while automation did lead to textile workers being laid off, new jobs in other industries developed. Due to this shift of labor from automated industries to non-automated industries, technological unemployment has been called the Luddite fallacy.

Modern proponents of the technological unemployment concept argue that productivity has been decoupling from employment throughout the 21st century, as increasing numbers of industries are automating simultaneously. They point to studies showing that the job losses are most concentrated in occupations involving routine physical and mental labor, the jobs that are easiest to automate. The chief insight is that even though it is true that new types of jobs can always develop, the skills of most people may not be adequate to fill many of them. Various ideas on how to bypass this problem exist...

From an L.A. Times review article:
[W]hat's most interesting about "Rise of the Robots" is that it isn't actually a narrative about the imminent triumph of soulless automatons. Robots aren't the enemy. The real villain here is capitalism: a stupid form of capitalism that seems dead set on destroying itself. It would be ironic if it weren't our own future that was in peril. The relentless drive by capital to cut costs and boost profits is threatening to destroy the wellspring of economic growth that capitalism requires. Because when there are no jobs for humans, there will be no consumers with the disposable income to buy the products being produced so efficiently by robots. Henry Ford understood this when he paid his workers high enough wages to buy his cars. Today's titans of the economy appear to have forgotten the lesson...
"Winner-take-all" means winner loses all in the end, fevered, sophomoric, narcissistic Libertarian fantasies notwithstanding.



More to come...

Wednesday, May 20, 2015

Belated Happy Birthday to Me: The KHIT blog turns 5

Forgot to mention this. I started this blog on May 10, 2010, shortly after coming back to work at HealthInsight after they'd won the Meaningful Use REC contract for Nevada and Utah.

Notwithstanding that I "retired" two years ago, sold the house in Vegas, and moved back to the Bay Area where my wife works, I keep plugging away. No sponsors, no monetization of any sort. I just find the aggregate subject matter of continuing importance, and, as the blog approaches 400,000 hits, I guess others do as well.

Thanks for reading.

I just got back from a quick vacation back to Las Vegas, and will soon have another book to report on.

The Health IT implications are obvious.
The total amount of information that could potentially be useful to a physician attempting to diagnose a particular patient’s condition or design an optimal treatment strategy is staggering. Physicians are faced with a continuous torrent of new discoveries, innovative treatments, and clinical study evaluations published in medical and scientific journals throughout the world.
Ford, Martin (2015-05-05). Rise of the Robots: Technology and the Threat of a Jobless Future (p. 147). Basic Books. Kindle Edition.
A bracing read. Stay tuned.

The author interviewed on NPR's "Fresh Air"


This is FRESH AIR. I'm Terry Gross.
Remember when you first saw a self-checkout aisle at a grocery store? We use them all the time now without giving much thought to the fact that they're doing work real people used to do. Our guest, Martin Ford, says that's something we'll have to think about more as advances in computer software dramatically expand the roles robots may soon play in our lives and the economy. In his new book, Ford says robots are beginning to perform white-collar jobs, like writing business stories and doing legal analysis. And, he says, robots are also being developed to take the place of the low-wage, low-skill service occupations that workers who've lost manufacturing jobs have been forced to take, like preparing fast food. Ford says accelerating robot technology could have far-reaching social and economic impacts, as even educated workers can't find employment, and their declining incomes lead to a collapse of consumer demand. Martin Ford is the founder of a Silicon Valley-based software development firm and the author of an earlier book about automation and the economy. FRESH AIR contributor Dave Davies spoke to Ford about his new book, "Rise Of The Robots: Technology And The Threat Of A Jobless Future."
Great stuff. Full transcript here.

And, oh, yeah, check out my new drought page.

More to come...

Thursday, May 14, 2015

"Clinical software requires models of work, and not simply models of data"

"Electronic records, in concept and rendition, address most directly those aspects of clinical care that are addressed by paper charts. Attempting to force EHR systems to exceed their original design conceptualizations will only continue to disrupt the work of busy clinicians while delaying serious investigation into better metaphors for clinical software designs."
I closely follow the posts of Jerome Carter, MD over at his blog, an excellent resource. The foregoing comes from his May 11th, 2015 post. While the ARRA/HITECH Meaningful Use program is not getting much love these days (particularly in Congress), Dr. Carter has a differing, more nuanced view.
"By encouraging adoption of HIT, the EHR incentive programs have forced everyone to deal with the realities of current EHR system designs. In fact, this might be one of the most significant outcomes of the entire program. Now that there is a general acceptance that HIT needs to evolve in order to better support clinical work, more research is becoming available that details how clinical professionals work and the nature of their information management needs.

Prior to the incentive programs, there was a mostly unchallenged assumption that EHR systems were capable of supporting clinical work needs. The national experiment in EHR adoption has proven this assumption to be false — but not in a bad way (I have no interest in bashing current products). Rather, in disproving the initial assumption, the incentive programs have engendered a level of research into clinical work, interoperability, safety, and clinical software design that I doubt would have ever happened absent the ONC’s efforts..."
One of my principal jobs while with my REC was assisting my Meaningful Use participant clients with workflow adjustments, so that they could assimilate the criteria into their normal workflows without incurring additional labor expense just to chase the incentive money (while also trying to help them improve the efficiency and effectiveness of their clinical processes more broadly, in pursuit of the longer-term, bigger picture).


I wish I could say I was routinely effective in this regard, but, candidly, my wins were relatively few and far between. Most of my MU caseload comprised economically stressed, small-shop (1-3 docs) outpatient primary care (including a bunch of Medicaid). They were mostly content to just get through the day, with little time or inclination to devote to QI initiatives.

I suspect that's still largely the case.

Click-through path-shortening to get at the MU criteria more quickly was about the only thing staff and docs were interested in (I used to gripe to ONC and vendors that every MU numerator data field could and should be one-click macro-accessible). And, given that our REC "milestone" funding was tied to getting clients to Attestation, our short-term incentive was to play ball. Lofty talk about significant and sustainable clinical workflow and outcomes improvements remained just that -- high-minded QIO and REC talk.

The rigidities of the "certified" EHRs didn't help matters, nor did the fact that, as a "vendor neutral" REC, we had to support about 40 platforms -- all replete with vendors' non-disclosure agreements (I had about eight of them in my book, mostly eCW and e-MDs, along with a smattering of onesies and twosies).

More Jerome Carter:
HIT vendors could roll their own workflow tech or build static workflow capability into systems; however, the day when either made sense has surely passed. EHR systems have static workflows and adding new code to current EHR systems every time a new work support need arises makes no sense. Workflow technology is the only currently available approach that makes the problem tractable. The downside of this realization is that current systems would need substantial rewrites to make use of workflow tech; the upside is that the future is coming — no matter what.
I hope Dr. Carter gets some national traction with his work. I am a bit dismayed that the research into "clinical work support" has yet to become the priority it should be. And, I have to take seriously Margalit Gur-Arie's recent assertion that our out-of-focus "interoperability" obsession may be The Tail Wagging The Dog.

On the interop topic,
Stiff interoperability penalties in new 21st Century Cures Act
By Darius Tahir  | May 13, 2015

WASHINGTON—Electronic health-record vendors would have until Jan. 1, 2018, before facing stiff penalties for a lack of data-sharing in the newest version of the 21st Century Cures Act.

The addition is the latest to the smorgasbord of incentives, appropriations, deregulations and regulatory streamlining that make up the 21st Century Cures Act, a legislative package developed by the House Energy & Commerce Committee and intended to kick-start biomedical innovation.

The revised bill—released Wednesday morning—is difficult to digest in one sitting, and touches on many areas of controversy. For example, consumer advocates may be concerned by the restoration of exclusivity periods for drugs treating rare or pediatric diseases, which had drawn criticism in previous iterations of the bill.

But the most significant addition may be the tough interoperability provisions for EHR vendors, intended to ensure that there are no technical or economic barriers obstructing providers from sharing clinical data... 
Interesting that the JAMIA research paper Dr. Carter cites in his post also mentions "interoperability."


Participants emphasized the need to exchange clinical data with providers within and outside of their own PCMH networks. All PCMH participants expressed frustration with and concern about their ability to appropriately coordinate care and meet PCMH reporting requirements using current HIE. This was particularly challenging for each of the two PCMH organizations (A and B) in which practice sites could use different EHRs. Although both PCMHs were sending clinical data to partnering RHIOs, participants from PCMHs and RHIOs each discussed having to dedicate resources to clean and harmonize CCD-based data. The challenges at these two organizations led the PCMH participants and non-vendor stakeholders to conclude that their quality and cost-containment efforts would have been more effective had each PCMH mandated use of the same EHRs. PCMH C noted that its health system’s providers had difficulty exchanging data with specialists outside the system. EHR vendor participants also recognized there were barriers to interoperability using today’s CCD standard, which could limit care coordination among PCMHs.

PCMH participants expressed a need for low-cost solutions for HIE interfaces to achieve effective interoperability. Given that PCMHs A and B maintained multiple EHRs, they noted that the costs were too high for them to pay vendors to develop and maintain HIE interfaces between those EHRs. Still, the reported need for lower-cost HIE interfaces extended to PCMH C, which used a single EHR system: “The maintenance of all those interfaces – the costs are ridiculous …”
...As discussed previously, participants described a growing need to support data sharing efforts among PCMHs both within and outside of their health systems, for collaborative efforts. However, challenges with interoperability – including uniform agreement on the types of data to collect, how the data would be collected, and how the data would be stored for eventual reporting requirements – impede such data sharing. This issue speaks to acknowledged needs for increased standardization in data representations...
We conducted interviews with PCMHs, health IT vendors, and associated stakeholders to assess, based on their experiences, how health IT could better support future efforts towards care coordination. We discovered important needs for monitoring tools that enable PCMHs to analyze patient populations; notification tools that enable PCMHs and care managers to detect or be alerted to patterns in their patient populations; collaboration tools that bring together providers and non-providers within as well as across care settings; data extraction tools that enable more efficient reporting; and robust HIE that enables these activities to be achieved efficiently and economically. We encourage further efforts between PCMHs and health IT vendors to focus on these areas of needs in order to improve ongoing and future care coordination efforts.
A good paper, but I found myself chafing at the lack of any detailed discussion of what precisely constitutes a "clinical support model" -- one going, for purposes here, to the increasing imperative of "care coordination." Many industrial/manufacturing workflows, and even many service-sector workflows, are relatively easy to model and map: task variation requiring extensive logic "conditionals" (branching and looping amid sequential and parallel tasks) comprises a relatively small exception to the aggregate start-to-finish process flow, rather than the norm. Some would argue that in health care such variation is instead the prevalent, irreducible circumstance. It is that apparent latter reality that leads, IMO, to frustration with the "hard-coding rigidity" of current-generation EHRs, which are neither modular at the task level nor endlessly and effortlessly "Lego block" re-configurable.

Another concern is an old one: the "productivity treadmill" reimbursement system that continues to stymie the best efforts at rational and effective "clinical support model" health IT.

Finally, if you're going to derive progressive "models" leading to revised, putatively more efficient and effective "clinical care" policies and procedures devolving down to staff in the trenches, you'd better be able to sell them to staff in language they can grasp and willingly, productively implement. And, ideally, under the "Lean" philosophy to which I subscribe, in-the-trenches staff should be the ones identifying the care support models in the first place.


My side project blog concerning the California drought continues.


Lordy. Seriously, dude?

More to come...

Monday, May 11, 2015

So-called CQMs: They're "Process Indicators," not "Clinical Quality Measures"

Last week I attended a webinar conducted by Dr. John Toussaint. The content went to material in his forthcoming book, "Management on the Mend."

Five years after his debut book, On the Mend, showed how a large, cradle-to-grave health system revolutionized the way care is delivered, Dr. John Toussaint returns with news for healthcare leaders. There is a right way to go about such a transformation. And senior leaders need to be far more intimately involved.

While studying and assisting hundreds of organizations transitioning to lean healthcare, Dr. Toussaint witnessed many flaws and triumphs. Those organizations that win – creating better value for patients while removing waste and cost in the system – have senior managers that lead by example at the frontline of care. The best health systems have also discovered ways to engage everyone in solving problems and embracing change.

Management on the Mend is the result of years of investigations by Dr. Toussaint and dozens of healthcare organizations around the world. Using their collective experiences, he has built a model for lean transformations that work. This book describes the model, step by step, through people in 11 organizations who are doing the work. It is the story of many journeys and one conclusion: lean healthcare is not only possible, it is necessary.

As senior leaders look ahead to a future that includes radical changes in patient populations, the economics of healthcare, and patient expectations, everyone knows that health systems must be agile to survive. In order to thrive, they must be able to continuously improve. Here is the roadmap for that future.
During the post-presentation Q and A, someone asked about the much unloved CQM compliance. Dr. Toussaint responded by calling them what they really are -- "process indicators." e.g., what percentage of the time did a provider or clinical organization do x, y, or z. (A multitude of x's, y's, and z's actually.)

Whether doing so contributed to improved individual patient outcomes at the point of care or not.
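To make the "process indicator" point concrete, here's a minimal Python sketch of the numerator/denominator arithmetic these measures boil down to. The function name and the numbers are purely illustrative, not from any actual CQM specification:

```python
# A CQM-style "process indicator" is just a compliance percentage:
# numerator = encounters where the measured action was documented,
# denominator = encounters eligible for the measure.

def process_indicator_rate(numerator: int, denominator: int) -> float:
    """Return the percentage of eligible encounters in which the
    measured action (the 'x, y, or z') was documented."""
    if denominator == 0:
        return 0.0
    return 100.0 * numerator / denominator

# e.g., a screening documented in 180 of 200 eligible visits
rate = process_indicator_rate(180, 200)
print(f"{rate:.1f}%")  # 90.0% -- which says nothing about any one patient's outcome
```

Note that nothing in the computation touches what happened to the patient afterward -- which is precisely Dr. Toussaint's point.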

The CMS web page spiel on CQMs:
Clinical Quality Measures Basics

Clinical quality measures, or CQMs, are tools that help measure and track the quality of health care services provided by eligible professionals, eligible hospitals and critical access hospitals (CAHs) within our health care system. These measures use data associated with providers’ ability to deliver high-quality care or relate to long term goals for quality health care. CQMs measure many aspects of patient care including:

  • health outcomes
  • clinical processes
  • patient safety
  • efficient use of health care resources
  • care coordination
  • patient engagement
  • population and public health
  • adherence to clinical guidelines
Measuring and reporting CQMs helps to ensure that our health care system is delivering effective, safe, efficient, patient-centered, equitable, and timely care.

To participate in the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs and receive an incentive payment, providers are required to submit CQM data from certified EHR technology.
Gotta love how they position "health outcomes" at the top of their bullet list. In the actual current CQM tables documents for both EPs (Eligible Professionals - pdf) and EHs (Eligible Hospitals - pdf), however, you only see these:
  • Patient and Family Engagement
  • Patient Safety
  • Care Coordination
  • Population/Public Health
  • Efficient Use of Healthcare Resources
  • Clinical/Process Effectiveness
    "Clinical/Process Effectiveness?"

    Like Dr. Toussaint said, they're "process indicators," and the "effectiveness" part goes mostly to "what % of the time did you do these?" "How 'effective' is your compliance?" Calling them "quality measures" is a stretch (as they pertain to individual health outcomes).

    Just like "interoperability" is a misnomer (my "interoperababble" rant). Just like "Meaningful Use" is a misnomer -- what we really need is "Effective Use," from the POV of the patient.

    The obvious intent and big-picture expectation is that by doing x, y, and z (and the voluminous additional "a" through "v" of CQMs), aggregate/statistical patient outcomes will eventually improve.

    Critics of CQMs (including critics of those process indicator cousins in the equally unloved PQRS initiative) complain that these compliance measures are actually detrimental to individual patient care, given the additional workflow burdens they impose -- that the paltry "incentive" funds are a net loser. Time and money and staff are finite; resources spent obsessively fiddling with process indicators compliance are resources not accorded the patient on the exam table or in the acute care bed.

    I remain conflicted about these things. A lot of smart, experienced people had their learned and lengthy say in the promulgation of such initiatives. And, while I sometimes irascibly call things like CQMs and other compliance measures "Quadrant Three" activities ("urgent, but not important"), even that is simplistic. ("All models are wrong; some models are useful.") Urgency and importance are continuous-scale, dynamic, contextual phenomena (that pesky "variation" that dogs all QI efforts). Nonetheless, we must always ask "urgent for whom?" "important to whom?" "meaningful to whom?"

    Moreover, inexorably, "he who pays the piper calls the tune." You don't want to do this stuff? Go Direct-Pay / Concierge (not that doing so will totally absolve you of all regulatory compliance, all your griping about "autonomy" notwithstanding).

    Dr. Toussaint was unfazed by the webinar participant's question. Implicit in his response was more or less "All the more reason to go Lean. You'll free up time for process compliance measures that aren't going away anytime soon."

    I'm looking forward to getting and reading (and reporting on) the new book. I was all over his first book "On the Mend" five years ago on this blog in my second post.

    I've closely studied everything he's put out ever since. Enter "Toussaint" in the upper left search cell atop this blog. You'll be scrolling for quite some time.

    I'll be covering the 2015 Lean Healthcare Summit in Dallas in early June, where Dr. Toussaint will be keynoting.

    From last week's webinar deck.

    "Courage to speak the truth." "Lead with humility." Music to these ears. "Talking Stick," anyone? "Just Culture," anyone? See also, specifically "Physician, Health Thy System."

    And, "Doctors and nurses in the trenches." To wit:
    Their hard-won, sophisticated, indispensable clinical skills aside, nurses and physicians are just people like the rest of us, people more or less beset by all of the frailties, foibles, insecurities, and neuroses that typically dog us all across the breadth of our lives. The fractious, high-stakes, irreducibly high cognitive burden organizational environments within which they must function are neither of their design nor under their control, and can (and unhappily do) exacerbate interpersonal difficulties that are counterproductive to optimal patient care. I call the syndrome "psychosocial toxicity," and have blogged about it at some length in prior posts.

    It's hardly confined to healthcare, to be sure, but organizational cultural dysfunction in healthcare is ultimately a patient safety issue. To the extent that we continue to view clinical co-workers through "transactional/instrumental," "superior/subordinate" lenses, our improvement efforts will be significantly hampered.


    Started a new blog, wherein I've collated for convenience a bunch of disparate posts on a different topic. See "Western U.S. Drought." I'll be adding resource links as I have time.

    More to come...

    Sunday, May 10, 2015

    On Mother's Day 2015

    I lost my Mom in late November 2011. Just thinking about her along with all the mothers today.

    Wednesday, May 6, 2015

    HELP: "Failing Electronic Health Records Program May Stand in Way of Precision Medicine"

    Interoperababble Update:

    Senators blast EHR program
    Poorly designed EHRs could put precision medicine initiative at risk
    WASHINGTON | May 6, 2015

    Until physicians have EHRs that can talk with one another, the Precision Medicine Initiative introduced by President Barack Obama could be in jeopardy, Sen. Lamar Alexander said Tuesday.

    "We've got to get these records to a place where the systems can talk to one another – that's called interoperability – and also where more doctors, particularly the smaller physicians' offices, want to adopt these systems, can afford the cost and can be confident that their investment will be of value," Alexander said...
    Yeah. Let's review some fundamentals.

    Consider the automobile analogy. A gasoline-powered car is "interoperable" in two senses.
    1. You can gas up anywhere in the nation "without special effort." Consider the fuel to be the car's standard "input data";
    2. The car's literal instrument "dashboard" (displaying operating-parameter "output data") and physical controls (all comprising your "interface" and driving "UX") are essentially "interchangeable" and consequently produce "interoperability" -- notwithstanding the wide variability of ancillary "features" across makes and models (most of them pertaining to comfort and convenience).
    The foregoing are fully consistent with the above-cited core IEEE definitions.

    No such equivalent exists for EHRs. A "standard data dictionary" would go precisely to analogy point #1 above (and, implicitly, to point #2).

    I know, I know; not gonna happen. The barriers, though, are incumbent-deferential political, not technical. What I fear is that policymakers are simply going to "punt," and "define interoperability down," effectively striking the "without special effort" part. Additional workflow burden (along with "dirty data" propagation) may well become the norm, all the current swooning over FHIR® notwithstanding.

    Chronic Health IT (misnomer) "interoperability" concerns aside, is the "EHR program" really "failing"?

    apropos of that concern, last week, this hardcopy came in my mail:

    The featured article title and subtitle read
    The new politics of meaningful use
    As the federal program moves toward its final stage, some politicians are loudly questioning just what it's accomplished after five years and nearly $30 billion -- and many providers are voicing big concerns about what all these rules are really good for
    This issue is not available online (I've searched high and low for this article). I guess it was a HIMSS15 frisbee tabloid. The article, by Neil Versel, is simply excellent: candid, thorough, and fair. In fact, the entire issue is packed with great content.

    My tweets to the Senate HELP committee leadership.


    More to come...

    Tuesday, May 5, 2015

    "We've all been told that data are important..."

    A good one just up at "Healthcare Triage"
    "Frequent lab testing isn't very useful"

    @03:37: "We've all been told that data are important, and that more is the future."
    Below, two antecedent clips referred to in the foregoing:

    From my grad thesis:
    No diagnostic test or screening device is perfect. Errors of omission and commission occur...the definition of an accuracy rate can be done in a few different ways, and these are often confused in casual or uninformed communication...It is an important fact that predictive values do depend on overall prevalence rates...As the prevalence of a condition becomes rare, PPV [“Positive Predictive Value”] drops too, sometimes surprisingly so. For example, a test with sensitivity and specificity each equal to 99% is generally considered quite precise, relative to most diagnostic procedures. Yet for a condition with a not-so-rare prevalence of one per hundred, the odds on being affected [a “true positive”] given a positive test outcome are (.99/.01 × .01/.99) = 1, i.e., among all positive results only 50% are truly affected! For a prevalence rate of one per thousand, the PPV is only about .10. These low numbers raise serious ethical and legal questions concerning action to be taken following positive test outcomes. - Finkelstein & Levin, Statistics for Lawyers.
    Color me firmly Bayesian (as well as a Chebychev-ist).
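The arithmetic in that excerpt is straight Bayes' theorem. A minimal Python sketch, mirroring the excerpt's 99%/99% example (nothing here beyond the standard PPV formula):

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' theorem:
    P(affected | positive) = sens*prev / (sens*prev + (1-spec)*(1-prev))."""
    true_pos = sensitivity * prevalence          # P(positive AND affected)
    false_pos = (1.0 - specificity) * (1.0 - prevalence)  # P(positive AND unaffected)
    return true_pos / (true_pos + false_pos)

# A "99% accurate" test at 1-per-100 prevalence:
print(round(ppv(0.99, 0.99, 0.01), 3))   # 0.5 -- only half the positives are real
# At 1-per-1000 prevalence:
print(round(ppv(0.99, 0.99, 0.001), 3))  # 0.09 -- the excerpt's "only about .10"
```

Same sensitivity, same specificity; only the prevalence changed. That's the whole Bayesian point.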

    More to come...

    Friday, May 1, 2015

    Upstream, downstream; what happens to health when there IS no more stream?

    I've had a pretty good recurrent go at the so-called "upstream" issues pertaining to health, e.g., here and here. The assertion is made that perhaps 90% of human health is attributable to "upstream" factors outside the clinical care delivery system: genetics (to the extent that they are still considered "outside of care delivery"), lifestyle factors, culture, poverty, pollution, and environmental factors more broadly.

    California, where I now live, is in the clutches of a severe and prolonged drought, one possibly lurching toward the catastrophic (statewide, we have less than a year's supply left, by some estimates).

    The image above is from a current New Yorker article "The Dying Sea."
    The Dying Sea
    What will California sacrifice to survive the drought?
    by Dana Goodyear

    ...In early April, the governor of California ordered the state to conserve a million and a half acre-feet of water in the next nine months, a drastic response to an intensifying four-year drought that has devastated small communities in the north, decimated groundwater supplies in the Central Valley, and made the cities fear for the future. To achieve this savings, Californians are starting to forgo some of the givens of life in modern America: long showers, frequent laundering, toilet-flushing, gardening, golf. It can be hard to visualize a quantity of water. An acre-foot is what it takes to cover an acre to the depth of twelve inches: some three hundred and twenty-five thousand gallons. A million acre-feet is about what the city of Los Angeles uses in two years. A million acre-feet, give or take, is also how much runs off to the Salton Sea each year from the farms of the surrounding Imperial Valley. Salty, spent, and full of selenium and phosphates, the excess water flows down to the sea, where, two hundred and thirty feet below sea level, it evaporates under a blistering sun...
    Indeed. In our household, we have reduced the frequency and length of our showers, reduced toilet flushing -- "if it's yellow, let it mellow" -- and routinely capture the water that runs from the taps while awaiting needed hot water. The yard is slowly turning brown, as are the surrounding hills of Contra Costa County. The terms of our lease state that "drought conditions are not considered 'acts of God' for purposes of this agreement, and the lessee shall be responsible for maintaining the vegetation on the property in healthy condition through application of sufficient irrigation watering." While that provision is probably unenforceable in the wake of Governor Brown's recent statewide drought Executive Order, it would not surprise me were my landlord to dun us for "landscape damage" by withholding some of our $3,000 deposit were we to move.

    There are projections that residential water rates may soon triple. Critics of that approach bemoan the anti-regulatory "libertarian" socioeconomic "efficient markets" tactic of "rationing by price," given that the most affluent residents may simply shrug their shoulders and pay. After all, there are tee times to be had, Lexuses to be washed, and pools to remain filled.

    We've been considering buying a house, one closer to my wife's office, but -- current market prices and earthquake considerations aside -- predictions are emerging of a severe housing market downturn by 2017 or so should the drought persist and worsen. At our ages, we don't have time to make a bad housing move and slide -- well, "underwater." We're fortunately now among the very small minority of incipient retirees with money in the bank (including healthy IRAs), positive net worth, no debt, and positive cash flow. A mortgage gone south might well negate all of that.

    Portugal is starting to look pretty good. ;)

    LOL. Everyone should have my problems.

    Back to "The Dying Sea." and "upstream health issues."

    Between the needs of the city and the farmers sits the Salton Sea, which conservation will destroy. “The sea is the linchpin between Colorado River water and urban Southern California,” Michael Cohen, a senior research associate at the Pacific Institute, a water-policy think tank, says. Without the inflows, the sea, already shrinking, will recede dramatically, exposing miles of lake bed loaded with a hundred years’ worth of contaminants. Much of the wildlife will disappear—poisoned, starved, or driven off. The consequences for people around the region could be dire, too. Before irrigation, the valley was plagued by violent dust storms. With the water gone, the lake bed could emit as much as a hundred tons of fine, caustic dust a day, leading to respiratory illness in the healthy and representing an acute hazard for people with compromised immune systems. No one knows how far that dust can travel on the wind. Mary Nichols, the state’s top air-quality official, says, “The nightmare scenario is the pictures we’ve all seen of the Dust Bowl that contributed to the formation of California in the first place.”
    Last year, when Axel Rodriguez was four, he started coughing, a weird cough that sounded like a drum, deep and percussive, and scratchy, as though something inside his chest were trying to claw its way out. The cough would go away for a few days, only to come back stronger.

    Axel’s mother, Michelle Valdez, is twenty-four, with straight, dark waist-length hair and, behind studious glasses, eyes made up like Cleopatra. She told me recently that the problem had started in August, when the family moved into a new house, a pale-yellow stucco rental in Calexico, about thirty miles south of the Salton Sea. “The houses in this area catch a lot of dirt,” she said—from the cars on the heavily trafficked road nearby, from the grass that the landlord insists on mowing, from neighboring Mexicali, the polluted city of a million just across the Mexican border, and, most of all, she said, from the farm fields throughout the valley. “We are in a hole here,” she said. “All the nasty stuff just sits in it.”

    In September, the cough came and didn’t let up for two weeks. “Morning, afternoon, evening, it was cough, cough, cough,” Valdez said. Axel missed school, and his throat got so swollen that he could barely eat. He became a regular at the emergency room. Valdez and her husband, Antonio Marron, plied Axel with cough syrups, drops, Claritin—they’d been told he had allergies—and when those didn’t work they contemplated taking him to Mexicali for an herbal cure. “I was very desperate,” Valdez said. The medicines were also putting pressure on their finances. At the time, Marron was making about two hundred dollars a week working at Best Buy; Valdez stayed home, taking care of Axel and his sister, Ana, who is three.

    One windy, cold morning in September, Axel told his mother that his chest hurt and he couldn’t breathe. Marron covered Axel’s face with a blanket and took him to the emergency room again. His lungs were swollen and full of mucus. This time, Marron was told what the family had already begun to suspect: Axel had asthma. At the hospital that day, they met a nurse, Aide Fulton, who runs the Imperial Valley Child Asthma Program. She began to educate them about how to manage a chronic condition in an environment seemingly designed to aggravate it.

    Around the same time, Valdez had an episode herself, a tightening in her chest that felt as if someone were squeezing her with a belt. She went to the hospital, and learned that she’d lost a quarter of the capacity of her right lung. The diagnosis was chronic obstructive pulmonary disease, a condition associated with lifelong smoking. Valdez, who is not a smoker, has a genetic predisposition to C.O.P.D., but this was her first experience of it. She said, “I told my husband, ‘I never had an attack until you brought me here!’ ” Not that Marron has been spared. His current job, installing alarm systems, has him in and out of doors all day, and he recently came down with an infection in his lungs.

    The valley is eighty per cent Latino and mostly poor. It also has the state’s highest rate of asthma-related hospitalization for children. The Imperial Valley Child Asthma Program is the only free asthma-education program in the county, and it operates on an average budget of a hundred and forty-six thousand dollars a year. Fulton told me that in the past few years there has been an increase in referrals from the towns adjacent to the Salton Sea. (At one school, in the seaside town of Calipatria, sixty-five children are ill.) “We already have an unmet need,” she said. “What are we going to do when the Salton Sea dries out?” She is thinking of her clients, and she is thinking of herself. Her husband has asthma, her oldest daughter has asthma, and, after twenty-five years in the valley, she has it, too. “The issue of the sea is a bomb,” she said. “It’s a monster coming to get us all.”...
    I've been following southwestern drought issues for years. Tens of millions of us in the region are at increasing risk across a variety of fronts -- including health.

    Started posting about the drought when I lived in Vegas. See my post "Las Vegas: The next Anasazi Ruin?"

    I recently updated the Excel sheet I've long kept to track the inexorably declining water levels in Lake Mead.

    The all-time high water level of Lake Mead came in July 1983, at 1,223 feet above sea level. It has dropped some 145 vertical feet since then. That is a lot of water. The Rocky Mountain snowpack that feeds the Colorado River watershed has been steadily declining for decades, and the Sierra Nevada snowpack, the source of much of California's water supply, hit an all-time low this year.
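    The decline arithmetic from my spreadsheet is simple enough to check. A minimal Python sketch, using only the two figures cited above:

```python
# Back-of-envelope check of the Lake Mead figures cited above.
HIGH_WATER_FT = 1_223   # all-time high surface elevation, July 1983 (feet above sea level)
DROP_FT = 145           # approximate vertical decline since then

current_elevation_ft = HIGH_WATER_FT - DROP_FT
print(f"Approximate 2015 surface elevation: {current_elevation_ft} ft")  # 1078 ft
```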




    It's a southwestern US crisis writ large, but California, with its 39 million residents and huge economy, is Ground Zero.
    California snowpack survey canceled: 'Drought is severe'

    State water officials had planned to make the trek back to the Sierra Nevada in the coming days to conduct their snowpack measurement Friday.

    But Thursday they announced they wouldn’t bother. For the second consecutive month, there won’t be any snow to measure.

    “This is just another piece of information in a series of increasingly dismal findings,” said Department of Water Resources spokesman Doug Carlson. “It nails down that the drought is severe – maybe as severe as any in our history.”

    The latest disappointing news comes a month after Gov. Jerry Brown stood on a field at Phillips Station – where this week’s manual measurement was to occur – and announced California’s first mandatory statewide water use cuts to combat the ongoing drought. It was the only April 1 survey since 1941 without any snow.

    Officials say conditions have gotten even worse since then. 

    Though the manual measurement east of Sacramento often provides a backdrop for media coverage, the state uses electronic sensors up and down the Sierra to measure the water content of the snow. Snowpack accounts for about 30% of the state's water supply when it melts in the late spring and summer and replenishes reservoirs.

    On April 1, statewide measurements showed that the snowpack’s water content was just 5% of normal for that date, the lowest in records going back to 1950. Thursday’s readings indicate the snowpack’s water content is half an inch, or about 3% of normal for this time period...
    The Political Opportunity of a Drought
    California Governor Jerry Brown's push for massive cuts to greenhouse-gas emissions turns a crisis into an opportunity to legislate.

    “I can tell you, from California, climate change is not a hoax," California Governor Jerry Brown told Martha Raddatz earlier this month. "We’re dealing with it, and it’s damn serious.”

    The key word missing from Brown's widely disseminated Sunday morning show soundbite was "drought." While his appearance on This Week was ostensibly meant to be about California's worst-ever drought on record, he parlayed the segment into a broader conversation about climate change.

    With alarming reports about water shortages and the state's warmest-ever winter, Brown has successfully folded the issue of the drought into a broader slate of ambitious environmental reforms. But even as he puts together billion-dollar relief packages and labels climate-change opposition "immoral," he does so without explicitly linking the drought to global warming.

    Even though reports from agencies like the National Oceanic and Atmospheric Administration have refrained from laying the historic drought at the foot of global warming, Brown has used the general sense of crisis to move ahead on climate-change issues. Brown's efforts are not only affecting California, but the rest of the country as well.

    In an executive order on Wednesday, Brown called for the most aggressive cuts to carbon emissions in North America. The order establishes that California "must cut the pollutants to 40% below 1990 levels by the year 2030, more than a decade after he leaves office," the Los Angeles Times reported.

    “California is taking the most aggressive steps to deal with pollution and the effects of climate change,” he told a roaring crowd at a climate-change conference in California.

    As Bloomberg notes, in order to achieve this new target, the state will have to "require utilities to get more electricity from low-pollution sources, compel industries to cut smokestack emissions further and encourage greater numbers of cleaner cars on roads." One likely upshot is that California's efforts are going to impact businesses and utilities in other states, making the Golden State's policy a meaningful factor elsewhere in the country...
    How will all of this affect human health? What will happen if/when the stream runs dry? We may begin finding out in relatively short order.

    "What will California sacrifice to survive the drought?"
    Poor people? Their livelihoods, their health? Remember: when it comes to health, your zip code matters more than your genetic code.

    Another adverse health consideration linked to protracted drought? Elevated rates of severe, widespread wildfires and the environmental (and respiratory) damage they cause. Below, estimated 2015 wildfire rate increases.

    California's Drought Has Killed Over 12 Million Trees In The Last Year

    California's historic drought is having a major impact on the state's forests.

    According to an aerial survey conducted last month by the U.S. Forest Service, approximately 12 million forest trees have died in Southern California and the southern Sierra Nevada mountains over the last year. The report credits unusually high temperatures, a diminished snowpack and a severe lack of rainfall with drying up the trees, leaving the region susceptible to forest fires.

    Of the more than 4.2 million acres surveyed in Southern California, researchers found 164,000 acres with high tree mortality. They found approximately 2 million trees had died over the last year.

    In the southern Sierras, researchers found over 10 million perished trees in 4.1 million acres. There, mortality is "widespread and severe" in the foothills among ponderosa, gray pine, blue oak and live oak trees.

    Jeffrey Moore, the acting aerial survey program manager for the region, told the Los Angeles Times he expects the mass tree mortality to continue throughout the summer...

    California drought: Can railroads come to the rescue?
    CNBC: Jeff Daniels, @jeffdanielsca

    As California's four-year drought worsens and water supplies dwindle in the state, an old technology—railroads—could play a role in alleviating some water shortages.

    "We certainly have that capability today," said Mike Trevino, a spokesman for privately held BNSF Railway, which operates one of the largest freight railroad networks in North America. "We carry chlorine, for example. We carry liquefied commodities."
    Experts say the East Coast's plentiful water could cost cents per gallon to Californians and provide a stable, potable water supply for small communities. Obstacles include identifying a state willing to share some of its water, and securing the construction funds for key infrastructure work, including terminals that can handle water...
     Yeah. I was on that years ago.

    From the CNBC article:
    "Experts say the East Coast's plentiful water could cost cents per gallon to Californians and provide a stable, potable water supply for small communities."
    That is simply not true, at least in the sense that the "cents per gallon" assertion is meant to imply that the cost would be trivial, consequently making large-scale rail water transport economically viable. An acre-foot of water weighs about 1,390 tons. That acre-foot would fill 10 of the largest DOT-111 rail tank cars, each of which has a tare (empty) weight of 62 tons. At the usually cited 5 cents per ton-mile rail freight transport cost, you're looking at $100 per mile per acre-foot. Forget the "east coast." Assume, for the sake of illustrative argument, that you could drop a "straw" into Lake Superior (with its massive 2,903 cubic miles of water) -- at, say, Duluth -- pump it into rail tank cars, and transport it ~2,000 miles to parched California. (Those living around Lake Superior would never miss it, but they'd invariably Primal-Scream howl that the exporters were going to drain the lake dry for the benefit of San Joaquin Valley walnut growers and other SoCal billionaires.)

    I get 62 cents per gallon just for the transport.
    Even if we were instead to route such hypothetical rail water tankers to the headwaters of the Colorado River watershed northwest of Denver, just past the Continental Divide (about 1,100 miles from Chicago or Duluth), and discharge the water into the headwater streams for gravity-borne southwestward downhill flow, we'd still be talking perhaps 32 cents per gallon (and we'd lose a good bit of it to evaporation across the rest of the journey).
    Other issues: the states of Michigan, Wisconsin, and Minnesota, as well as (most importantly) the nation of Canada are unlikely to say "yeah, sure, by all means, help yourselves." Even moving the "straw" loading terminal south to Lake Michigan might not take Canada out of the policy picture (given the zero-sum fungibility of water), and would add ticked-off locals residing in Illinois and Indiana to the "not-in-my-backyard" states-level fray.

    Moreover, there are only an estimated 300,000 DOT-111 rail tank cars in the North American freight fleet, and they're already hauling every imaginable liquid commodity. While the image of a "rolling water pipeline" might present an alluring visual, it seems impractical to me at the requisite scale. California alone consumes an estimated 8.5 million acre-feet of water per year. Ten rail cars per acre-foot? Do the math.
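    For the curious, here is the whole back-of-envelope calculation in one place as a Python sketch. Every input is one of my own rough figures from above (1,390-ton acre-foot, 10 tank cars, 62-ton tare, 5 cents per ton-mile, ~2,000-mile Duluth haul), not authoritative freight data:

```python
# Rough cost model for moving water by rail, using the post's own figures.
GALLONS_PER_ACRE_FOOT = 325_851   # standard conversion
WATER_TONS_PER_ACRE_FOOT = 1_390  # rough weight of an acre-foot of water
CARS_PER_ACRE_FOOT = 10           # largest DOT-111 tank cars per acre-foot
TARE_TONS_PER_CAR = 62            # empty weight of each car
COST_PER_TON_MILE = 0.05          # commonly cited rail freight rate, dollars
HAUL_MILES = 2_000                # roughly Duluth to California

loaded_tons = WATER_TONS_PER_ACRE_FOOT + CARS_PER_ACRE_FOOT * TARE_TONS_PER_CAR
cost_per_mile = loaded_tons * COST_PER_TON_MILE           # ~$100 per mile per acre-foot
cost_per_acre_foot = cost_per_mile * HAUL_MILES
cost_per_gallon = cost_per_acre_foot / GALLONS_PER_ACRE_FOOT

print(f"~${cost_per_mile:.0f}/mile per acre-foot")        # ~$100
print(f"~${cost_per_gallon:.2f}/gallon, transport only")  # ~$0.62

# Fleet sanity check: California's annual consumption vs. the tank-car fleet.
CA_ANNUAL_ACRE_FEET = 8_500_000
FLEET_SIZE = 300_000
carloads_per_year = CA_ANNUAL_ACRE_FEET * CARS_PER_ACRE_FOOT
print(f"{carloads_per_year:,} carloads/year vs. ~{FLEET_SIZE:,} tank cars in existence")
```

That last print is the "do the math" punch line: 85 million carloads a year against a total fleet of roughly 300,000 cars that already have other work to do.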

    One tough problem.


    Interesting "Seawater Desalination" post by the San Diego County Water Authority.

    Outside-the-Box Afterthought on Desal

    Two inherent liabilities of desal are [1] intense power consumption and [2] the brine effluent. Related to the first concern is the environmental impact of burning dirty extractive fuels to generate the electricity for the reverse osmosis process.

    OK, rotate two of our ten nuke carriers in and out of station offshore (one off the northern coast, one off the south) to provide desal energy (far enough offshore to effectively mitigate the brine concentration problem on land), pumping the water into conventional tankers for subsequent distribution. Typical long-range tanker ship capacity is about 20 million gallons (61 acre-feet).

    The Navy could still do their routine training ops while powering desal production.

    Gets around the problems of dirty hydrocarbon fuel emissions, prohibitive civilian nuke plant licensing obstacles, and ground-based nuke plant California earthquake concerns.
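    A quick unit check on that tanker figure, using the same standard gallons-per-acre-foot conversion as above:

```python
# Convert a ~20-million-gallon tanker load into acre-feet.
GALLONS_PER_ACRE_FOOT = 325_851
TANKER_CAPACITY_GAL = 20_000_000

acre_feet_per_tanker = TANKER_CAPACITY_GAL / GALLONS_PER_ACRE_FOOT
print(f"~{acre_feet_per_tanker:.0f} acre-feet per tanker load")  # ~61
```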

    I know. Crazy.

    Whatever. There are no cheap solutions. Cheaper than the socioeconomic (and broader misery index) upshot of a new dustbowl, though.



    More to come...