
Wednesday, August 29, 2012

Single Source of Truth

The Consolidated CDA implementation guide defines nine different types of commonly used CDA documents, including:
  • Continuity of Care Document
  • Consultation Notes
  • Discharge Summary
  • Imaging Integration, and DICOM Diagnostic Imaging Reports
  • History and Physical
  • Operative Note
  • Progress Note
  • Procedure Note
  • Unstructured Documents
Each of these nine documents has a document template defined in The Consolidated CDA implementation guide, which will now be the single source of truth for implementing these CDA documents.
___

Dr. Farzad Mostashari / National Coordinator for Health Information Technology
Common Standards and Implementation Specifications for Electronic Exchange of Information: The Meaningful Use Stage 2 final rules define a common dataset for all summary of care records, including an impressive array of structured and coded data to be formatted uniformly and sent securely during transitions of care, upon discharge, and to be shared with the patient themselves. These include:
  • Patient name and demographic information including preferred language (ISO 639-2 alpha-3), sex, race/ethnicity (OMB Ethnicity) and date of birth
  • Vital signs including height, weight, blood pressure, and smoking status (SNOMED CT)
  • Encounter diagnosis (SNOMED CT or ICD-10-CM)
  • Procedures (SNOMED CT)
  • Medications (RxNorm) and medication allergies (RxNorm)
  • Laboratory test results (LOINC)
  • Immunizations (CVX)
  • Functional status including activities of daily living, cognitive and disability status
  • Care plan field including goals and instructions
  • Care team including primary care provider of record
  • Reason for referral and referring provider’s name and office contact information (for providers)
  • Discharge instructions (for hospitals)
In addition, there are a host of detailed standards and implementation specifications for a number of other transactions including quality reporting, laboratory results, electronic prescribing, immunizations, cancer registries, and syndromic surveillance (see below for a detailed list).

What does this mean? It means that we are able to break down barriers to the electronic exchange of information and decrease the cost and complexity of building interfaces between different systems while ensuring providers with certified electronic health record (EHR) technology have the tools in place to share, understand, and incorporate critical patient information. It also means that providers can improve workflow and dig deeper into the data. Certified EHR technology must be able to support identity reconciliation—matching the right record to the right person—and will give doctors the tools to reconcile a new document with the information already on file, for instance by incorporating medications and problems identified by another provider into a patient’s record,  thus creating a single source of truth. [emphases mine - BG]
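Just to make the structure concrete, here is that common dataset restated as a bare-bones field-to-vocabulary map (the field names are mine, purely illustrative; the vocabulary bindings are the ones named in the rule):

```python
# Stage 2 common dataset for summary-of-care records, as a field -> vocabulary map.
# Field names are illustrative shorthand; the code-set bindings are quoted above.
COMMON_DATASET_VOCABULARIES = {
    "preferred_language":   "ISO 639-2 alpha-3",
    "race_ethnicity":       "OMB Ethnicity",
    "smoking_status":       "SNOMED CT",
    "encounter_diagnosis":  "SNOMED CT or ICD-10-CM",
    "procedures":           "SNOMED CT",
    "medications":          "RxNorm",
    "medication_allergies": "RxNorm",
    "lab_results":          "LOINC",
    "immunizations":        "CVX",
}
```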
___

"In Information Systems design and theory, as instantiated at the Enterprise Level, Single Source Of Truth (SSOT) refers to the practice of structuring information models and associated schemata such that every data element is stored exactly once (e.g., in no more than a single row of a single table). Any possible linkages to this data element (possibly in other areas of the relational schema or even in distant federated databases) are by reference only. Thus, when any such data element is updated, this update propagates to the enterprise at large, without the possibility of a duplicate value somewhere in the distant enterprise not being updated (because there would be no duplicate values that needed updating)."

"ADT 44"

From an email I found online today, embedded in a RHIO pdf document (it's an HL7 messaging error thing):

ADT A44 (Move account information — patient account number)
The intention of these messages is to move an account number, which may or may not have associated documents, from one patient to another.
These types of messages could result from inadvertently selecting "John Doe SR", as opposed to "John Doe JR", for example. Whereas "John Doe JR" is the correct patient. Once the patient is changed/corrected, the resulting A44 message initiates a move of all the documents associated with this account to the corrected patient ID.
In this example, the correct Patient ID = "John Doe JR", and the Prior Patient ID = "John Doe SR".
Investigation
Historic data analysis for all A44 messages received for CFRHIO:
  • Total A44 messages received thru 7/31 = 6,417
  • Total "Prior Patient IDs" with associated documents = 368
  • FHO = 213
  • OH = 155
  • Total associated documents accessed = 0
Action Plan
A44 message processing moving forward:
  • Short-Term: (Complete) Manually disable the viewing of documents for the "Prior Patient ID" or MRG-1, as specified in the A44 messages.
  • Medium-Term: (In Progress) Manually process the account move for all A44 messages, both historic and new inbound.
  • Long-Term: (Investigating) Implement non-manual process to properly process new A44 messages.
Thank you for your understanding.
Interesting. Maybe it should have been subjected to the RDBMS equivalent of "rigorability." I'm not at liberty to discuss what specifically brought this issue to my attention, but, it points up the widespread chronic data linkage problem posed by a continuing lack of a unique (no-dupes, no nuls), secure national patient identifier (Like, you know, a HIC number writ large).
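For concreteness, here is roughly what an A44 handler has to do with the PID and MRG-1 fields described in that email: pull the correct patient ID, pull the prior (wrong) one, and re-point the documents. The message layout is simplified and the document-store calls are hypothetical; this is a sketch, not any RHIO's production code.

```python
def handle_adt_a44(message: str, document_store) -> None:
    """Move documents from the prior (incorrect) patient ID to the correct one,
    per a simplified HL7 v2 ADT^A44 message ('|'-delimited segments)."""
    segments = {line.split("|")[0]: line.split("|") for line in message.strip().splitlines()}
    correct_patient_id = segments["PID"][3]   # PID-3: patient identifier list (simplified)
    prior_patient_id   = segments["MRG"][1]   # MRG-1: prior patient identifier list
    # Hypothetical document-store API: re-link everything filed under the wrong ID.
    for doc in document_store.find_by_patient(prior_patient_id):
        document_store.reassign(doc, to_patient=correct_patient_id)

# Simplified example message shape, echoing the "John Doe JR/SR" scenario above.
sample_a44 = """MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|20120829||ADT^A44|123|P|2.5
EVN|A44|20120829
PID|1||JOHNDOE-JR||Doe^John^Jr
MRG|JOHNDOE-SR"""
```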

NO DUPES, NO NULS, 
AUTHENTICATED PRIMARY KEY IDENTIFIER


What shall be the reliable identifier? To wit, the much-unloved SSN:
What happens to the money assigned to people using false identities, or names matched with the wrong Social Security Number (SSN), or newlyweds who forgot to register their name changes with the Social Security administration?

The answer lies in a little-known aspect of the Social Security behemoth known as the Earnings Suspense File (ESF).
A necessarily dynamic probabilistic combination of Last/First/DoB/Gender/SSN?
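In practice, that "dynamic probabilistic combination" usually cashes out to something like a weighted agreement score over the identifying fields (the weights below are made up for illustration; real match/non-match weights get estimated from the data, a la Fellegi-Sunter):

```python
# Toy probabilistic matcher over Last/First/DoB/Gender/SSN.
# Field weights are illustrative only; in practice they are estimated from the data.
WEIGHTS = {"last": 3.0, "first": 2.0, "dob": 4.0, "gender": 0.5, "ssn": 8.0}
MISMATCH_PENALTY = {"last": -2.0, "first": -1.0, "dob": -4.0, "gender": -1.0, "ssn": -6.0}

def match_score(a: dict, b: dict) -> float:
    """Sum agreement weights (or disagreement penalties) across identifying fields."""
    score = 0.0
    for field, w in WEIGHTS.items():
        if not a.get(field) or not b.get(field):
            continue                      # missing data contributes nothing
        score += w if a[field] == b[field] else MISMATCH_PENALTY[field]
    return score

a = {"last": "DOE", "first": "JOHN", "dob": "1950-01-01", "gender": "M", "ssn": "012345678"}
b = {"last": "DOE", "first": "JOHN", "dob": "1950-01-01", "gender": "M", "ssn": None}
print(match_score(a, b))  # 9.5 -- strong, but not conclusive without the SSN
```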

I worked for a number of years in the Risk Management Department of a credit card bank. Among my duties was ongoing portfolio modeling, management, and fraud monitoring, all which entailed a good bit of data mining, repeatedly running SAS code against millions of customer account records.

Wherein, as I wrote in 2002,
...our department has the endless and difficult task of trying to statistically separate the “goods” from the “bads” using data mining technology and modeling methods such as factor analysis, cluster analysis, general linear and logistic regression, CART analysis (Classification and Regression Tree) and related techniques.

Curiously, our youngest cardholder is 3.7 years of age (notwithstanding that the minimum contractual age is 18), the oldest 147. We have customers ostensibly earning $100,000 per month—odd, given that the median monthly (unverified self-reported) income is approximately $1,700 in our active portfolio.
Yeah. Mistakes. We spend a ton of time trying to clean up such exasperating and seemingly intractable errors. Beyond that, for example, we undertake a new in-house credit score modeling study and immediately find that roughly 4% of the account IDs we send to the credit bureau cannot be merged with their data (via Social Security numbers or name/address/phone links).

I guess we’re supposed to be comfortable with the remaining data because they matched up -- and for the most part look plausible. Notwithstanding that nearly everyone has their pet stories about credit bureau errors that gave them heartburn or worse...

In addition to credit risk modeling, an ongoing portion of my work involves cardholder transaction analysis and fraud detection. Here again the data quality problems are legion, often going beyond the usual keystroke data processing errors that plague all businesses. Individual point-of-sale events are sometimes posted multiple times, given the holes in the various external and internal data processing systems that fail to block exact dupes. Additionally, all customer purchase and cash advance transactions are tagged by the merchant processing vendor with a 4-digit “SIC code” (Standard Industrial Classification) categorizing the type of sale. These are routinely and persistently miscoded, often laughably. A car rental event might come back to us with a SIC code for “3532- Mining Machinery and Equipment”; booze purchases at state-run liquor stores are sometimes tagged “9311- Taxation and Monetary Policy”; a mundane convenience store purchase in the U.K. is seen as “9711- National Security”, and so forth.

Interestingly, we recently underwent training regarding our responsibilities pursuant to the Treasury Department’s FinCEN (Financial Crimes Enforcement Network) SAR program (Suspicious Activity Reports). The trainer made repeated soothing references to our blanket indemnification under this system, noting approvingly that we are not even required to substantiate a “good faith effort” in filing a SAR. In other words, we could file egregiously incorrect information that could cause an innocent customer a lot of grief, and we can’t be sued.

 He accepted uncritically that this was a necessary and good idea.
I spent inordinate episodic FoxPro and SAS coding time (re) "cleaning the data," cross-referencing and correcting thousands of bad entries (Last, First, DOB, Gender, SSN) -- chiefly and most important among them bad "Socials" (the result of the legion input keystroke errors).

Then, we hired this enthusiastic but in some ways hapless H-1B crew to establish an Oracle data warehouse, and they thereupon had the brilliant idea to store SSNs as integers in lieu of character strings. So, were a cardholder Social to have been something like "012-34-5678," you now gotta write logic that will take #12345678 and convert/parse/substring re-concatenate it back to char(11) "012-34-5678" for proper ASCII collation and ease of view.
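The kind of round-trip repair that decision forces on you looks roughly like this (Python here just to illustrate; the real cleanup lived in SAS and FoxPro against Oracle):

```python
def ssn_int_to_char11(ssn_int: int) -> str:
    """Rebuild the canonical 'AAA-GG-SSSS' string from an SSN that was
    stored as an integer (which silently drops the leading zeros)."""
    digits = f"{ssn_int:09d}"          # zero-pad back to 9 digits: 12345678 -> '012345678'
    return f"{digits[0:3]}-{digits[3:5]}-{digits[5:9]}"

assert ssn_int_to_char11(12345678) == "012-34-5678"
```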

Now, in those circumstances of crapped-up but unremediated primary keys maybe the worst case upshot (in addition to my recurrent cube rants) would have been someone being denied a credit line increase or APR decrease.

In the case of an HL7 "ADT A44," on the other hand, someone could get the wrong meds dose and die from it.**

Close Only Counts in Horseshoes and Hand Grenades.
** and, yes, to be fair, I know that "Last+First+DoB+Gender+Social" would itself be imperfect (and highly variable; see Latanya Sweeney's work). Maybe someday we'll have Genes+Retinal Scan ID proxies. Maybe. But, we can do way better than the kludgy "Master Patient Index" databases currently in use.
apropos,
MY LIGHT BEDTIME READING FOR TONIGHT


CMMI® for Development, Version 1.3
CMMI-DEV, V1.3 CMMI Product Team
Improving processes for developing better products and services
November 2010 TECHNICAL REPORT
CMU/SEI-2010-TR-033
ESC-TR-2010-033
Software Engineering Process Management Program
Unlimited distribution subject to the copyright.
http://www.sei.cmu.edu
and


Close to another 1,200 pages of reading.

Now, with respect to the former, my interest has been piqued by this "Agile" thing. The Fad of The Year?


From CMMI V1.3:
All of the notes begin with the words, “In Agile environments” and are in example boxes to help you to easily recognize them and remind you that these notes are examples of how to interpret practices and therefore are neither necessary nor sufficient for implementing the process area.

Multiple Agile approaches exist. The phrases “Agile environment” and “Agile method” are shorthand for any development or management approach that adheres to the Manifesto for Agile Development [Beck 2001].

Such approaches are characterized by the following:
  • Direct involvement of the customer in product development
  • Use of multiple development iterations to learn about and evolve the product
  • Customer willingness to share in the responsibility for decisions and risk
Many development and management approaches can share one or more of these characteristics and yet not be called “Agile.” For example, some teams are arguably “Agile” even though the term Agile is not used. Even if you are not using an Agile approach, you might still find value in these notes. PDF pg 70
___
  • In Agile environments, configuration management (CM) is important because of the need to support frequent change, frequent builds (typically daily), multiple baselines, and multiple CM supported workspaces (e.g., for individuals, teams, and even for pair-programming). Agile teams may get bogged down if the organization doesn’t: 1) automate CM (e.g., build scripts, status accounting, integrity checking) and 2) implement CM as a single set of standard services. At its start, an Agile team should identify the individual who will be responsible to ensure CM is implemented correctly. At the start of each iteration, CM support needs are re-confirmed. CM is carefully integrated into the rhythms of each team with a focus on minimizing team distraction to get the job done. PDF pg 151
  • In Agile environments, product integration is a frequent, often daily, activity. For example, for software, working code is continuously added to the code base in a process called “continuous integration.” In addition to addressing continuous integration, the product integration strategy can address how supplier supplied components will be incorporated, how functionality will be built (in layers vs. “vertical slices”), and when to “refactor.” The strategy should be established early in the project and be revised to reflect evolving and emerging component interfaces, external feeds, data exchange, and application program interfaces. PDF pg 270
  • In Agile environments, the sustained involvement of customer and potential end users in the project’s product development activities can be crucial to project success; thus, customer and end-user involvement in project activities should be monitored. PDF pg 287
  • For product lines, there are multiple sets of work activities that would benefit from the practices of this process area. These work activities include the creation and maintenance of the core assets, developing products to be built using the core assets, and orchestrating the overall product line effort to support and coordinate the operations of the inter-related work groups and their activities. In Agile environments, performing incremental development involves planning, monitoring, controlling, and re-planning more frequently than in more traditional development environments. While a high-level plan for the overall project or work effort is typically established, teams will estimate, plan, and carry out the actual work an increment or iteration at a time. Teams typically do not forecast beyond what is known about the project or iteration, except for anticipating risks, major events, and large-scale influences and constraints. Estimates reflect iteration and team specific factors that influence the time, effort, resources, and risks to accomplish the iteration. Teams plan, monitor, and adjust plans during each iteration as often as it takes (e.g., daily). Commitments to plans are demonstrated when tasks are assigned and accepted during iteration planning, user stories are elaborated or estimated, and iterations are populated with tasks from a maintained backlog of work. PDF pg 294
  • In Agile environments, teams tend to focus on immediate needs of the iteration rather than on longer term and broader organizational needs. To ensure that objective evaluations are perceived to have value and are efficient, discuss the following early: (1) how objective evaluations are to be done, (2) which processes and work products will be evaluated, (3) how results of evaluations will be integrated into the team’s rhythms (e.g., as part of daily meetings, checklists, peer reviews, tools, continuous integration, retrospectives). PDF pg 315
  • In Agile environments, customer needs and ideas are iteratively elicited, elaborated, analyzed, and validated. Requirements are documented in forms such as user stories, scenarios, use cases, product backlogs, and the results of iterations (working code in the case of software). Which requirements will be addressed in a given iteration is driven by an assessment of risk and by the priorities associated with what is left on the product backlog. What details of requirements (and other artifacts) to document is driven by the need for coordination (among team members, teams, and later iterations) and the risk of losing what was learned. When the customer is on the team, there can still be a need for separate customer and product documentation to allow multiple solutions to be explored. As the solution emerges, responsibilities for derived requirements are allocated to the appropriate teams. PDF pg 339
  • In Agile environments, requirements are communicated and tracked through mechanisms such as product backlogs, story cards, and screen mock-ups. Commitments to requirements are either made collectively by the team or an empowered team leader. Work assignments are regularly (e.g., daily, weekly) adjusted based on progress made and as an improved understanding of the requirements and solution emerge. Traceability and consistency across requirements and work products is addressed through the mechanisms already mentioned as well as during start-of-iteration or end-of-iteration activities such as “retrospectives” and “demo days.” PDF pg 354
  • In Agile environments, some risk management activities are inherently embedded in the Agile method used. For example, some technical risks can be addressed by encouraging experimentation (early “failures”) or by executing a “spike” outside of the routine iteration. However, the Risk Management process area encourages a more systematic approach to managing risks, both technical and non-technical. Such an approach can be integrated into Agile’s typical iteration and meeting rhythms; more specifically, during iteration planning, task estimating, and acceptance of tasks. PDF pg 362
  • In Agile environments, the focus is on early solution exploration. By making the selection and tradeoff decisions more explicit, the Technical Solution process area helps improve the quality of those decisions, both individually and over time. Solutions can be defined in terms of functions, feature sets, releases, or any other components that facilitate product development. When someone other than the team will be working on the product in the future, release information, maintenance logs, and other data are typically included with the installed product. To support future product updates, rationale (for trade-offs, interfaces, and purchased parts) is captured so that why the product exists can be better understood. If there is low risk in the selected solution, the need to formally capture decisions is significantly reduced. PDF pg 386
  • In Agile environments, because of customer involvement and frequent releases, verification and validation mutually support each other. For example, a defect can cause a prototype or early release to fail validation prematurely. Conversely, early and continuous validation helps ensure verification is applied to the right product. The Verification and Validation process areas help ensure a systematic approach to selecting the work products to be reviewed and tested, the methods and environments to be used, and the interfaces to be managed, which help ensure that defects are identified and addressed early. The more complex the product, the more systematic the approach needs to be to ensure compatibility among requirements and solutions, and consistency with how the product will be used. PDF pg 414

I have much yet to learn, in particular a simple explanation of the foregoing graphic as it pertains to an effective methodology for HIT development. Dubiety endures, and suspicion of bamboozlement reeks more broadly.

A concern, to wit:
Because the companies using QFD [Quality Function Deployment] are already fairly sophisticated in their approaches to quality control, the apparent success of QFD as a software quality approach may be misleading. 

QFD is a very formal, structured group activity involving clients and product development personnel. QFD is sometimes called “the house of quality” because one of the main kinds of planning matrices resembles the peaked roof of a house. 

In the course of the QFD sessions, the users’ quality criteria are exhaustively enumerated and defined. Then the product’s quality response to those requirements is carefully planned so that all of the quality criteria are implemented or accommodated. 

For the kinds of software where client quality concerns can be enumerated and where developers can meet and have serious discussions about quality, QFD appears to work very well: embedded applications, medical devices, switching systems, manufacturing support systems, fuel-injection software controls, weapons systems, and the like. 

Also, QFD requires development and QA personnel who know a lot about quality and its implications. QFD is not a “quick and dirty” approach that works well using short-cuts and a careless manner. This is why QFD and Agile are cited as being “antagonistic.” 

The QFD software projects that we have examined have significantly lower rates of creeping requirements and also much lower than average volumes of both requirements and design defects than U.S. norms. However, the kinds of software projects that use QFD typically do not have highly volatile requirements.

Jones, Capers; Bonsignour, Olivier (2011-07-19). The Economics of Software Quality (Kindle Locations 3299-3311). Pearson Education (USA). Kindle Edition.
All very interesting.
QFD requires development and QA personnel who know a lot about quality and its implications. QFD is not a “quick and dirty” approach that works well using short-cuts and a careless manner. This is why QFD and Agile are cited as being “antagonistic.” 

UPDATE

Ran across an interesting website and blog post:

Leadership Skills: Building Collaborative Teams
Work teams can be very effective. They can also be a disaster, as anyone with even a passing knowledge of organizational dynamics understands. In today’s world of instant information, many of these impediments to real team performance are being overcome, while new challenges are emerging. New teams must be highly communicative, collaborative, mutually supportive, multitalented, and quick to respond, often without having a complete picture of the “facts.” New teams must be able to act with relative autonomy, demanding higher levels of accountability, unparalleled access to information, and commensurate authority. Team leadership can shift as demands for expertise change, although accountability remains with the titular team leader. The new team leader, therefore, must be both highly talented and politically savvy to survive and thrive as organizations adapt to new models. He or she must either have the stature or authority to withstand great pressure to avoid producing the “same old stuff,” which is tantamount to team failure.

In organizations with traditional structures and loyalties, teams are easily compromised by the often divergent pull from multiple constituencies that provide lip service to team success while providing minimum support or even actively working to sabotage team efforts. Teams that cannot pull themselves loose through the efforts of a strong, grounded leader or who have a patron high up in the organization often are teams in name only...

I forwarded this around our shop. We suffer from having too many "teams," all frequently busily doing "Work About The Work About the Work." A rather common affliction, unfortunately (including ASQ Divisions).

ERRATUM...


OK...
AGILE LEAN SIX SIGMA QFD RAPID-CYCLE PDSA CQI TQM!



One of my long favorite philosophers is the late Alan Watts, who once wryly observed something to the effect that "a problem with Christianity is that people have replaced the religion of Jesus with a religion about Jesus."

"Six Sigma" accords us a couple of lovely metaphors: [1] the boundary within plus or minus six standard deviations around a process average, assuming a perfectly Gaussian ("bell curve") dispersion, and [2] all those cool martial-arts green and black "Belts."

A "religion" "about."

In the software realm, "Agile" ups the allusive ante.


Will this be a hot new business line of professional certifications? Lordy. Agile Grasshoppers to Agile Samurai? And, to further jumble the metaphors, will they be donning rugby attire for "scrums" and track gear for "sprints"?

Back to Jones and Bonsignour:
The phrase Six Sigma, which originated at Motorola, refers to defect densities of 3.4 “opportunities” for defects per million chances. The approach was originally used for complex manufactured devices such as cell phones but has expanded to scores of manufactured products and software, too. 

The original Six Sigma approach was formal and somewhat expensive to deploy. In recent years subsets and alternatives have been developed such as “lean Six Sigma” and “Six Sigma for software” and “design for Six Sigma.” 

Because the Six Sigma definition is hard to visualize for software, an alternative approach would be to achieve a cumulative defect removal efficiency rate of 99.999999% prior to delivery of a product. This method is not what Motorola uses, but it helps to clarify what would have to be done to achieve Six Sigma results for software projects. 

Given that the current U.S. average for software defect removal efficiency is only about 85%, and quite a few software products are even below 70%, it could be interpreted that software producers have some serious catching up to do. Even top-ranked software projects in the best companies do not go beyond about 98% in cumulative defect removal efficiency.

Jones, Capers; Bonsignour, Olivier. The Economics of Software Quality (Kindle Locations 3381-3384).
"[D]efect densities of 3.4 “opportunities” for defects per million chances." Yeah, as a process average, one assuming a perfectly Gaussian distribution,


which, of course, exists only on college chalkboards and in textbooks (and, of course, on Wall Street -- and, we see where that got them).


Color me presumptively Chebyshev-ista.


"Chance is lumpy." - Abelson's Laws

Consequently, under Chebyshev your outer bound at 6 sigma is a 2.78% defect rate (1/k² with k = 6, i.e., 1/36; percent, not per million).
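The arithmetic, side by side with the textbook Gaussian tails it replaces (standard library only; the 4.5-sigma line reflects the usual 1.5-sigma-shift convention behind "3.4 per million," not anything specific to Jones and Bonsignour):

```python
from math import erfc, sqrt

k = 6
chebyshev_bound = 1 / k**2                      # holds for ANY distribution with finite variance
gaussian_two_tail = erfc(k / sqrt(2))           # probability beyond +/- 6 sigma, perfect bell curve
shifted_one_tail = 0.5 * erfc(4.5 / sqrt(2))    # 1.5-sigma-shift convention behind "3.4 per million"

print(f"Chebyshev bound at 6 sigma : {chebyshev_bound:.4%}")   # 2.7778%
print(f"Gaussian beyond 6 sigma    : {gaussian_two_tail:.2e}") # ~2.0e-09
print(f"4.5-sigma one-sided tail   : {shifted_one_tail:.2e}")  # ~3.4e-06
```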

It gets worse when you go all 3+ dimensional. Think about it.


I guess here's the cut-to-the-chase point (in addition to and beyond the Watts analogy): I can pretty quickly convey the essentials of "Lean" to the average assemblage of high school-educated clinic front office staff. Priesthoods of belt-laden QI In-Tongue speakers, on the other hand, make for a nice market in books and webinars.
___

REMINDER


Money on the table.
___

INTERESTING NEWS


Demand for meaningful use (MU) assistance has exploded, increasing competition between third-party consulting firms--most of whom are excelling in MU-related work.
Not one word about RECs.

We just can't get any love these days. In my email inbox this morning:


MORE NEWS

 Which components of health IT will drive financial value?
A framework that describes the ability of specific health information exchange (HIE) and EHR functionalities to drive financial savings could help efforts to develop meaningful use measures and measure the financial impact of health IT, according to research published in the August issue of the American Journal of Managed Care.

“Previous work in this area has largely modeled the financial effect of whole health IT applications, assuming that the effects of those applications were similar across different contexts,” wrote lead researcher Lisa M. Kern, MD, MPH, of Weill Cornell Medical College in New York City, and colleagues. “This assumption may not be true because health IT is an inherently heterogeneous intervention. EHRs and HIE are themselves applications composed of functionalities that are variably implemented, configured and used.”...  
Courtesy of Cardiovascular Business. Full Journal article here.

___

BTW - 
"We support technology enhancements for medical health records and data systems while affirming patient privacy and ownership of health information."

- GOP 2012 Platform. That's it. The only reference to Health IT.
___

AUGUST 31 UPDATE:
YET ANOTHER "HEALTHCARE TRANSFORMATION" INSTITUTE


I got a LinkedIn email heads-up about these people today. Pretty interesting report, actually (in light of what I've perused thus far).

Like a person suffering from a debilitating disease, healthcare delivery in the United States is ailing. The U.S. spends significantly more per capita and a higher percentage of GDP on healthcare than other developed nations, yet our patient outcomes (e.g., mortality, safety, access to medical care) are disparate and inconsistent. Moreover, the rapidly rising costs of healthcare delivery are making medical care increasingly unaffordable to the average citizen and threaten our national financial viability.

How did we get here? Although unhealthy lifestyles and the growing and aging population are undoubtedly contributing to the rise in healthcare costs, two key factors must not be underestimated: a) advances in medical technology and b) powerful system incentives that inadvertently advance unchecked utilization throughout the healthcare delivery system.

So what can we do?

Where do we start? Read some Dr. John Toussaint (e.g., see my August 4th post), along with this report.

Read on.
___

TROUBLEMAKER in the TWITTERVERSE


Details shortly.

SEPT 2nd REC ASS'N AMBER ALERT

 ___

More to come...

Saturday, August 25, 2012

Stage 2 Final Rule and 2014 CEHRT Final Rule

 

So, the day after the new FRs were released we had this big breathless national REC webinar conference call. The slide deck can be downloaded here.


74 pages into the 101-page deck, they get to the EHR Cert Rule.


CHIME chimes in.

 
 “We commend the Centers for Medicare & Medicaid Services and the Office of the National Coordinator for Health IT for seeing the wisdom and practicality of heeding many of CHIME’s recommendations, filed during the spring public comment period. By allowing providers to demonstrate Meaningful Use through a 90-day EHR reporting period for 2014, government rule-makers have ensured greater levels of program success. And by including additional measures to the menu set, providers have a better chance of receiving funds for meeting Stage 2.

“However, we also recognize that these points are conciliatory and that many details may need further clarification. The final rule still puts providers at risk of not demonstrating meaningful use based on measures that are outside their control, such as requiring 5 percent of patients to view, download or transmit their health information during a 3-month period. Some areas of clarification include some of the exclusionary language as well as nuances around health information exchange provisions, clinical quality measures and accessing images through a certified EHR.

“CHIME will continue to delve into this sizable and weighty effort, including the technical specifications and certification criteria”
I'm sure I'll address the crux of the Stage 2 criteria stuff (petabytes are being posted by every HIT pundit in the nation as we speak), but at the outset I am way more interested in what they've issued with respect to HIT "usability." Technical "capabilities" are necessary and fine, but if navigating them adds appreciable workflow burden to already stressed clinicians and support staffs, it gets increasingly difficult to make the ROI case.

The word "usability" pops up 32 times in the CEHRT FR, beginning at page 90 (of 474 pages).
Safety-enhanced design
In the Proposed Rule, we provided an overview of the ISO definition of usability as “[t]he extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.” We outlined that EHR technology certification could introduce some improvements in usability, which we believed would enhance both the safety and efficiency of CEHRT. In the Proposed Rule, we also reviewed the November 2011 Institute of Medicine (IOM) report titled, “Health IT and Patient Safety: Building Safe Systems for Better Care,” in which the usability of EHR technology and quality management was often referenced. The IOM noted that “[w]hile many vendors already have some types of quality management principles and processes in place, not all vendors do and to what standard they are held is unknown.”
To repeat:
The IOM noted that “[w]hile many vendors already have some types of quality management principles and processes in place, not all vendors do and to what standard they are held is unknown.”

ASQ Software Quality Division? Hel-LO? Is ANYONE listening?
Asked and Answered. But, they're up on Twitter now,


so, we'll see what happens after I "tweet" them. What's the latest on the Division website (as of today)?


I have to confess to my perplexity with respect to the ongoing HIT inattention. ASQ in general seems to have become increasingly sclerotic in many quarters. I will continue to call them out. It's all in good faith.
___

Continuing with the CEHRT FR:
We proposed that a significant first step toward improving overall usability would be to focus on the process of user-centered design (UCD). While valid and reliable usability measurements exist, including those specified in NISTIR 7804 “Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records,” we expressed that it would be inappropriate for ONC to seek to measure EHR technology in this way.
Really? Well, then, just who the heck is gonna require it?

Readers might recall that I had a bit of irreverent sport with ONC's 2011 HIMSS Conference "Usability" presentation.


18 months ago. I still just love the "Rigorability" thing.

More snips.
One commenter noted that usability is a quality of interactive software that can be objectively defined and evaluated. This commenter suggested that we adopt the following standards for EHR technology certification: Standard 13407, UCD/NISTIR 7804, ISO Standard 25062, and Common Industry Format for Summative Usability Tests NISTIR 7742. [pg 94]
...One commenter expressed support for the certification criterion, but disagreed with the assumption that user interface (UI) validation testing must be performed by end-users. This commenter’s experience was that UI validation tests performed by internal design experts are more effective than the same testing performed by end-users. This commenter reported that engineering a UI to the needs of a user who is encountering that interface for the very first time invariably results in an interface designed to accommodate the novice, at the expense of denying power and efficiency to the same user who will quickly gain familiarity with a well designed interface. [pg 98]
One commenter suggested that ensuring usability is the key to successful physician adoption of EHRs, yet expressed concern that our proposals as drafted gave no consideration as to the clinician decision-making process or practice workflow.
One commenter expressed concern that the adoption of a particular methodology does not guarantee that software will improve. Other commenters suggested that the testers would need to be selected who are professionals already familiar with more than one EHR technology and are in the same specialty as the target market of the EHR technology developer. [pg 99]
___
Response. We thank these commenters for their thorough and thoughtful feedback. Although the implementation of suggestions 1 through 5 may provide a better understanding of EHR usability today and chart a path toward improved usability in the future, they fall outside the scope of this certification criterion. We have not included NISTIR 7804 in the 2014 Edition EHR certification criteria, but may consider it for future editions of certification criteria. We do believe that UCD will – by definition – consider the clinical decision-making process and disagree with the commenter that it does not. Finally, we agree that both formative and summative testing are valuable, and we agree that testing in a lab setting and testing in the field are also important. This certification criterion is a first step toward formal usability testing becoming part of the culture of EHR technology development. We therefore clarify that, at a minimum, only lab-based summative testing is necessary to be performed in order to demonstrate compliance with this certification criterion. Nothing precludes field-testing and formative testing from also being performed and we encourage EHR technology developers to do so.
Quality Management System
In the Proposed Rule we noted that the IOM had also recommended that we “[establish] quality management principles and processes in health IT.” We stated that, working with other Federal agencies, we intended to publish a quality management document that would be customized for the EHR technology development lifecycle and express similar principles to those included in ISO 9001, IEC 62304, ISO 13485, ISO 9001, and 21 CFR part 820. We anticipated that this document would provide specific guidance to EHR technology developers on best practices in software design processes in a way that mirrors established quality management systems, but would be customized for EHR technology development. We stated that we understood that some EHR technology developers already have processes like these in place, but did not believe, especially in light of the IOM recommendation, that the EHR technology industry as a whole consistently follows such processes. We indicated our expectation to publish the quality management document around the same time as the Proposed Rule would be available for public comment. We indicated that we were considering including an additional certification criterion in the final rule that would require an EHR technology developer to document how their EHR technology development processes either aligned with, or deviated from, the quality management principles and processes that would be expressed in the document. We emphasized that this certification criterion would not require EHR technology developers to comply with all of the document’s quality management principles and processes in order to be certified. Rather, to satisfy the certification criterion, EHR technology developers would need to review their current processes and document how they do or do not meet the principles and processes specified in the document (and where they do not, what alternative processes they use, if any). We stated our expectation that this documentation would be submitted as part of testing and would become a component of the publicly available testing results on which a certification is based. [pp 100-101]
You got all that? In other words, _____________________________.

Gotta love this FR excerpt as well:
Many commenters expressed support for our proposal adding, in many cases, arguments about the critically important role that usability plays in the aspect of the safety and reliability of EHR systems, noting that if usability is not carefully analyzed it can cause design induced errors. Other commenters were clear that they felt the results of UCD and quality systems testing should not be made publicly available, and that doing so would open the door for EHR developers’ intellectual property to be misappropriated. Some commenters were simply opposed to this criterion, citing an unnecessary burden on the industry. [pg 92]
Ok...

The hits just keep on comin'...
We encourage EHR technology developers to choose an established QMS [Quality Management System], but developers are not required to do so, and may use either a modified version of an established QMS, or an entirely “home grown” QMS. We also clarify that we have no expectation that there will be detailed documentation of historical QMS or their absence. As specified above, we believe that the documentation of the current status of QMS in an EHR technology development organization is sufficient. [pp 107-108]
"Home Grown"?

I have to wonder whether calls for the FDA to regulate HIT will arise anew in the wake of a ruling such as this. I know they don't want that headache, but, still.

BTW:


"Price transparency?" This will be much loved. Be interesting to see how the more opaque vendors will comply with (or finesse) this requirement.
___

Not to worry about any of this. Clinic Monkey EHR is on it.


My new VP of Software QA.

___

RAN ACROSS THIS AT KAISER HEALTH NEWS
Market For EHR Replacements Is On The Rise
AUG 14, 2012

Medscape: Half Of EHRs Sold Are Replacements
Use of electronic health records is snowballing, and so is the number of unhappy users. Half of EHR systems sold to physician practices are now replacements, up from 30% last year, according to a recent study by research firm KLAS. The leading reason for switching systems, cited by 44% of practices, is product issues. Service issues and group consolidation — such as when a hospital converts newly hired physicians to a new EHR — are a distant second and third (Lowes, 8/13).
"Product issues."
___

BRIEF SEMI-TANGENT

Notice of this LinkedIn Healthcare Executives Network post recently arrived in my inbox.


Well, I shot this on my iPhone as I was leaving CostCo yesterday.


And we wonder...

1/4 lb hot dog, 20 oz Coke, and Very Berry Sundae, $3.15 (probably add in some fries as well, 'eh?).

CABG px ~$63,000.

I ALWAYS LOVES ME SOME JD KLEINKE

J.D. Kleinke says:
August 26, 2012 at 9:54 am
No capitalist ideology here, folks, just calling the game like I see it. I think the health insurers and their shareholders would be better served minding their core business: managing both the health status and costs of commercially feasible populations. No fancy tricks or special press releases for that – just lots of hard work: adjudicating claims, rooting out fraud, aligning payment with evidence, managing provider and patient adherence to that evidence, tracking outcomes, and steering people toward better hospitals and doctors – all those great ideas from the past two decades everyone likes to talk about but almost no one really ever gets around to executing. "Fools' Gold Rush: ObamaCare and the Medicaid 'Opportunity'."

Indeed. No, let's just keep doing Work About The Work About The Work. All while we watch a tsunami of Innovative Lean Agile Six Sigma Health 2.5 Radically Transformative  Startup Apps wash over and deliver us.

This Kleinke observation just zings:
In normal businesses, with willing buyers and sellers and functioning marketplaces, enormous revenue opportunities do not necessarily translate into commensurate opportunities for profit. And Medicaid is about as far from a normal business as one can imagine. It is the emergency room for our worst chronic social problems. Illiteracy, drug addiction, broken families, migrant labor, illegal immigration, teen pregnancy – you name it, and Medicaid gets to deal with it. Medicaid programs attempt, mostly through heroic individual efforts, to serve a desperately needy population of the poor, chronically ill, mentally unstable, and recklessly pregnant. They do so by overworking and underpaying the nation’s most aggrieved providers, gouging drug companies, and transferring costs wherever they can to the rest of the system.
JD pretty much rocks. See my earlier post "Use Case" citing him.
___

THE HPID

Largely obscured by the hubbub of Stage 2 and CEHRT was the release of this (pdf) on Friday:

Administrative Simplification: Adoption of a Standard for a Unique Health Plan Identifier; Addition to the National Provider Identifier Requirements; and a Change to the Compliance Date for the International Classification of Diseases, 10th Edition (ICD-10-CM and ICD-10-PCS) Medical Data Code Sets 

AGENCY: Office of the Secretary, HHS.

ACTION: Final rule.

SUMMARY:

This final rule adopts the standard for a national unique health plan identifier (HPID) and establishes requirements for the implementation of the HPID. In addition, it adopts a data element that will serve as an other entity identifier (OEID), or an identifier for entities that are not health plans, health care providers, or individuals, but that need to be identified in standard transactions. This final rule also specifies the circumstances under which an organization covered health care provider must require certain noncovered individual health care providers who are prescribers to obtain and disclose a National Provider Identifier (NPI). Lastly, this final rule changes the compliance date for the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) for diagnosis coding, including the Official ICD–10–CM Guidelines for Coding and Reporting, and the International Classification of Diseases, 10th Revision, Procedure Coding System (ICD–10–PCS) for inpatient hospital procedure coding, including the Official ICD–10–PCS Guidelines for Coding and Reporting, from October 1, 2013 to October 1, 2014.
I'm all for "No Dupes-No Nuls" unique identifiers.
___

MONDAY MORNING BREAKING NEWS

AIRSTRIP EXPANDS mHEALTH PLATFORM
WITH REAL-TIME MOBILE MEANINGFUL USE TRACKER
Vendor-neutral solution mobilizes and measures clinical quality, meaningful use compliance
SAN ANTONIO and SAN DIEGO – August 27, 2012 – AirStrip Technologies, Inc. has announced an expansion of its leading mHealth platform to incorporate the Meaningful Use Tracker, giving clinicians the ability to follow and measure clinical quality and meaningful use (MU) compliance in a single mobile solution - including real-time tracking and daily analytics updates in a customizable dashboard format...
That's pretty cool.
___

AUGUST 28TH UPDATE


John Halamka, MD, on Stage 2:
Some complained about the real world operational impact of the workflow changes implied by MU Stage 2.

The Standards Committee is comprised of world class professionals who implement systems for a living. Their advice (especially that of the Implementation Workgroup) is from the trenches.

My honest opinion is that MU Stage 2 creates stretch goals for vendors, IT departments, and providers, but all are achievable. MU Stage 2 lives up to the metric first articulated by David Blumenthal - the escalator should move up fast, but not so fast that people fall off.
Difficult to disagree with that.
___

YouTube of today's webinar
Final Rules: Overview of MU2 and the Standards and Certification Criteria 8-28-12


A couple more FR errata of particular interest to me -- "encryption of data at rest":

We are making a change in this final rule to the language of "data at rest" to specify our intention of data that is stored in CEHRT. After consideration of the public comments, we are finalizing the meaningful use measure as "Conduct or review a security risk analysis in accordance with the requirements under 45 CFR 164.308(a)(1), including addressing the encryption/security of data stored in CEHRT in accordance with requirements under 45 CFR 164.312 (a)(2)(iv) and 45 CFR 164.306(d)(3), and implement security updates as necessary and correct identified security deficiencies as part of the provider's risk management process" for EPs at §495.6(j)(16)(ii) and eligible hospitals and CAHs at §495.6(l)(15)(ii).

We further specify that in order to meet this objective and measure, an EP, eligible hospital, or CAH must use the capabilities and standards of CEHRT at 45 CFR 170.314(d)(1) through 170.314(d)(8).

MU Stage 2 FR pg 136
__

Comment. A commenter stated that they considered information that has been sent to a print queue or downloaded by the user (such as downloading a PDF report) to no longer be managed by the EHR technology.

Response. We generally agree with this statement.

Comment. A commenter asked that we clarify whether data at rest on a server located at a secure data center must be encrypted and, if yes, to please reconsider this requirement because they believed it would slow down response times in large cloud-based EHR systems.

Response. As indicated above, this certification criterion does not focus on server-side or data center hosted EHR technology. We recognized that these implementations could employ a variety of different administrative, physical, and technical safeguards, including hardware enabled security protections that would be significantly more secure than software oriented capabilities.

2014 CEHRT FR pg 284
 Well, pretty weak, IMO.
___

P.M. UPDATE:


About Technology Crossroads

National eHealth Collaborative, in partnership with InfoComm International, is proud to announce the inaugural Technology Crossroads Conference, which will take place in Washington, DC on November 27 and 28, 2012.

The first of its kind, this conference will explore the intersection between audiovisual (AV) and health information technologies (IT) to spotlight the many ways in which cutting-edge AV technologies and health IT breakthroughs are working together to accelerate healthcare transformation...
Interesting. I'd love to attend that. Pricey, though.

Pondering joining as an individual ($100). But, I had to chuckle at this:
Join NeHC

To become a NeHC member, please fill out the application at the link below and return it to Claudia Ellison, Director of Development, at cellison@nationalehealth.org.

NeHC Membership Application

Membership applications are subject to the review and approval of the NeHC Membership Committee.  NeHC will strive for balanced, multi-stakeholder membership participation.  To this end, if needed, NeHC will work with applicants on an individual basis to agree on annual membership dues if there are special circumstances or an inability to pay based on the dues structure.  Please contact Claudia Ellison at cellison@nationalehealth.org.
OK, an "eHealth" organization requesting that you download, fill out, and return a Word document. Sounds like some of the stuff we do.

CLINIC MONKEY EHR UPDATE

We're on a hiring blitz.


She wants to see your risk analysis results and mitigation plan, period.
___

WEDNESDAY MORNING QUICK NOTE

Tell us again, why are there MU penalties?
August 29, 2012 | Jeff Rowe, Editor, EHRWatch

As we noted yesterday, healthcare providers and consultants are in the process of deciphering the implications of MU Stage 2.

But while stakeholders wade through what one consultant nicely summed up as the “compromise and complexity” of the current program, we’re pondering, once again, one of the attributes of the MU program that policymakers still don’t seem willing to question.

To come right to it, here’s the language from the CMS fact sheet:

“Medicare payment adjustments are required by statute to take effect in 2015 (fiscal year for eligible hospitals/calendar year for EPs). The rule finalized a process in which payment adjustment will be determined by an EHR reporting period prior to the payment adjustment year 2015. Any Medicare EP or hospital that demonstrates meaningful use in 2013 will avoid payment adjustment in 2015. Also, a Medicare provider that first demonstrates meaningful use in 2014 will avoid the penalty if they successfully register and attest to meaningful use by July 1, 2014 (eligible hospitals) or October 1, 2014 (EPs).”

Not surprisingly, some stakeholders are voicing displeasure at the timing of the penalties...
Read the entire post here.
___

More to come...

Wednesday, August 22, 2012

2012 Meaningful Use Year One Attestation deadline draws nigh

Table 1
Failing to meet a 2012 90-day year one attestation will cost you $5,000 per EP in total Medicare incentives, 60% of which ($3,000) will come out of your 2013 check.
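Back-of-the-envelope, using the commonly published Medicare EP payment schedules (a sketch; verify the figures against the CMS final rule):

```python
# Medicare EP incentive schedules by first payment year (commonly published CMS figures).
schedule_2012_start = [18_000, 12_000, 8_000, 4_000, 2_000]   # total $44,000
schedule_2013_start = [15_000, 12_000, 8_000, 4_000]          # total $39,000

total_cost_of_slipping = sum(schedule_2012_start) - sum(schedule_2013_start)   # $5,000
first_year_hit = schedule_2012_start[0] - schedule_2013_start[0]               # $3,000
print(total_cost_of_slipping, first_year_hit / total_cost_of_slipping)         # 5000 0.6
```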


CHALLENGE.GOV: WHAT'S IN YOUR PHR?

My iMovie entry.


Short and to the allusive point. No Talking Head or v/o. The music is a royalty-free loop from my GarageBand library.


Shot that on my kitchen counter the other day with my iPhone. No need to drag out the heavy artillery (my Sony alpha 500 DSLR).

UPDATE:

My submission was rejected out of hand as "ineligible." "Not specific enough regarding PHR benefits."

Well, that was quick. The submission deadline is the 23rd; as of the night before, they had all of 19 submissions.

Whatever.

UPDATE UPDATE: total final deadline submissions, 29. Some pretty good ones in there.
___

"WHAT IS AN i-HUB?"

One quick non-geek nominal analogy I thought of.


i.e., one clinic "speaks" eCW to an e-MDs clinic, which then relays the translated information on to a specialist whose EMR "speaks" Primesuite... etc, etc, etc. Should every datum in the transmitted PHI be amenable to HL7 I/O unmodified, no problem.

In theory. I would think extensive 360-degree testing is warranted. I'm sure there have to be people out there doing this sort of thing for "authentication / data integrity" compliance.
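In code-sketch terms, the hub pattern amounts to this: every system gets one adapter to and from a canonical intermediate form, so N systems need N adapters instead of N×(N−1) point-to-point interfaces. The vendor names come from the analogy above; the adapters and the "canonical" form here are entirely hypothetical.

```python
# Toy "i-hub": each EHR dialect gets an adapter to/from one canonical intermediate form.
class Adapter:
    def to_canonical(self, native_msg: str) -> dict: ...
    def from_canonical(self, canonical: dict) -> str: ...

class EcwAdapter(Adapter):          # hypothetical
    def to_canonical(self, native_msg): return {"payload": native_msg, "source": "eCW"}
    def from_canonical(self, canonical): return f"eCW<{canonical['payload']}>"

class PrimesuiteAdapter(Adapter):   # hypothetical
    def to_canonical(self, native_msg): return {"payload": native_msg, "source": "Primesuite"}
    def from_canonical(self, canonical): return f"Primesuite<{canonical['payload']}>"

def relay(msg: str, sender: Adapter, receiver: Adapter) -> str:
    """Translate the sender's dialect into the hub's canonical form, then into the receiver's."""
    return receiver.from_canonical(sender.to_canonical(msg))

print(relay("referral note", EcwAdapter(), PrimesuiteAdapter()))  # Primesuite<referral note>
```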
___

GOING PEDANTIC ON TWITTER

The definition of "Quixotic," I suppose.


Data "are." OED can kiss off. Almost no one says "you're welcome" anymore, either (instead, it's "Thank YOU").

Pedantic Curmudgeon.

Yeah, I went over to The Dark Side finally. @BobbyGvegas. Not sure about that whole thing yet, but you do get some good breaking HIT news, and a ton of HIT and other health care related companies and individuals using it. "Social Media Marketing" and all that.
___

ONC HITRC CoP WEB CALL TODAY


Uh, OK...
___

Shortly:
Compliance Guidelines for Financial Institutions in the Healthcare Sector:

Executive Summary
The recent passage of the Health Information Technology for Economic and Clinical Health Act (HITECH) directly affects financial institutions and their services for the healthcare sector. HITECH modifies and amplifies the existing data privacy and security rules for protected healthcare information under the Health Insurance Portability and Accountability Act (HIPAA). There are new breach reporting requirements and tougher penalties. Financial institutions may find they must be able to meet the HIPAA data privacy and security measures if they deliver services to the healthcare sector...

Another 67 pages of HIMSS light bedtime reading (pdf).

In other news...
HIStalk Advisory Panel: IT in Patient Harm, Patient Outcomes

The HIStalk Advisory Panel is a group of hospital CIOs, hospital CMIOs, practicing physicians, and a few vendor executives who have volunteered to provide their thoughts on topical industry issues. I’ll seek their input every month or so on important news developments and also ask the non-vendor members about their recent experience with vendors. E-mail me to suggest an issue for their consideration.

If you work for a provider organization (hospital, practice, etc.), you are welcome to join the panel. I am grateful to the HIStalk Advisory Panel members for their help in making HIStalk better.

What are the biggest lessons we’ve learned from cases where IT contributed to patient harm?

Common Themes Expressed

  • System redundancy is sometimes poorly planned.
  • Systems and system changes (especially those involving upgrades and application setup) are not adequately tested.
  • IT systems management needs to be more formalized (change management, communication, quality assurance).
  • System design should be user-centered and should make it easy for clinicians to do the right thing.
  • User application training needs to be not only more comprehensive, but also tied to the workflow and job role changes that are involved.
  • Clinicians are not represented in the IT governance process for changes that are seen by IT as purely infrastructure related.
  • Clinicians need to take ownership of workflow analysis and get involved in IT projects that affect them and their patients.
  • IT is not, by itself, specifically related to patient harm or patient safety – it’s an enabler of management and processes, whether good or bad. Technology is not a panacea.
  • Clinicians can’t let the computer override their critical thinking, yet computer systems encourage them to.

Which hospital uses of IT have driven the biggest improvement in patient outcomes?

Common Themes Expressed

  • Hospitals need to define their quality goals, track their baseline quality, and then go after improvements.
  • Real-time alerts and notifications can affect patient outcomes dramatically.
  • Population health analytics can drive some of the biggest improvements beyond systems that just affect inpatient stays.
  • Well-defined and closed areas have the most impressive IT-driven improvements: ED, pharmacy, and OR.
  • Pharmacy-related IT has driven major patient care improvements: electronic medication administration record, barcode checking of drugs at the bedside, alerts for drug-drug interactions and other patient-specific problems.
  • Telemedicine makes it possible to use hard-to-find expertise more broadly.
  • PACS has dramatically changed how clinicians use diagnostic images and how radiologists work.
  • Data analysis can pinpoint areas of potential improvement and allow ongoing monitoring.
  • Technologies, even simple ones, that allow clinicians to communicate more effectively can have a significant patient impact.

A lot to think about here.
___

AUG 23rd UPDATE: 
CMS STAGE 2 AND 2014 CEHRT FINAL RULES RELEASED


1,146 pages total (474 and 672 pages respectively). Just started combing through them (yeah, cool and all, but what I really want to see is the HIPAA Omnibus Final Rule).

Took up a 4" 3-ring binder (2-sided printing).

A NOTICE WE GOT TODAY:
How to Play by the (Final) Rules:
An Overview of Meaningful Use Stage 2 & the Standards and Certification Criteria Final Rules
Aug 24th, 1:00-2:30pm EDT

After months of speculation, the final rules for Meaningful Use Stage 2 and the Standards and Certification Criteria have been released. Friday, August 24 at 1 pm EDT, NeHC will host experts on both of these rules for a single webinar to walk through the intricacies and answer questions on each rule. First, NeHC will be joined by Rob Anthony from CMS’s Office of E-Health Standards and Services to give an overview of the final rule and answer attendee questions. Rob will also discuss the feedback that was incorporated into the final rule and what this means for those ready to attest for both Stage 1 and Stage 2. Then, NeHC will once again welcome Steve Posnack, Director of the Federal Policy Division at the Office of the National Coordinator for Health IT (ONC) to join us for an in-depth look at the 2014 Edition Standards and Certification Criteria final rule. Steve will discuss the revised definition for Certified EHR Technology, identify changes from the proposed rule, and give his insight into the next steps.

If you cannot attend Friday's program, NeHC will be repeating this program next week. Please watch your email for updates on upcoming dates and times of the repeat webinars.
Faculty:

Rob Anthony - Office of eHealth Standards and Services, CMS
Steve Posnack - Director of the Federal Policy Division, ONC

PROGRAM INFORMATION
Website: http://www.nationalehealth.org/FinalRules
Registration required*
Fee: No Fee
___

Should be interesting. We'll be attending.

Apropos of Stage 2 and the concomitant EHR re-cert requirements, I screen-scraped some data off a CMS page today, dropped them into Excel, and cleaned them up a tad:

Table 2
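
By the way, if you'd rather skip the manual Excel step, a scrape-and-clean pass like that can be sketched in a few lines of Python. This is a hypothetical illustration only, not what I actually ran: the URL is a placeholder standing in for the real CMS page, the table index is an assumption, and it presumes pandas (plus lxml for HTML parsing and openpyxl for the Excel write) are installed.

  # Hypothetical illustration: pull a table off a CMS page and tidy it up.
  # The URL is a placeholder, and the table index is an assumption.
  import pandas as pd

  CMS_URL = "https://www.cms.gov/placeholder/ehr-incentive-payment-schedule"

  tables = pd.read_html(CMS_URL)   # one DataFrame per <table> found on the page
  df = tables[0]                   # assume the payment schedule is the first table

  # Light cleanup: trim header whitespace, drop empty rows,
  # and strip "$" and "," so dollar figures become plain numbers.
  df.columns = [str(c).strip() for c in df.columns]
  df = df.dropna(how="all")
  df = df.replace(r"[\$,]", "", regex=True)

  df.to_excel("cms_incentive_schedule.xlsx", index=False)   # needs openpyxl
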
Addressing just the Medicare EP side for a moment: Had you been sufficiently adroit to have attested in 2011, you now get to glide through Stage 1 through 2013 (assuming effective ongoing MU dashboard vigilance), at the end of which you will have been eligible for 86.4% of the total $44k incentive reimbursement potential (the max amount contingent, of course, on your annual allowable Part B claims). You will then have to upgrade to a Stage 2 certified system and amend your workflows to comply with Stage 2 criteria in order to collect the relatively piddly remaining $6k ($4k in 2014 and $2k in 2015). Then, in 2016, you'd be starting Stage 3 -- for no ensuing incentive money (and you can be sure the vendors will charge you for both upgrades).
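
Incidentally, that 86.4% figure is just the published payment-schedule arithmetic. Here's a minimal sanity check in Python, assuming the standard $18k/$12k/$8k/$4k/$2k Medicare EP ladder for a 2011 starter (which sums to the $44k cited above):

  # Sanity check on the 86.4% figure: Medicare EP incentive schedule
  # for a provider who first attested in 2011 (the standard $44k ladder).
  schedule = {2011: 18000, 2012: 12000, 2013: 8000, 2014: 4000, 2015: 2000}

  total = sum(schedule.values())                                    # 44,000
  through_2013 = sum(v for y, v in schedule.items() if y <= 2013)   # 38,000
  remaining = total - through_2013                                  # 6,000

  print(f"Through 2013: ${through_2013:,} of ${total:,} ({through_2013 / total:.1%})")
  print(f"Remaining under Stage 2: ${remaining:,}")
  # -> Through 2013: $38,000 of $44,000 (86.4%)
  # -> Remaining under Stage 2: $6,000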

If you're attesting in 2012, your upslope path is even more challenging; your 5th year of participation puts you in Stage 3 in 2016. You're gonna pay for two upgrades and do two workflow/training revisions.

If you miss 2012 and have to start attestation in 2013 (a real possibility for some of the more unruly clients on my personal caseload), well, the case for doing it at all gets even more difficult (see Table 1 at the top of this post).

Moreover, absent some sort of renewed funding source, RECs are effectively finished this time next year. So, all Brave REC Supportive Talk aside, you're gonna be on your own or paying fee-for-service (which, again, begs the ROI question, given a much-bandied-about price point of ~$125/hr).

ONC is paying lip service to REC support, but that's about the extent of it, as far as I can determine. But, in fairness, it's not a propitious time to be going to the Hill or to the administration groveling for more REC money. Moreover, should we have a change in the White House this November (a real possibility, IMO), all bets will be off -- "bipartisan support for HIT" kumbaya talk notwithstanding.


No subsequent news. No web content. No blog. No LinkedIn content. No Facebook. No Twitter. Nada, Zip, Zilch. I guess the "Trade Association" strategy here escapes me. I would have had all of that stuff and more teed up on Day One. Prior to Day One, actually. It's.Not.That.Difficult.

Don't let any mold accrue on your CVs.
___

AUG 24th EARLY BIRD UPDATE:
INTERESTING ARTICLE
Health Care: An Alternate Economic Universe
(Jeff Goldsmith, props to TCHB)

...Health care in the US is changing, and becoming more disciplined, team-based, and protocol-driven. However, the culture of the US health system has, as yet, changed very little. The primordial impulse is to add more (and more expensive) workers whenever new problems need to be solved or new technologies appear, heedless of the expense.
Hospital executives continue speaking wistfully and inaccurately of “reimbursement” as the source of their revenues. This retro word conveys the distinct subtext that they have no responsibility for the cost of their product, that money has been spent, and someone owes them “reimbursement” for it. The proper term is “payment”, and the operative societal question is “are we receiving value for money?” in that payment.
However, most of the new payment models under intense scrutiny — from accountable care, to bundled payment, to “ambulatory intensive care” for dual eligibles, etc. — only pay off if they markedly reduce hospital use in particular. Despite a (slowly) aging population and (hopefully) better access through health reform, the trend line for use of our most expensive health resources will likely turn downward as we reduce avoidable use of our system’s most expensive resources.

An Unsustainable Status Quo

But the cost of health care that remains is still far too high to be affordable long term. Those costs will only be reduced by better coordinated care, and by marked improvements in clinical and organizational productivity, a revolution these data suggest has yet to begin. The supply side of the US health care system remains impressively insulated from cost pressure, and focused on the myriad challenges of growth and revenue enhancement.
At some point on the path to deficit reduction, gravitational forces will assert themselves. Policymakers can assist in that process by re-examining the economic logic of the transactional density and documentation burden they are imposing on caregivers. We will know that economic pressure on the health system has reached a decisive juncture when health sector employment stabilizes or reverses course, and health care providers join the rest of the economy in seeking improved productivity and product quality as necessary strategies to survive.

Ya think? That's Toussaint 101, is it not? (See Potent Medicine and On The Mend.)
___

SUMMARY MU2 SLIDES I FOUND




Click images to enlarge.
___

More to come...