
Tuesday, June 28, 2011

Meaningful Manual Labor

A few EHR vendors have begun to distribute detailed Meaningful Use upgrade instruction manuals (via PDFs -- I'm becoming rather adept with the comb binder). A couple are very good, essentially A to Z workflow instructions (replete with "money field" screen shots) for hitting all of the Stage One measures.

Now all we need are about 30 more of these, given that we at the HealthInsight REC are "vendor neutral" and have to support providers on close to three dozen ONC certified platforms.

e-prescribing is a really big deal at HHS. We have the Meaningful Use Core Measure piece, PQRS, and the MIPPA piece. e-Rx purports to bring a host of improvement/cost reduction benefits.


Cautionary reporting from Bloomberg:
“...Providers appear to be rapidly adopting electronic health records and computerized prescribing, and one of the major anticipated benefits is expected to be through medication-error reduction,” the researchers wrote. “Many of these benefits will not be realized if the electronic prescribing applications are not mature and either do not catch or even cause new medication errors.”

Wrong Dose
The most common error was the omission of key information, such as the dose of medicine and how long or how many times a day it should be taken, the researchers said. Other issues included improper abbreviations, conflicting information about how or when to take the drug and clinical errors in the choice or use of the treatment, the researchers said.

“With more than 3 billion prescriptions written annually in the U.S. alone, this could amount to 385 million errors each year, with 128 million of them having the potential to cause patient harm, said researcher Jeffrey Rothschild, from the center for patient safety research and practice at Brigham and Women’s Hospital in Boston..."

No shortage of improvement work to be tended to. "Usability," anyone? And that's beyond workflow improvement more generally.


My last couple of posts have dwelled at some length on ePHI privacy and security issues. Check this out: Beltway veteran KPMG was just awarded a $9,179,011 contract to administer ARRA-HIPAA audits for the HHS OCR.
The protocol and audit program performance requested under this contract shall assist OCR in operating an audit program that effectively implements the statutory requirement to audit covered entity and business associate compliance with the HIPAA privacy and security standards as amended by ARRA...

...The government anticipates completing 150 audits of entities varying in size and scope...

$9,179,011 divided by 150 = $61,193.41 per audit.
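The per-audit figure is straightforward to verify; a minimal sketch (the variable names are mine, and the figures are simply those cited above):

```python
# Back-of-the-envelope check of the OCR/KPMG audit contract math.
contract_total = 9_179_011   # KPMG contract award, in dollars
audits_planned = 150         # audits the government anticipates completing

per_audit = round(contract_total / audits_planned, 2)
print(f"${per_audit:,.2f} per audit")  # → $61,193.41 per audit
```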

I'm in the wrong business.

NIST, ONC plan measures, testing to improve health IT usability
Posted on April 25, 2011 by Theo Mandel, Ph.D.

I’ve worked in healthcare usability for a long time, and with the impetus to move all of healthcare to electronic platforms, there have been many, many unusable implementations of EHRs and EMRs.

The National Institute for Standards and Technology (NIST) and Office of the National Coordinator for Health IT (ONC) are working to provide guidelines for healthcare software usability...

au contraire

The following was submitted to ONC during their public review and comment period for their Usability certification proposal.
Usability Testing for EHRs is a Bad Idea
May 6, 2011 – by Richard M. Low, MD, CEO of Praxis® Electronic Medical Records

As a physician and as the developer of Praxis EMR, a product that has consistently ranked #1 in user satisfaction, I am concerned about the government's plans for EHR usability testing.

The EHR Usability Testing Plan you have submitted for public comment contains a fundamental flaw. The problem is that you have equated what usability means to doctors with what ease-of-use means to software developers. You begin with an observation that is on target (usability has been the #1 barrier to EHR adoption) and proceed with a well-intentioned but unwise goal (to use the power of government to improve the usability of EHRs). From there, you propose a testing approach that is based on several misconceptions.

Up till now, you deserve praise for your agency's excellent work and for your transparent decision-making process, especially considering the extreme pressures under which the agency is working. Indeed, you have done precisely what government should do: help create order out of industry chaos by setting standards; obtain critical information for public health purposes; and protect the public in terms of information security.

Equally impressive is what you decided not to do, which is to delve into product workability. This decision demonstrated an understanding on your part that the industry is in full innovation and development mode, and that rulings of this sort would harm rather than help the development of excellence in Electronic Medical Records.

It would be a shame to tarnish your exceptional record thus far by venturing into the dubious area of usability testing.

No doubt the move in this direction is prompted by the large number of unusable EHRs in the market and by the great physician dissatisfaction with many of the solutions out there. Further, computer experts have assured you that usability testing is not only possible, but an effective solution to the problem. And so the government is now seriously considering a program that will rate EHR usability.

Those of us who have been in the industry long enough remember CCHIT's attempt to do the same thing. The ratings that it issued were highly suspect, with doctors giving damning reviews to EHRs that received CCHIT's seals of approval.

To give credit where credit is due, you begin by defining usability in terms that we can all agree with:

The effectiveness, efficiency, and satisfaction with which the intended users can achieve their tasks in the intended context of product use.

From there, however, you create a testing approach that is based on several misconceptions.

First, you assume that an EHR is a "computer thing." It is not. It is a medical tool. To think of an EHR as a "computer thing" is akin to the early days of the printing press—500 years ago—when a book buyer would ask a printer for his opinion on a tome. The printer would then examine the cover, the parchment, and the binding, and pass judgment on the book's worth. Today, we all know not to judge a book by its cover. Nonetheless, by advocating usability testing, the government is creating an analogous situation; when it comes to EHRs, usability experts are the "printers" of the Information Age.

And because of the assumption of EHRs as computer things, you arrive at your second misconception, which is that EHRs should reduce their learning-curves by following the lead of commonly-used software programs. In other words, since most people who have used a computer are familiar with how pulldown menus and templates work, EHRs should go down this well-travelled road. Doing so would make the use of EHRs intuitive, or so the thinking goes.

In effect, you are applying computer thinking to a medical tool. This is a bad idea.

In this short paper, I would like to share with you my experience with EHR usability and dispel the assumptions that underlie the proposed usability testing of EHRs. My views are based on what I have learned from the development of Praxis for over 20 years, and from studying hundreds of clients who use our EHR. In addition, these views are informed by my 20 years as a Board Certified Internist and by my experiences in the areas of Emergency Medicine and Primary Care.

Templates Do Not Work
I am rather fond of Robert Frost, and to paraphrase a line from one of his most famous poems, at Praxis, we decided to go down the road less travelled by.

The first EMRs came out when I was working as a physician in Los Angeles. As an amateur computer programmer, I have always been intrigued by computers and their potential to dramatically improve the practice of medicine. I immediately came to the conclusion, however, that these first EMRs were headed down the wrong road. They were all based on templates, replete with drop-down menus and pick lists.

It has now been widely reported in medical journals that the use of templates in EMRs poses several challenges. First, they slow down the doctor by requiring him or her to click through menus and find the right template for the case at hand. Second, they lead to increased medical and coding errors; this has been amply documented and you may refer to articles such as Hidden Malpractice Dangers in EMRs in the Medscape Business of Medicine, The Problems with EHRs and Coding in Modern Medicine, and Compliance Risks Grow with Electronic Medical Record Systems in AIS Compliance, to list just a few from the growing body of literature on just this one issue. Third and perhaps most important, templates restrict the way doctors think through a medical case; see, for instance, Dr. Jerome Groopman's book, How Doctors Think and two articles from the New England Journal of Medicine, Off the Record—Avoiding the Pitfalls of Going Electronic and Can Electronic Clinical Documentation Help Prevent Diagnostic Errors?

My disappointment with the early EMRs led me to found Praxis Electronic Medical Records. I assembled a team of developers, explained what an EMR should do, and gave them one prohibition: templates were not to be used.

I also gave them two guiding principles: 1) the noise to information ratio must be optimal, letting the user teach the program which is which, not the other way around; and 2) the program should never make any attempt to "practice medicine," but rather strive to follow the doctor's lead.

The Road Less Travelled By
Praxis EMR has been on the market for over 20 years now. It has consistently ranked #1 in user satisfaction surveys conducted by the American Academy of Family Physicians and in KLAS.

However, I suspect we would do poorly in usability under your proposed testing approach.

Praxis EMR does not look like other EMRs. It is not template-based and it does not utilize the logic of computer science. Rather, we use the logic of medical thinking as our guiding principle.**

Doctors do not all think through a medical case the same way. In fact, there is a certain art to the practice of medicine. Doctors, therefore, must be given full liberty to express themselves when they document a medical case. An EHR, then, should give them the flexibility to use their own words and to input information in the order they prefer.

Praxis gives them this freedom. The heart of Praxis is a neural network engine that learns directly from the user. As a doctor inputs his or her medical concepts, Praxis remembers them and presents them to the doctor the next time a similar case comes up. As Praxis presents these concepts, the doctor can edit them if the situation calls for it. Praxis then remembers this revised concept, as well. Over time, the need for editing is drastically reduced and soon a doctor is documenting cases in mere seconds.

In the proposed testing environment, however, Praxis would be just beginning to learn from the tester and its full potential would not be realized.

The Proper Measure
I submit to you that the most reliable measure of an EHR's usability is user satisfaction. Of course, user satisfaction is not the same as usability, but it cannot be denied that the two are directly linked. Usability leads to user satisfaction. Therefore, instead of wrestling with the proper way to measure usability, the government would better serve the medical profession by measuring user satisfaction based on the real world experiences of doctors using different EMR products. This has already been done successfully by medical associations such as the American Academy of Family Physicians and by research firms like KLAS.

I understand the pressures you must be under to mitigate the terrible press that EMRs have received in general. But usability testing is not the solution. If you go down this path, you will pave the road to mediocrity. Fantastic, innovative ideas presently percolating in the EMR community would likely never see the light of day. The industry would kowtow to your lead, shelving innovative ideas to focus solely on scoring the highest on your usability tests. We have seen this happen before with CCHIT.

It is not the government's role to determine EHR usability. Presently, there are innovative EHRs in the market receiving good press and high marks on usability. Do not burden them with processes and requirements that will lead them to mediocrity. Rather, let the real experts, the medical professionals who use EHRs as part of their important everyday work, voice their preferences in the market.

Please trust the market!

A lot to ponder. Including, tangentially, apropos of ** "the logic of medical thinking as our guiding principle..."

See also my prior "cognitive traps."


July 16, 2011
Seeing Promise and Peril in Digital Records

TECHNICAL standards may seem arcane, but they are often powerful tools of economic development and social welfare. They can be essential building blocks for innovation and new industries. The basic software standards for the Web are striking proof.

Safety is also a potent argument for standards. History abounds with telling examples, like the Baltimore fire of 1904. That inferno blazed for 30 hours, destroying more than 1,500 buildings across 70 city blocks. Fire engines from other cities came to help, but could not. Their hose couplings — each a different size — did not fit the Baltimore fire hydrants. Until then, cities saw little reason to adopt a standard size coupling, and local equipment manufacturers did not want competition. So competing interests undermined the usefulness of, and investment in, the technology of the day.

Today, the matter of standards for electronic health records is raising similar concerns, prompting heated debate in recent meetings of representatives from medicine, industry, academia and government. The stakes, they say, could scarcely be higher. They agree that, when well designed and wisely used, digital records can deliver the power of better information to medicine, improving care and curbing costs. But computer forms, they add, can also be difficult to use, cluttered and distracting, causing more harm than good in health care.

“This is an issue that potentially affects the health and safety of every American,” says Ben Shneiderman, a computer scientist at the University of Maryland.

The controversy points to the delicate balancing of interests involved when creating technical standards that inherently limit some design choices yet try to keep the door open to innovation. It also raises the question of the appropriate role for government in devising such technology requirements.

At issue is the Obama administration’s plan to develop standards to measure how effective and easy digital patient records are to use — applying a research discipline known as human-computer interaction or human factors. (The International Organization for Standardization, which is based in Geneva, defines the usability of a product by three attributes: “effectiveness, efficiency and satisfaction.”)

The need to improve the usability of computerized records is clearly evident — and has been for some time. A 2009 study by the National Research Council, an arm of the National Academy of Sciences, found that electronic health record systems were often poorly designed, and so could “increase the chance of error, add to rather than reduce work flow, and compound the frustrations doing the required tasks.”...

Interesting. See also
So Many EHRs. So Expensive.

There are currently 386 software packages certified by an ONC approved certification body as ambulatory Complete EHRs, which means that the software should allow the user to fulfill all Meaningful Use requirements and possibly qualify the proud owner for all sorts of CMS incentives. There are 204 more software packages which are certified as ambulatory EHR Modules, and a proper combination of these packages could result in a Complete product, which if used appropriately could lead to the same fortuitous results...

More to come...

Saturday, June 18, 2011

Use Case

The phrase "use case" is really just software engineering jargon for "example," i.e., an example of how a user interacts with a software application to reach a goal. What follows is a sort of emblematic HIT failure "use case" setup for J.D. Kleinke's "Dot-Gov: Market Failure And The Creation Of A National Health Information Technology System" (PDF). Given that I live in Las Vegas and work for our REC (adjunct to which we are now aggressively moving forward with a Health Information Exchange (HIE) initiative), I found this quite interesting.
JOE WILSON, a thirty-eight-year-old software engineer from Pittsburgh on his first trip to Las Vegas, is feeling out of sorts as he walks into the casino. The free drinks help his mood, and he discovers after his third that he has a serious gambling problem. At the casino’s cage, where cash, credit card capacity, and creditworthiness are turned into chips, Joe liquidates the $6,324 in his checking and savings accounts and another $8,121 from his credit cards. Twenty hours and ten drinks later, this money runs out on the craps table, but Joe secures another $24,983 in cash, which represents 40 percent of his retirement account and 30 percent of the equity in his home. The casino was able to find out this information about him—as well as his marital status, the names of his last three employers, the number of years he has lived at his current address and worked for his current employer, the lapsed status of an earlier life insurance policy and the paid status of another, and the absence of liens against his assets—in less than five minutes, based on his name and Social Security number. In twelve more hours, Joe has burned through the last of his cash. Then the chest pain starts.

Leaving Las Vegas. Joe is rushed to a community hospital a few blocks from the casino. He is drunk, dizzy, and disoriented and cannot give the emergency room (ER) doctors any information about his medical history. But he is able to produce a tattered insurance card from his wallet, which includes his Social Security number, listed as his “Member ID.” The hospital’s admissions clerk spends twenty minutes on the phone to confirm that Joe is indeed covered by the health insurer in Pittsburgh and that she should collect up to $500 from Joe for his visit; but the insurer’s “information specialist” cannot say exactly how much, because he cannot tell from “the computer” if Joe has met his deductible yet that year. He can tell the hospital nothing else about Joe—not his medical history, the names of his physicians, or any medications he might be on—because “the computer” has no other information about him. That information is “in the other computers.”

Joe’s condition worsens; the ER physician diagnoses a heart attack and prescribes intravenous metoprolol, a generic beta blocker. What she does not know is that until a month earlier, Joe had been taking 20 milligrams per day of Paxil (paroxetine) for depression. But a month before Joe’s trip to Vegas, his employer’s health plan had switched to a new pharmacy benefit management (PBM) company, which required Joe and his coworkers to fill their medications for chronic conditions via mail order. One of Joe’s doctor’s three medical assistants had faxed the doctor’s usual handwritten prescription for Paxil to the mail-order pharmacy.

The mail-order pharmacist misread the prescription as “Plendil,” a calcium channel blocker often used for the same purposes as beta blockers and commonly dosed at 10 milligrams per day but occasionally at 20 milligrams for patients with congestive heart failure. Joe had been dutifully taking the medication for the past few weeks, walking around with dangerously low blood pressure caused by high levels of the unneeded medicine. Joe’s depression had also been slowly, imperceptibly returning—hence his unusual appetite for alcohol, which lowered his low blood pressure even further, resulting in wooziness and cognition problems severe enough to render Joe vulnerable to the casino’s temptations.

In the ER, the metoprolol does the trick within minutes of entering Joe’s bloodstream: His blood pressure plummets, he goes into cardiac arrest, and he dies.

HIT market failure
The underlying cause of Joe’s death is health information technology (HIT) market failure. If the state of U.S. medical technology is one of our great national treasures, then the state of U.S. HIT is one of our great national disgraces. We spend $1.6 trillion a year on health care—far more than we do on personal financial services—and yet we have a twenty-first-century financial information infrastructure and a nineteenth-century health information infrastructure. Given what is at stake, health care should be the most IT-enabled of all our industries, not one of the least. Nonetheless, the “technologies” used to collect, manage, and distribute most of our medical information remain the pen, paper, telephone, fax, and Post-It note.

Meanwhile, thousands of small organizations chew around the edges of the problem, spending hundreds of millions of dollars per year on proprietary clinical IT products that barely work and do not talk to each other. Health care organizations do not relish the problem, most vilify it, many are spending vast sums on proprietary products that do not coalesce into a systemwide solution, and the investment community has poured nearly a half-trillion dollars into failed HIT ventures that once claimed to be that solution. Nonetheless, no single health care organization or HIT venture has attained anything close to the critical mass necessary to effect such a fix.

This is the textbook definition of a market failure.
All but the most zealous free-market ideologues recognize that some markets simply do not work. Indeed, reasoned free-market champions often deconstruct specific market failures to elucidate normal market functioning. The most obvious examples of such failures (such as public transit and the arts) are subsidized by society at large because such subsidies yield benefits to the public that outweigh their costs. Economists refer to these net benefits as “positive externalities,” defined as effects that cannot be captured through the economic equation of direct cost and benefit.

The positive externalities of an HIT system approaching the functionality of our consumer finance IT system include reduction of medical errors like the one that killed Joe Wilson; elimination of tens of thousands of redundant and expensive tests, procedures, and medications, many of which are not only wasteful but harmful; and the coordination and consistency of medical care in ways only promised by the theoretical version of managed care. These public health benefits are well beyond the reach of a health care system characterized by the complexities of medicine and conflicts of multiple parties working at economic cross-purposes. They are trapped outside the economic equation, positive externalities of a stubbornly fee-for-service health care system that inadvertently rewards inefficiency, redundancy, excessive treatment, and rework...

Health Care’s Blue Screen Of Death: Reboot Or Reform?

The compulsion today is to find the elusive “business case” for health care IT. Legions of IT vendors and consulting companies have struggled to cobble together “the ROI” (consultantspeak for “return on investment”) to prove that an individual health care organization would benefit by investing in better IT and that the failure to date has been merely a cultural problem on the demand side (“the doctors won’t use computers”) or a sales problem on the supply side (“it’s all vaporware”). These objections are hardly sufficient to stop a force as revolutionary as IT. The practical reality is that the typical ROI is modest at best, ephemeral for most, and attainable only well past its investment horizon—a dressed-up way of saying that it exceeds the political capital of its current CEO and CIO. If there were a strong business case for a health care organization to break from the pack and build out a twenty-first-century IT system, we would have no need for this paper—or, for that matter, this entire issue of Health Affairs. If the health care IT market worked, it would have worked by now.

The ability of the gambling industry to liquidate Joe Wilson’s assets within minutes is an example of IT market success; the inability of the health care industry to catch a simple medical error during his half-day in the ER is an example of IT market failure. All parties involved in consumer financial transactions have an economic interest in seeing that those transactions work as smoothly as possible. Not so all parties involved in health care’s myriad transactions.

The business case for no HIT. The first step in understanding the real intractability of the problem is ignoring the rhetoric. There is a veritable cottage industry involving the articulation of moral outrage over the health care quality “crisis,” much of it public relations spadework for someone’s political or commercial ambition and most of it culminating in the naïve insistence that the system is on the verge of collapse and cannot go on like this. Actually, it can and will go on like this forever, absent any major intervention by the nation’s largest health care purchaser—the U.S. government.

Why? Because in the crude fee-for-service (FFS) reimbursement system inherited by that purchaser in the 1960s and fundamentally unchanged since then, the Las Vegas hospital has little real interest in knowing Joe’s medical history. In most cases, access to such information would represent a reduction in billable services. In an industry rife with dirty little secrets, this is health care’s dirtiest: Bad quality is good for business. And the surest road to bad quality is bad or no information. The various IT systems out there are expensive to buy, implement, and train staff to use, but this expense pales in comparison to all of the pricey and billable complications those systems would prevent...

This article was published in the fall of 2005. Change a few NHE numbers and the then-leading wonk names, and it could have been written yesterday. Read it closely, all of it. It is excellent.

Now, we might also consider some thoughts stemming from "Property, Privacy and the Pursuit of Integrated Electronic Medical Records" (Hall, Turnage, and Turnage, PDF), another incisive piece.
...Even though e-health is growing steadily and will soon exist in some form just about everywhere, the electronic systems that are in place rarely interconnect -- a problem that is getting worse rather than better. The RAND Corporation summarizes that “the ability to share information from system to system is poor.” This is because there “is no market pressure to develop HIT systems that can talk to each other.” Instead, the “piecemeal implementation currently under way may actually create additional barriers to the development of a future standardized system because of the high costs of replacing or converting today’s non-standard systems.”

The challenge is how to move an enterprise representing one-sixth of US GDP, with 13 million employees and potentially almost 300 million patients, from a decentralized, fragmented, paper based world, to an integrated, automated, networked world where information follows the patient, information-based tools can aid in decision making and quality, and population health data can be mined to improve the quality and outcome of care for all...

The authors note that ongoing legal uncertainties regarding "ownership" of patient-identifiable medical data (is it "property"? whose "property"?), coupled with strict federal and state-level privacy regulation of PHI (Protected Health Information) comprise significant potential barriers to effective, nationally seamless HIE (Health Information Exchange).
Who owns medical information? Patients, providers, both of the above, or no one? The law provides incomplete, unclear, and somewhat inconsistent answers ... property rights must be clearly established so that the respective parties know their legal default positions ... The relevant parties are in a quandary over who owns or controls what and so they do not know for sure what needs to be done to construct any particular information network model ... Accordingly, it matters a great deal to real-world actors who has exactly what rights in different aspects of medical record information.

Medical information has considerable commercial value. “[A] well-established multimillion-dollar business exists that utilizes secondary health data as its primary resource,” for purposes such as marketing to physicians or conducting medical research. Legal uncertainty or agnosticism over valuable property rights can spark a land grab that hoards rather than develops these productive assets. Once one party stakes its ownership claim, then so must all the other competing parties, for fear of being trumped. But, fencing off the terrain of medical information destroys the commons that might have supported valuable public goods. Witness the A.M.A.’s proclamation quoted above that physicians own the medical information they collect. Likewise, the Center for Studying Health System Change observed that hospitals’ greatest concern with I-EMRs is “losing competitive advantage by relinquishing control of ‘their’ data. They view[] clinical data as a key strategic asset, tying physicians and patients to their organization.”

Legal logjams also arise from privacy protections. Medical privacy is important, but we may be protecting it to a fault...

...From the economic perspective of investing in medical information, the lack of clear property rights plus the presence of strong privacy protections is the worst of both worlds. Privacy protections increase the costs of developing I-EMRs and uncertain property rights decrease the returns. How these barriers and uncertainties are resolved could determine the kinds of networks that will emerge and how efficiently they can form...

Hall et al. point out that "facts" cannot be "owned." In a general, mundane sense, that is indeed the case -- not that it has ever stopped various commercial interests from trying to lay exclusive claim to all manner of public domain information. And, then, there's the complicating factor of myriad "facts" that go to "proprietary information" and "intellectual property," which are indeed "owned" and jealously guarded by their proprietors (not to mention the marketable "likenesses" of celebrities).

I have blue eyes. Who cares? I am 5'10" tall. Who cares? No one -- including me -- can really profit from knowing those facts. I weigh 170 lbs -- which makes my BMI (Body Mass Index) 24.4. Suddenly, things are a bit different, and if I weighed 50 lbs more (220) my BMI would be 31.6 (officially "high"), which would perhaps be of commercial partisan interest to [1] people trying to sell me weight loss products and services, [2] employers wanting to minimize their health care benefits cost risks, and, relatedly, [3] insurance companies (life or health) looking for an excuse to raise my rates or exclude me from coverage.
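The BMI figures above can be checked against the standard imperial formula, BMI = 703 × weight (lbs) / height (inches)²; a minimal sketch (the function name is mine):

```python
# BMI from pounds and inches: BMI = 703 * weight_lb / height_in**2
def bmi(weight_lb: float, height_in: float) -> float:
    return round(703 * weight_lb / height_in ** 2, 1)

print(bmi(170, 70))  # 5'10" at 170 lbs → 24.4
print(bmi(220, 70))  # 50 lbs heavier   → 31.6
```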

We pretty much all understand the potential adverse implications of trafficking in people's personal data, ranging all the way from targeted marketing, to all manner of gumshoeing and discrimination (regarding which your EHR/HIE clinical information comprises a potential treasure trove), to financial and identity theft.

(Courthouse News, June 20, 2011)
(CN) - Federal medical-privacy laws do not preempt California's own rules against doctors disclosing patient information to debt collectors, the state Supreme Court ruled.

The unanimous court revived Robert Brown's longtime crusade to hold dentist Rolf Reinholds accountable under California's Confidentiality of Medical Information Act.

Way back in 2000, Reinholds billed Brown $600 for a permanent dental crown that Brown claimed he never received. Brown refused to pay the bill, and Reinholds sent the matter to a bill collector and a credit agency, along with a copy of Brown's dental charts and the charts of Brown's minor children.

Debt-collector Stewart Mortensen in turn disclosed the Browns' confidential medical information to the consumer-reporting agencies Experian, Equifax and Trans Union, and also sent along the family's Social Security numbers, dates of birth, addresses, telephone numbers and entire dental histories.

After two years of repeated, unsuccessful demands that Mortensen stop the unauthorized disclosures, Brown sued Reinholds and Mortensen, alleging violations of the Confidentiality Act, which generally prohibits the unauthorized dissemination of medical information.

The trial court found Brown's claims too vague and dismissed them. The Court of Appeal affirmed, though for different reasons. It ruled that Brown's state law claims were preempted by the federal Fair Credit Reporting Act (FCRA).

Not so, the state's high court said Thursday.

Congress chose not to address "the scope of a medical provider's duties when furnishing information to a consumer reporting agency" in the FCRA, the court found. And while Congress did address the issue somewhat in the Health Insurance Portability and Accountability Act (HIPAA), it simultaneously allowed "states to continue to regulate to the extent they desired to enact more stringent, privacy-favoring legislation," according to the ruling...

HIPAA is unique in that it reverses the otherwise default legal custom of "federal preemption" -- i.e., where state law and regulation conflict with their explicit federal counterparts, the latter typically prevail. With respect to HIPAA, state law/regulation found to be "more stringent" in the protection of PHI trumps HIPAA regulation.

A recent example from Ohio:

Privacy & Information Security Law Blog
Posted at 8:47 AM on May 7, 2010 by Hunton & Williams LLP

State Law Trumps HIPAA in Suit Over Disclosure of Medical Records

Rejecting a defense based on compliance with the federal Health Insurance Portability and Accountability Act of 1996 (“HIPAA”), a federal court in Ohio denied a medical clinic’s motion to dismiss invasion of privacy claims following the clinic’s disclosure of medical records to a grand jury. In Turk v. Oiler, No. 09-CV-381 (N.D. Ohio Feb. 1, 2010), plaintiff Turk had been under investigation for illegally carrying a concealed weapon and for having a weapon while under disability in violation of an Ohio law which provides that “no person shall knowingly acquire, have, carry, or use any firearm” if “[t]he person is drug dependent, in danger of drug dependence, or a chronic alcoholic.” Defendant Cleveland Clinic, where Turk was a patient, received a grand jury subpoena requesting “medical records to include but not be limited to drug and alcohol counseling and mental issues regarding James G. Turk.” When the Cleveland Clinic disclosed Turk’s medical records in response to this subpoena, Turk sued the clinic for violating his privacy rights.

In its defense, the clinic argued that a specific exemption in HIPAA permits such disclosure of medical records in response to a grand jury subpoena. Ohio’s physician-patient privilege, however, provides that a physician cannot testify as to “a communication made to the physician . . . by a patient in that relation or the physician’s . . . advice to a patient.” The court found that the term “communication,” as used in the statute, includes hospital records “and is sufficiently broad to cover any confidential information gathered or recorded within them during the treatment of a patient at the hospital.” Because the HIPAA provision exempting the disclosure would not preempt this more restrictive state law, the court denied the clinic’s motion and refused to dismiss Turk’s privacy claim. That decision may have prompted a settlement, as this week, the court granted a request by Turk to dismiss all of his claims against the clinic.

I could go on. The more I look, the more I find.

Lately I've been skulking around down in the bowels of various state laws and regulations pertaining to health information privacy. One good launch pad has been the Georgetown University Center on Medical Rights and Privacy.

A few random snips from their in-depth publications on state laws (Volume 1, Alabama-Montana, PDF; Volume 2, Nebraska-Wyoming, PDF):
New Hampshire: The medical information contained in the medical records in the possession of any health care provider is the property of the patient. [N.H. Rev. Stat. § 332-I:1.]

Medical information contained in the client’s record of a home health care provider is deemed to be the client’s property and the client has the right to a copy of these records upon request and at a reasonable cost. [N.H. Rev. Stat. § 151:21-b(II)(i).]

The medical information contained in the medical records at any licensed health care facility is the property of the patient. [N.H. Rev. Stat. § 151:21(X).]

Virginia: Although patient records are the property of the provider maintaining them, Virginia recognizes a patient’s right of privacy in the content of his medical records. [See Va. Code Ann. § 32.1-127.1:03(A).]

Medical records maintained by health care providers are the property of the provider, or the employer if the health care provider is employed by another health care provider. [Va. Code Ann. § 54.1-2403.3.] Patient records may not be transferred with the sale of a professional practice until an attempt is first made to notify patients of the pending transfer, by mail, at the patient’s last known address, and by publishing prior notice in a newspaper of general circulation within the provider’s practice area. The notice must inform patients that at their written request, within a reasonable time, records or copies can be sent to another like-regulated provider of the patient’s choice or destroyed. [Va. Code Ann. § 54.1-2405.]

Florida: The results of the DNA analysis, whether held by a public or private entity, are the exclusive property of the person tested, are confidential, and may not be disclosed without the consent of the person tested. [Fla. Stat. Ann. § 760.40.]

All “records owners,” i.e., any health care practitioner who generates a medical record, receives medical records from a previous record owner, or the practitioner’s employer, if the employer is designated as the records owner, [Fla. Stat. Ann. § 456.057(1) (defining “records owner.”)] are required to develop and implement policies, standards and procedures to protect the confidentiality and security of medical records.

Louisiana: Medical records of a patient maintained in a health care provider’s office are the property and business records of the health care provider. [La. Rev. Stat. Ann. § 40:1299.96(A)(2)(b).]

An insured’s or enrollee’s genetic information is the property of the insured or enrollee.

Mississippi: Hospital records are the property of the hospitals. [Miss. Code Ann. § 41-9-65.]

Indiana: Providers are the owners of original health care records and they may use these records without the specific written authorization of the patient for legitimate business purposes, including submission of claims for payment from third parties; collection of accounts; litigation defense; quality assurance; peer review; and scientific, statistical and educational purposes. [Ind. Code Ann. § 16-39-5-3.]

The provider is the owner of the mental health record and is entitled to retain possession of it. [Ind. Code Ann. § 16-39-2-2.]

South Carolina: Under the Physicians’ Patient Records Act the physician is the owner of the medical record. [S.C. Code Ann. § 44-115-20.]

According to this resource, only New Hampshire unequivocally confers full data "ownership" on patients, though I do find the couple of "genetic information" property entitlements encountered thus far (e.g., FL and LA) interesting as well.

Privacy and Security Solutions for Interoperable Health Information Exchange
Report on State Medical Record Access Laws
(August 2009)

A brief excerpt...
State Laws
States use varying terms to describe the health information encompassed by individuals’ right of access, including, for example, patient records, health records, medical records, hospital records, and patient information. In many states, these terms are undefined [see, e.g., W. VA. Code § 16-29-1 (2008) (where state law gives individuals the right of access to all or a portion of the “patient’s record,” a term which is not defined in the statute or regulations)]. However, provisions in several states expressly define the relevant term in detail, specifically including in some instances medical records or information created by others [see, e.g., N.H. Code Admin. R. Ann. Med 501.02(f)(2) (2008)].

Challenges for an Electronic Environment
The fact that states use varying terms (or fail) to define health information that is subject to a right of access may prove problematic. One issue is whether the medical records or health information subject to the individual’s right of access includes material in the record that came from another source. Some health care providers apparently interpret access to medical records or health information as encompassing only information that was generated within their office or facility. In responding to an individual’s request for copies of medical records, some health care providers exclude any information in their possession that was obtained from other health care providers. While some state law provisions clearly define medical record access as including information furnished by other health care providers, most state laws governing doctors and hospitals do not expressly address this issue. The ambiguity in law on this issue, i.e., whether these health care providers must provide access to health information regardless of the originating source, may continue to prove problematic in an electronic environment where any particular health care provider likely will maintain data that originated from myriad sources.

Lots of potential digital PHI jurisdictional issues remain with respect to medical data "ownership," access, and disclosures. All of the state statutes and regulations I have examined thus far simply refer to the rights of "patients." Nothing about in-state vs out-of-state legal "residents" (indeed, the term "resident" only appears in reference to patients in long-term care settings or psych facilities). So, for example, say you are a legal resident of a state whose PHI law is "more stringent" than HIPAA (and thus nominally trumps HIPAA) and are treated in a facility in another state where HIPAA is the default, which law governs access to (and release of) your PHI? One might reflexively think it'd be the jurisdiction wherein the medical encounter occurred, but it might in fact not be all that clear.

More work for lawyers, eh? To wit...
HIPAA May Provide Basis for State Law Private Cause of Action
[McGuire Woods, LLP, 06/23/11]
The Health Insurance Portability and Accountability Act (HIPAA) imposes requirements on healthcare entities involved in the exchange of health information to protect the confidentiality of such information. It provides both civil and criminal penalties for individuals who improperly handle or disclose individually identifiable health information. HIPAA does not create a private right of action, under federal law. However, a recent decision by a district court in Missouri held that HIPAA may form a basis of a state law “negligence per se” claim.

In I.S. v. Washington University, E.D. Mo., No. 11-235, 6/14/11, the U.S. District Court for the Eastern District of Missouri, refused to dismiss plaintiff’s claim for negligence per se, despite its reliance on HIPAA, and remanded the case to state court. In this case, plaintiff alleged that defendant made an unauthorized release of certain medical records to plaintiff’s employer, which resulted in harm to the patient. Under Missouri law, the elements of a claim for “negligence per se” are: 1) a violation of a statute; 2) the injured plaintiff was a member of the class of persons intended to be protected by the statute; 3) the injury complained of was of the kind the statute was designed to protect; and 4) the violation of the statute was the proximate cause of injury.

In asserting negligence per se, the plaintiff relied solely on HIPAA to meet the required elements of the claim. Defendant moved to dismiss this claim in federal court on the basis that HIPAA does not create a private cause of action. However, plaintiff contended that its reference to HIPAA in its negligence per se action was merely to establish the legal duty of care rather than a means to find a private cause of action under HIPAA, and that the case should be remanded to state court as it is not a matter of federal subject matter jurisdiction. Ultimately, the court agreed and declined to dismiss the negligence per se claim, although it did remand the case to state court.

The Washington University case is not the first case to hold that HIPAA may be referenced as a basis for a state law claim. For example, in Acosta v. Byrum, 638 S.E. 2d. 246, 253 (N.C. Ct. App. 2006), the North Carolina Court of Appeals allowed a plaintiff to make an intentional infliction of emotional distress claim against a psychiatrist by relying on HIPAA. In that case, the psychiatrist allegedly allowed an office manager to have access to medical records that were used to cause harm to the patient. The plaintiff used HIPAA to establish the standard of care element required in a claim for negligence. The trial court dismissed the claim stating that HIPAA does not create a private cause of action. However, the appeals court reversed, not because HIPAA creates a private cause of action, but because the court found it appropriate to use HIPAA as establishing a standard of care in making claims that the defendant violated a standard of care.

The cases above illustrate the interplay between HIPAA and state law and open the doors to future lawsuits where plaintiffs use HIPAA as a basis for private claims. The risks of such private causes of action are only expected to increase, particularly with the expanded duties that will be laid out in the forthcoming final regulations to HIPAA, which are being modified by the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act. These final regulations will contain provisions that update HIPAA and extend yet-to-be-finalized health data privacy and security rules to healthcare entities, including funding for heightened HIPAA enforcement...


How a Broken Medical System Killed Google Health
Google would have had to fix a balkanized U.S. health-care system to make the service catch on.
WEDNESDAY, JUNE 29, 2011 BY DAVID TALBOT, MIT Technology Review
At the end of this year, Google Health will flatline. The service couldn't encourage many people to import or analyze their health data, and experts say its untimely death is, in many ways, an extension of U.S. health-care providers' failure to share data across institutions, or make it easy for patients to obtain it.

Google's free online service lets people upload, store, analyze, and share their health information. But there are hundreds of different health-care institutions in the U.S. that use different systems to record and store data, and many doctors don't use electronic records at all, making the task of retrieving and updating data extremely difficult for the average person, says Isaac Kohane, who directs the informatics program at Children's Hospital in Boston, and codirects Harvard Medical School's Center for Biomedical Informatics.

For Google to make its service attractive, it would have had to solve this health IT mess, which is in the early stages of being addressed through recent national policy moves. These include 2009 federal stimulus incentives for doctors and hospitals to adopt electronic medical records, and for hospitals to share data with one another.

Kohane says it will be at least five years before data flows smoothly enough to make something like Google Health worthwhile. "Google is unwilling, for perfectly good business reasons, to engage in block-by-block market solutions to health-care institutions one by one," Kohane says, "and expecting patients to actually do data entry is not a scalable and workable solution."...


Saturday, June 4, 2011

Meaningful Use ePHI Privacy and Security compliance criteria

CORE MEASURE OBJECTIVE: Protect electronic health information (ePHI) created or maintained by the certified EHR technology through the implementation of appropriate technical capabilities.

ASSOCIATED MEASURE: Conduct or review a security risk analysis per 45 CFR 164.308(a)(1) of the certified EHR technology, and implement security updates and correct identified security deficiencies as part of its risk management process.

METHOD OF MEASURE CALCULATION: Measure requires only a Yes/No attestation.

THRESHOLD: Conduct one security risk assessment.

EXCLUSION: None.

I recently accompanied one of my REC colleagues on a "Meaningful Use training session" at one of our outpatient solo doc client sites. The vendor support rep (I won't name the product) blithely made a couple of serious misstatements of fact as he walked the physician through the core and menu set criteria. First, he stated in error that the doctor could simply export a patient CCD/CCR document (basically an XML dump of a patient record) to a PDF file and send it on to another provider as an email attachment in order to satisfy the "perform at least one test of certified EHR technology's capacity to electronically exchange key clinical information" criterion.

I had to point out that any live patient data would have to first be appropriately encrypted prior to any transmission. Failure to do so would constitute a HIPAA violation, the sanctions for which are now quite severe.
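By way of illustration only -- not an endorsement of email as a transport, and leaving key management entirely aside -- here is roughly what "encrypt before transmission" means in practice, sketched with the third-party Python `cryptography` package. The CCD stand-in bytes are hypothetical:

```python
# Sketch only: symmetric, authenticated encryption of an exported CCD/CCR
# document before it leaves the practice, using the third-party
# "cryptography" package. How the key is stored and exchanged (and whether
# email is an acceptable channel at all) is out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, managed and exchanged securely
fernet = Fernet(key)

# Hypothetical stand-in for a real CCD/CCR XML export:
ccd_bytes = b"<ClinicalDocument>...</ClinicalDocument>"

ciphertext = fernet.encrypt(ccd_bytes)   # this, not the plaintext, gets sent

# The recipient, holding the same key, recovers the document:
assert fernet.decrypt(ciphertext) == ccd_bytes
```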

His second assertion was equally misinformed: he claimed the doctor was automatically in compliance with the "Protect electronic health information" criterion cited at the outset of this post -- simply by virtue of using the vendor's ONC CHPL certified release of this particular EHR platform.

Nothing could be further from the truth. He was irrelevantly referring to the NIST EHR certification standards comprising 45 CFR §170.302(o) through §170.302(w):
  • Access control;
  • Emergency access;
  • Automatic log-off;
  • Audit log;
  • Integrity;
  • Authentication;
  • General encryption;
  • Encryption when exchanging electronic health information;
  • Accounting of disclosures (optional),
all of which simply address the functional capability of an EHR platform to comply with the foregoing.

And none of which has anything to do with the Meaningful Use core requirement to "conduct or review a security risk analysis" during the Attestation period.


To be useful for significant and sustained improvements in individual and population health, health care data must be available in timely fashion to those with the requisite authority (hence the security requirement) and need to access and act upon them. Moreover, in that regard we must also be assured of the integrity of the data – i.e., we must have ongoing confidence that the data transmitted electronically (especially those with direct bearing on clinical decision-making) are exactly** those acquired and stored at the source.
** It is tangentially noteworthy that the Meaningful Use ePHI standard does not address source data “accuracy” per se (i.e., the GIGO risk – “Garbage In, Garbage Out”), regarding which the venue for patient redress (there being no affirmative, proactive provider onus) is accorded at 45 CFR 164.526: Amendment of protected health information. (a) Standard: Right to amend. (1) Right to amend. An individual has the right to have a covered entity amend protected health information or a record about the individual in a designated record set for as long as the protected health information is maintained in the designated record set. This is essentially akin to consumers’ remedies for inaccuracies that come to light in credit reporting agencies’ data.

This Meaningful Use (MU) criterion explicitly focuses on three ePHI areas:
  1. Access;
  2. Storage, and;
  3. Transmission.
And, with respect to the foregoing, the Stage 1 MU compliance standard requires attestation of documentable demonstration of implementation of reasonable and sufficient
  1. administrative safeguards;
  2. technical safeguards, and;
  3. physical safeguards
of ePHI in order to ensure that the potentially conflicting ends of patient data availability, security, and integrity are consistently resolved in documentable fashion.
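To make the documentation burden concrete, here is a purely hypothetical sketch of how a practice might track risk-analysis findings across the three safeguard categories. Every field name and example entry below is my own illustration, not anything mandated by the regulation:

```python
# Purely illustrative: a minimal structure a small practice might use to
# document security risk analysis findings across the three HIPAA safeguard
# categories. Field names and example entries are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskFinding:
    safeguard: str      # "administrative" | "technical" | "physical"
    citation: str       # e.g., a 45 CFR 164.3xx reference
    required: bool      # False means "addressable" -- which is NOT "optional"
    finding: str
    remediation: str
    reviewed: date

findings = [
    RiskFinding("technical", "45 CFR 164.312(a)(2)(iv)", False,
                "Laptops storing ePHI are unencrypted",
                "Deploy full-disk encryption", date(2011, 6, 1)),
    RiskFinding("physical", "45 CFR 164.310(a)(1)", True,
                "Server closet is unlocked",
                "Install keyed lock; restrict access", date(2011, 6, 1)),
]

# Attestation support: even "addressable" items need a documented
# disposition, and the documentation must be retained for six years.
assert all(f.remediation for f in findings)
```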

Returning to the language of this Meaningful Use ePHI compliance specification:
ASSOCIATED MEASURE: Conduct or review a security risk analysis per 45 CFR 164.308(a)(1) of the certified EHR technology, and implement security updates and correct identified security deficiencies as part of its risk management process.
And then from the cited “per 45 CFR” itself,
§ 164.308 Administrative safeguards.

(a) A covered entity must, in accordance with §164.306:
(1)(i) Standard: Security management process. Implement policies and procedures to prevent, detect, contain, and correct security violations.
We ought to note that “accordance with §164.306” obligates the attestee to have adequately addressed all of the subsectional provisions of 45 CFR 164.3nn (the breadth of Subpart C):
§164.306(c) Standards. A covered entity must comply with the standards as provided in this section and in §164.308, §164.310, §164.312, §164.314, and §164.316 with respect to all electronic protected health information.
These mandate, in the aggregate, effective [1] administrative safeguards, [2] technical safeguards, and [3] physical safeguards – the gamut. Moreover, activities undertaken pursuant to compliance with this criterion must be sufficiently documented:
§ 164.316 Policies and procedures and documentation requirements.

A covered entity must, in accordance with §164.306:

(a) Standard: Policies and procedures. Implement reasonable and appropriate policies and procedures to comply with the standards, implementation specifications, or other requirements of this subpart, taking into account those factors specified in §164.306(b)(2)(i), (ii), (iii), and (iv). This standard is not to be construed to permit or excuse an action that violates any other standard, implementation specification, or other requirements of this subpart. A covered entity may change its policies and procedures at any time, provided that the changes are documented and are implemented in accordance with this subpart.
There is a six-year records retention requirement associated with the measure [§164.316(b)(2)(i)]. Also noteworthy is that, while the various compliance specifications are denoted as either “required” or “addressable” (highlighted in the Subpart C appendix below), the latter are not to be construed as “optional.” While the measure speaks to “flexibility” and “reasonableness” with regard to implementation latitude, such are not to be construed as license for a la carte non-documentation.

See, e.g., §164.306(d)(1) through (d)(3).

In light of recent national media attention regarding ePHI concerns, it is not advisable to take the foregoing lightly. For example:
HHS Inspector General reports highlight IT security gaps in health care

May 26, 2011,
Baker & Hostetler LLP

On May 16, the Office of Inspector General (OIG) of the Department of Health and Human Services (HHS) issued two reports critical of the government’s efforts to build and enforce a federal information security framework for protecting individuals’ electronic protected health information (ePHI). Of particular interest to health care providers and health plans, these reports signal that heightened enforcement efforts appear likely in the future, making information security a top priority when developing and operating interoperable health care information technology (HIT).

The first OIG report, which assessed the Centers for Medicare and Medicaid Services’ (CMS’) and Office for Civil Rights’ (OCR’s) oversight of the Security Standards under the Health Insurance Portability and Accountability Act of 1996 (HIPAA), found shortcomings in hospital information security implementation, and criticized a perceived lack of effective oversight of such Security Standards by CMS and OCR.

The OIG audit examined information security systems at seven large hospitals located in several states. The report found 151 security vulnerabilities, ranging from insufficient password strength and unencrypted laptops containing ePHI, to lack of physical protections (e.g., locks) for computer storage rooms, inadequate encryption methods, and incomplete policies and procedures to address audit controls, backup plans and disaster contingencies. The majority of findings were rated as “high impact”, which means posing a significant risk of harm to the individuals whose ePHI was transmitted or stored in such facilities. The report concluded that the OCR needs to significantly improve oversight and enforcement of data security under HIPAA, including continuation of the compliance oversight reviews of covered entities begun in 2009 at the direction of CMS. The OIG report also referred to exercise of the specific HIPAA enforcement measures and larger penalties enacted under the 2009 American Recovery and Reinvestment Act’s Health Information Technology for Economic and Clinical Health Act (HITECH) provisions.

The second OIG report criticized the Office of the National Coordinator for Health Information Technology (ONC), the agency created under ARRA/HITECH to administer and oversee federal incentives for the adoption and meaningful use of interoperable electronic health records (EHRs), and other related national HIT initiatives. That report found that the ONC failed to incorporate general information security requirements in the measures required for certified EHRs under HITECH. While certain application security controls were included in the HIT standards, the OIG found that general security requirements for the overall security structure, policies and procedures to be specifically applied to EHR systems were lacking.

In light of these OIG reports, and of ongoing news of misappropriation of patients’ health information and wide-scale security breaches, health care providers and health plans should consider reassessing their security risk exposure and preparedness to address information security lapses and HIPAA enforcement likely to be at the forefront of the national HIT trend.

More recent news:

[May 26th] Despite spending a lot of time making sure they are compliant with federal and state regulations, health care organizations claim they are still seeing a lot of data breaches.

Being regulatory-compliant does not necessarily reduce the chances of a data breach, at least for the health care industry, according to a new study. Even more worrisome, organizations appear to be focusing more on compliance and less on security.

About 56 percent of IT security professionals in the health care industry said they spend the majority of their time addressing compliance requirements, according to the results of a GlobalSign survey released May 26. Even so, 34 percent of the health care industry IT security professionals polled said their organizations experienced a patient-records data breach within the past two years.

The survey “validates” the fact that health care organizations are putting in the effort to comply with HIPAA (the Health Insurance Portability and Accountability Act), the HITECH (Health Information Technology for Economic and Clinical Health) Act, and other state and federal regulations, according to Lila Kee, chief product officer at GlobalSign. However, it also demonstrated that “checking the boxes on compliance audits” will not ensure security or privacy when it comes to sensitive data, Kee said...

I visited with another solo practice physician recently for an initial REC assessment. When I asked about written HIPAA-compliant policies and procedures, I got that too-familiar deer-in-the-headlights look. With all the recent contentious ePHI news, I don't think this is a particularly good time for providers to be ignoring or paying lip service to the breadth of HIT privacy and security concerns reflected in the foregoing Meaningful Use criterion.


We at the REC have come to simply reiterate the phrase "privacy and security."

But, "privacy" per se is addressed specifically at 45 CFR 164.5nn --
Subpart E—Privacy of Individually Identifiable Health Information, which is about things such as
  • patient data access;
  • consents;
  • disclosure;
  • right of correction of erroneous PHI;
  • records retention requirements.
The Meaningful Use measure recounted at the top of this post is really about ePHI "security" -- the "privacy" part appears to exceed our portfolio.

But, without PHI security there can be no patient "privacy."


Mucking around in SmartDraw one recent afternoon.