Search the KHIT Blog

Saturday, March 28, 2015

"The only person who enjoys change is a baby with a wet diaper."

I first heard that quip, offered up by Brent James, MD, nearly 21 years ago during Healthcare QI training at Intermountain Health Care. So true.

i.e., "don't just do something; stand there!"

Good stuff, as always, from Dr. Aaron Carroll.

Friday, March 27, 2015

Fear of HIT? "Get over it!"

The medical profession needs to get over its fear of information technology

Continued objections to Electronic Health Records (EHR) by sections of the physician community are bogus. They arise from past entitlements and a lack of accountability.

Read on...

That will not go unchallenged, no doubt vitriolically so in many quarters.

But, on one point,
Quality of treatment can improve significantly: When a complete medical record is available about a patient, including details of visits to multiple healthcare professionals, the quality of diagnosis and hence treatment decisions should improve greatly. This improves patient safety and reduces medical errors, since everyone has access to the same set of data.
I don't see how anyone can rationally disagree with that, in general. Of course, once we get past what I irascibly call "Interoperababble."

Now, "fear of information technology" is surely a deliberately inflammatory bit of clickbait headline work. More likely is "antipathy" borne of feeling put-upon by clinically ignorant Health IT policy geeks and regulators more broadly.

When I'm looking for astute information and analysis on the issues, I always turn to Dr. Jerome Carter's "EHR Science" blog. To wit:
Understanding How Physicians Use EHR Systems

Jerome Carter, MD, March 23, 2015 
Technology usage patterns differ from person to person. Any number of factors could account for the differences such as varying needs for specific features, lack of familiarity, or being unaware of available functions. Two recent studies that looked at physicians’ EHR use may offer a much more nuanced and interesting take on this topic.
Like anyone who has practiced in a setting that used paper charts, I noticed specific information management patterns among my colleagues. Some kept meticulous notes that read like novels, while others created notes that were so sparse they were nearly unusable. The same held true for things like preventive maintenance. Some charts had documentation that was clear and precise, while in other charts the only way to tell if a mammogram had been done was by slogging through radiology reports. Why such a wide variation? Do disparities in recordkeeping have anything to say about the quality of care rendered?...
...I was not surprised that those who valued information kept better records and used EHR systems to a greater degree. I wonder how useful this information is for understanding clinical care issues such as diagnostic errors and results management. Does uncertainty absorption coupled with low EHR use point to a lower level of vigilance and lower quality care?...

Some people will find this merely further evidence of the prevailing anti-clinical workflow priority. A long-standing complaint regarding EHRs is that they are primarily billing platforms. e.g., Dr. Carter:
EHR system-clinical work impedance
The automation of clinical care with current EHR systems has resulted in numerous complaints from clinical professionals who are fed up and discouraged by systems that make their jobs harder to do. The number of workflow disruptions that occur as a result of EHR use should surprise no one. Disruptions were to be expected because EHR systems are archival systems that do not contain models of clinical work. Making matters worse is the fact that EHR systems have their own internal workflows. Consequently, a good portion of EHR training is spent helping EHR users learn to adapt their workflows to those of the software. Thus, training times are one hint of impending EHR system-clinical work impedance and attendant clinician misery.
From Health Affairs:
The Final Stage Of Meaningful Use Rules: Will EHRs Finally Pay Off?
by Ashish Jha

Six years ago, President Obama signed into law the HITECH Act, which spelled out a path to a nationwide health information technology infrastructure. The goal was simple:  every doctor, nurse, and hospital in America should use electronic health records — and do it in a way that leads to better care delivered more efficiently. The Act provided $30 billion in incentives for providers and hospitals who met the criteria for “Meaningful Use”, which the Obama administration was given the authority to define. The rules were set up to be rolled out in three stages, and while the first two stages have been out for a while, the criteria for the third and final stage of Meaningful Use (MU) were finally released on March 20...

Opening up closed EHR systems...

The big deal in the stage 3 meaningful use rules is data flow.  Here, I think federal policymakers are helping to fix the big problems with EHRs, though they could go further.  The current EHR vendors have prioritized integration with legacy systems and complex, secure systems over ease of use and support for better care.  That’s a problem.  Most of these systems are closed, making it difficult to use 3rd party vendors to improve provider experience or share data with others...

However, by forcing EHRs to allow for sharing of data with patients, and by pushing EHRs to incorporate patient-generated data, the new proposed rule will begin to create leaks in these closed systems. And that’s a helpful start. As the data in the EHR begins to be able to break free, third party vendors will build better tools that engage patients in their care.  Requiring EHRs to incorporate data generated by patients will push the industry towards greater standardization.

…but not quickly enough.

While these are helpful steps, they may not be enough.  If we are serious about addressing EHR’s poor usability and inability to support the kind of care we are increasingly demanding, then we need to open up the EHR systems in a more robust way.  As part of certification, the Office of the National Coordinator could require that all EHRs publish their full application-program interfaces (APIs). The proposed rule begins to do that, but only as it relates to sharing information with patients.  This is not enough. ONC should require that any vendor that enjoys federal subsidies for its products make its full suite of APIs widely available for third party products.

This may sound like a technical issue, but it’s a critically important one. If these APIs become widely available, third party vendors will build the tools that currently limit EHR utility and value. Hate the way your EHR does clinical documentation?  Use the one just developed by a new vendor down the street. That kind of competition will make everyone better.

If you were locked into using the Apple map forever, they would have little incentive to improve it. That’s how the world works – and to improve EHRs, we need the kind of competitive pressure created by open ecosystems. Stage 3 meaningful use rules move us one step towards that goal, and that’s a good thing. But given how long the journey is between EHR adoption and better care, we could surely move faster...
[The] Office of the National Coordinator could require that all EHRs publish their full application-program interfaces (APIs). The proposed rule begins to do that, but only as it relates to sharing information with patients.  This is not enough. ONC should require that any vendor that enjoys federal subsidies for its products make its full suite of APIs widely available for third party products.
That's about as close as we'll get to my "Data Dictionary Standard" argument. Maybe it will suffice.
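To make the open-API argument concrete: if an EHR exposed even a simple, documented read interface, a third-party documentation tool could consume it in a few lines. Below is a minimal sketch in Python; the JSON shape is loosely modeled on a FHIR-style Patient resource, and every field name and value here is an illustrative assumption, not any vendor's actual API.

```python
import json

# Illustrative payload, loosely shaped like a FHIR Patient resource.
# Field names and values are assumptions for this sketch only.
payload = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Doe", "given": ["Jane"]}],
  "birthDate": "1960-04-12"
}
"""

def summarize_patient(raw: str) -> str:
    """Turn a Patient-style JSON record into a one-line summary."""
    patient = json.loads(raw)
    name = patient["name"][0]
    full_name = " ".join(name["given"]) + " " + name["family"]
    return f"{full_name} (b. {patient['birthDate']}, id {patient['id']})"

print(summarize_patient(payload))
```

The point is not the dozen lines of parsing; it is that once data can leave the system in a documented format, a competing third-party tool can be built against it, which is exactly the competitive pressure Jha is arguing for.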

More to come...

Tuesday, March 24, 2015

"We value the trust and confidence that you place with BCBSRI..."

Yeah, sure you do. But, not enough to secure our personal health-related information.

A form letter they sent my wife in the wake of the recent Anthem breach:
February 27, 2015

Dear current or former plan participant,

I’m writing with an important update about the security of your personal information.

Anthem, Inc. (Anthem), which owns independent Blue Cross and Blue Shield companies, was recently the victim of a cyber attack where current and former health plan participant information was illegally accessed. Unfortunately, Anthem's investigation indicates some of your personal information has likely been affected. In the coming weeks you can expect to hear from Anthem with information on the protections they are providing, including identity repair services (no enrollment is required) and credit monitoring.

Why does Anthem have my information?

In the past 10 years, you likely received healthcare services in an area served by Anthem's Blue Cross Blue Shield companies. 37 independent, locally operated companies across the United States — including Anthem and Blue Cross and Blue Shield of Rhode Island (BCBSRI) — form the Blue Cross Blue Shield system. This enables Blue Cross Blue Shield customers to get high quality, affordable health care wherever they are.

What information may be affected?

Please know that the information affected did not include credit card or medical information. However, it may include any of the following data (which may date back to 2004):

  • Name
  • Date of birth
  • Blue Cross member ID number
  • Mailing address
  • Email address 
We’re here for you.

We value the trust and confidence that you place with BCBSRI and blue plans across the country, and we are deeply sorry for the situation. To help you learn more we’ve included some frequently asked questions on the reverse side of this letter...


Melissa Cummings
Senior Vice President and Chief Customer Officer,
(401) 459-5756,

We have never directly been Anthem customers. But we get our health coverage through my wife's employer via Blue Cross/Blue Shield of Rhode Island.

So much for "independent companies." If parent company Anthem has no need of our information for "treatment, payment, or health care operations" (the Covered Entity criteria, which in this case devolve to BCBSRI specifically), what the hell are they doing with our data? And is the data they list as "may be included" in fact the full extent of the filched information?

I'm not going to take that assertion at face value.

We know what they're doing with people's data originating in subsidiary entities. It's called Big Data Analytics "business intelligence."

"We value the trust and confidence you place with BCBSRI..." What crap. I had no choice in the matter. The "independent" Anthem subsidiary is what is offered at my wife's company. Take it or leave it.

I've posted this everywhere. A LinkedIn comment exchange:

The mounting toll of all of the reports of trafficking in others' personal data -- from cavalierly legal to brazenly illicit -- is starting to wear on me. Recall my recent post "Your use of the Services constitutes your agreement to the Privacy Policy." Tangentially apropos, see also my post "(404)^n, the upshot of dirty data."

Anthem, the second largest health insurance provider in the United States, revealed [recently] that its records have been compromised by hackers—resulting in the possible leaking of names, birthdays, addresses, Social Security numbers, and employment data for up to 80 million present and former customers.

Although no medical information appears to have been stolen, with the exception of customers’ medical identification numbers, the attack is being viewed as a much-needed wake-up call for the health industry.

"Cybercriminals do view health care organizations as a soft target," says Lynne Dunbrack, research VP for IDC Health Insights. "They classically have not invested too heavily in information-technology in general, and specifically in security. Going hand in hand with that is the value of medical information on the black market, which has long since exceeded the value of personal identifiers for financial data. To give you an idea, financial records may fetch just a couple of dollars, whereas medical information routinely sells for $200. That’s a real incentive for cybercriminals."

Worries about the security of health care data are a growing issue—accompanying the increasing digitization of medical records, combined with the still more recent shift toward cloud-based record holding. Anthem's mess is far from the only recent example of troubling privacy concerns regarding health data. In 2010, the Coalition Against Insurance Fraud reported that 1.4 million Americans were victims of medical identity theft, representing a significant increase from the 500,000 one year earlier.

Stolen medical data can be particularly problematic for consumers. Whereas credit-card fraud may be corrected in a relatively straightforward manner, it can be tougher to identify that medical data has been breached. Maximum insurance payout limits may be reached as a result of fraudulent claims, and this might only be discovered when a consumer's claims for legitimate services are denied.

Worse, consumers' medical records could become compromised with falsified diagnoses or procedure codes following data-theft incidents. In a worst-case scenario, vital information related to allergies or blood type could be compromised, with the wrong drugs or blood products administered to a patient as a result.

Others have expressed concern about what appears to be the misuse of private medical data. Last month, Ricardo Alonso-Zaldivar and Jack Gillum of the Associated Press reported that [the federal health insurance site] has shared user data—possibly including information about age, income, and whether or not a person is pregnant—with tech companies such as Google, Twitter, and Facebook. Although there is no evidence that this data has been misused, it is still likely that this will irk some individuals....
Apropos, my latest read:

Excerpting just the cites going to medical data:
We’re starting to collect and analyze data about our bodies as a means of improving our health and well-being. If you wear a fitness tracking device like Fitbit or Jawbone, it collects information about your movements awake and asleep, and uses that to analyze both your exercise and sleep habits. It can determine when you’re having sex. Give the device more information about yourself— how much you weigh, what you eat— and you can learn even more. All of this data you share is available online, of course.

Many medical devices are starting to be Internet-enabled, collecting and reporting a variety of biometric data. There are already— or will be soon— devices that continually measure our vital signs, our moods, and our brain activity. It’s not just specialized devices; current smartphones have some pretty sensitive motion sensors. As the price of DNA sequencing continues to drop, more of us are signing up to generate and analyze our own genetic data. Companies like 23andMe hope to use genomic data from their customers to find genes associated with disease, leading to new and highly profitable cures. They’re also talking about personalized marketing, and insurance companies may someday buy their data to make business decisions.

Schneier, Bruce (2015-03-02). Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (p. 16). W. W. Norton & Company. Kindle Edition.

The general practice of amassing and saving all kinds of data is called “big data,” and the science and engineering of extracting useful information from it is called “data mining.” Companies like Target mine data to focus their advertising. Barack Obama mined data extensively in his 2008 and 2012 presidential campaigns for the same purpose. Auto companies mine the data from your car to design better cars; municipalities mine data from roadside sensors to understand driving conditions. Our genetic data is mined for all sorts of medical research. Companies like Facebook and Twitter mine our data for advertising purposes, and have allowed academics to mine their data for social research.

Most of these are secondary uses of the data. That is, they are not the reason the data was collected in the first place. In fact, that’s the basic promise of big data: save everything you can, and someday you’ll be able to figure out some use for it all.

Big data sets derive value, in part, from the inferences that can be made from them. Some of these are obvious. If you have someone’s detailed location data over the course of a year, you can infer what his favorite restaurants are. If you have the list of people he calls and e-mails, you can infer who his friends are. If you have the list of Internet sites he visits— or maybe a list of books he’s purchased— you can infer his interests.

Some inferences are more subtle. A list of someone’s grocery purchases might imply her ethnicity. Or her age and gender, and possibly religion. Or her medical history and drinking habits. Marketers are constantly looking for patterns that indicate someone is about to do something expensive, like get married, go on vacation, buy a home, have a child, and so on. Police in various countries use these patterns as evidence, either in a court or in secret. Facebook can predict race, personality, sexual orientation, political ideology, relationship status, and drug use on the basis of Like clicks alone. The company knows you’re engaged before you announce it, and gay before you come out— and its postings may reveal that to other people without your knowledge or permission. Depending on the country you live in, that could merely be a major personal embarrassment— or it could get you killed.

There are a lot of errors in these inferences, as all of us who’ve seen Internet ads that are only vaguely interesting can attest. But when the ads are on track, they can be eerily creepy— and we often don’t like it. It’s one thing to see ads for hemorrhoid suppositories or services to help you find a girlfriend on television, where we know they’re being seen by everyone. But when we know they’re targeted at us specifically, based on what we’ve posted or liked on the Internet, it can feel much more invasive. This makes for an interesting tension: data we’re willing to share can imply conclusions that we don’t want to share. Many of us are happy to tell Target our buying patterns for discounts and notifications of new products we might like to buy, but most of us don’t want Target to figure out that we’re pregnant. We also don’t want the large data thefts and fraud that inevitably accompany these large databases. [ibid, pp. 33-34]

Once you can correlate different data sets, there is a lot you can do with them. Imagine building up a picture of someone’s health without ever looking at his patient records. Credit card records and supermarket affinity cards reveal what food and alcohol he buys, which restaurants he eats at, whether he has a gym membership, and what nonprescription items he buys at a pharmacy. His phone reveals how often he goes to that gym, and his activity tracker reveals his activity level when he’s there. Data from websites reveal what medical terms he’s searched on. This is how a company like ExactData can sell lists of people who date online, people who gamble, and people who suffer from anxiety, incontinence, or erectile dysfunction. [ibid, pp. 41-42]
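Schneier's correlation point is mechanically trivial, which is part of why it is so potent. Here is a toy sketch: two synthetic "databases" that never mention health, joined on a shared loyalty-card number. All of the data and field names below are made up for illustration; the technique (joining on a shared quasi-identifier) is the general one he describes.

```python
# Two unrelated-looking synthetic datasets, keyed by the same loyalty-card ID.
pharmacy_purchases = {
    "card-881": ["glucose test strips", "lancets"],
    "card-452": ["sunscreen", "vitamins"],
}
gym_checkins = {"card-881": 2, "card-452": 14}  # visits last month

def build_profiles(purchases, checkins):
    """Join the two datasets into per-person profiles."""
    return {
        card: {"purchases": items, "gym_visits": checkins.get(card, 0)}
        for card, items in purchases.items()
    }

profiles = build_profiles(pharmacy_purchases, gym_checkins)
# card-881 now looks like a sedentary diabetic, with no medical record involved.
print(profiles["card-881"])
```

The join key need not be a card number; any stable quasi-identifier (a phone number, an email hash, a device ID) does the same work, which is exactly what makes the "picture of someone's health without ever looking at his patient records" possible.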

Google, with its database of users’ Internet searches, could de-anonymize a public database of Internet purchases, or zero in on searches of medical terms to de-anonymize a public health database. Merchants who maintain detailed customer and purchase information could use their data to partially de-anonymize any large search engine’s search data. A data broker holding databases of several companies might be able to de-anonymize most of the records in those databases.

Researchers have been able to identify people from their anonymous DNA by comparing the data with information from genealogy sites and other sources. Even something like Alfred Kinsey’s sex research data from the 1930s and 1940s isn’t safe. Kinsey took great pains to preserve the anonymity of his subjects, but in 2013, researcher Raquel Hill was able to identify 97% of them.

It’s counterintuitive, but it takes less data to uniquely identify us than we think. Even though we’re all pretty typical, we’re nonetheless distinctive. It turns out that if you eliminate the top 100 movies everyone watches, our movie-watching habits are all pretty individual. This is also true for our book-reading habits, our Internet-shopping habits, our telephone habits, and our web-searching habits. We can be uniquely identified by our relationships. It’s quite obvious that you can be uniquely identified by your location data. With 24/7 location data from your cell phone, your name can be uncovered without too much trouble. You don’t even need all that data; 95% of Americans can be identified by name from just four time/date/location points. [ibid, p. 44]
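The "four points" claim is easy to reproduce on synthetic data. The sketch below invents 10,000 random daily traces over 50 coarse location cells (made-up data, not the study's): one observation leaves hundreds of candidates, while four observations almost always narrow the field to a single trace.

```python
import random

random.seed(0)
CELLS = [f"cell-{i}" for i in range(50)]   # a coarse location grid
PEOPLE = 10_000

# One synthetic day per person: a location cell for each of 24 hourly slots.
traces = [[random.choice(CELLS) for _ in range(24)] for _ in range(PEOPLE)]

def candidates(observations):
    """Return the traces consistent with every (hour, cell) observation."""
    return [t for t in traces if all(t[h] == c for h, c in observations)]

target = traces[42]                             # the person to re-identify
obs = [(h, target[h]) for h in (3, 9, 15, 21)]  # four time/place points

print(len(candidates(obs[:1])))  # one point: still hundreds of candidates
print(len(candidates(obs)))      # four points: typically just the target
```

With 50 cells, the chance that a second random trace matches all four observed points is about one in six million, so four sightings are nearly always identifying even in a population of ten thousand.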

Companies quickly realized that they could set their own cookies on pages belonging to other sites—with their permission and by paying for the privilege— and the third-party cookie was born. Enterprises like DoubleClick (purchased by Google in 2007) started tracking web users across many different sites. This is when ads started following you around the web. Research a particular car or vacation destination or medical condition, and for weeks you’ll see ads for that car or city or a related pharmaceutical on every commercial Internet site you visit.

This has evolved into a shockingly extensive, robust, and profitable surveillance architecture. You are being tracked pretty much everywhere you go on the Internet, by many companies and data brokers: ten different companies on one site, a dozen on another. Facebook tracks you on every site with a Facebook Like button (whether you’re logged in to Facebook or not), and Google tracks you on every site that has a Google Plus + 1 button or that simply uses Google Analytics to monitor its own web traffic.

Most of the companies tracking you have names you’ve never heard of: Rubicon Project, AdSonar, Quantcast, Pulse 360, Undertone, Traffic Marketplace. If you want to see who’s tracking you, install one of the browser plugins that let you monitor cookies. I guarantee you will be startled. One reporter discovered that 105 different companies tracked his Internet use during one 36-hour period. In 2010, one seemingly innocuous site installed over 200 tracking cookies on your browser when you visited.

It’s no different on your smartphone. The apps there track you as well. They track your location, and sometimes download your address book, calendar, bookmarks, and search history. In 2013, the rapper Jay-Z and Samsung teamed up to offer people who downloaded an app the ability to hear the new Jay-Z album before release. The app required the ability to view all accounts on the phone, track the phone’s location, and track who the user was talking to on the phone. And the Angry Birds game even collects location data when you’re not playing.

Broadband companies like Comcast also conduct surveillance on their users. These days they’re mostly monitoring to see whether you illegally download copyrighted songs and videos, but other applications aren’t far behind. Verizon, Microsoft, and others are working on a set-top box that can monitor what’s going on in the room, and serve ads based on that information.

It’s less Big Brother, and more hundreds of tattletale little brothers. [ibid, pp. 48-49]

Remember the bargains I talked about in the Introduction. The government offers us this deal: if you let us have all of your data, we can protect you from crime and terrorism. It’s a rip-off. It doesn’t work. And it overemphasizes group security at the expense of individual security.

The bargain Google offers us is similar, and it’s similarly out of balance: if you let us have all of your data and give up your privacy, we will show you advertisements you want to see— and we’ll throw in free web search, e-mail, and all sorts of other services. Companies like Google and Facebook can only make that bargain when enough of us give up our privacy. The group can only benefit if enough individuals acquiesce.

Not all bargains pitting group interest against individual interest are such raw deals. The medical community is about to make a similar bargain with us: let us have all your health data, and we will use it to revolutionize healthcare and improve the lives of everyone. In this case, I think they have it right. I don’t think anyone can comprehend how much humanity will benefit from putting all of our health data in a single database and letting researchers access it. Certainly this data is incredibly personal, and is bound to find its way into unintended hands and be used for unintended purposes. But in this particular example, it seems obvious to me that the communal use of the data should take precedence. Others disagree. [ibid, pp. 235-236]

[M]ost of the cost of privacy breaches falls on the people whose data is exposed. In economics, this is known as an externality: an effect of a decision not borne by the decision maker. Externalities limit the incentive for companies to improve their security.

You might expect users to respond by favoring secure services over insecure ones— after all, they’re making their own buying decisions on the basis of the same market model. But that’s not generally possible. In some cases, software monopolies limit the available product choice. In other cases, the “lock-in effect” created by proprietary file formats, existing infrastructure, compatibility requirements, or software-as-a-service makes it harder to switch. In many cases, we don’t know who is collecting our data; recall the discussion of hidden surveillance in Chapter 2. In all cases, it’s hard for buyers to assess the security of any data service. And it’s not just nontechnical buyers; even I can’t tell you whether or not to entrust your privacy to any particular service provider.

Liabilities change this. By raising the cost of privacy breaches, we can make companies accept the costs of the externality and force them to expend more effort protecting the privacy of those whose data they have acquired. We’re already doing this in the US with healthcare data; privacy violations in that industry come with serious fines.

And it’s starting to happen here with data from stores, as well. Target is facing several lawsuits as a result of its 2013 breach. In other cases, banks are being sued for inadequately protecting the privacy of their customers. One way to help would be to require companies to inform users about all the information they possess that might have been compromised.

These cases can be complicated, with multiple companies involved in a particular incident, and apportioning liability will be hard. Courts have been reluctant to find a value in privacy, because people willingly give it away in exchange for so little. And because it is difficult to link harms from loss of privacy to specific actions that caused those harms, such cases have been very difficult to win. [ibid, pp. 193-195]
I've been a vocal privacy advocate since before grad school --  e.g., see here, here, here, and here. Feels like a losing battle much of the time.

Moreover, a new book that just came to my attention indicates that things may be about to get considerably worse.

Much of what we think we know about privacy, liberty, security, and threat is no longer true. Much of what we have been taught about what threatens us, about what protects us, and about the risks and benefits of state power versus individual empowerment is obsolete. In the conventional understanding, international security is a state-to-state affair; the relationships between privacy, liberty, and domestic order are matters between individuals and their governments; and civilian technologies in the hands of individuals have relatively little to do with the way we order either governance at home or international security. We built the state to mediate disputes among citizens and to protect them from outside attack. We gave it power to contend with other states and to ensure it could govern effectively. Because we feared that power, we imposed constraints on it. And we imagined its power and the security it was meant to provide as being in tension with the liberty we expected it to respect. We built walls around our countries with legal concepts such as jurisdiction. And for the most part, these intellectual, conceptual, and legal constructions have held up pretty well. Yes, we had to adjust in response to Al-Qaeda and other transnational nonstate actors. And yes, globalization has complicated the discussion. But the way we think about security— what it means, where it comes from, what threatens it, what protects it, and the relationship between individual and collective societal security— has remained remarkably stable. 

In what follows, we mean to persuade you that this way of thinking is now out of date. Indeed, we argue that our debate about the fabric of security and its governance is based on dated assumptions about a technological world that no longer exists. In our new world, you can pose a threat to the security of every state and person on the planet— and each can also threaten you. In our new world, individuals, companies , and small groups have remarkable capabilities either to protect others or to make them more vulnerable. In our new world, not only do privacy and security not generally conflict, but they are often largely the same thing. And in our new world, state power represents a critical line of defense for individual freedom and privacy, even as the state itself may be losing its ability to serve its purpose as the ultimate guarantor of security to its citizens. 

Driving this new environment is a mix of technological developments. There is the radical proliferation of both data about individuals and technologies of mass empowerment available to individuals. Notwithstanding Edward Snowden’s spectacular revelations about the National Security Agency (NSA), the state’s comparative advantage in collecting data, manipulating it, and exposing individuals to risks or protecting them from threats is actually eroding, as ever bigger companies occupying ever more powerful market positions take on data collection and analytics as their business cores. The miniaturization and automation of weapons further weakens national boundaries— as well as the front door to your house— as effective lines of defense. Biological research and biotechnology are progressing at an unprecedented pace, bringing great promise— and great danger— to human security all around the globe. 

Our new environment of highly distributed threats and defenses has already changed our lives, and it will change them more in the years to come. It will change our sense of privacy, of safety, and of danger. It will change our relationships with corporations, governments, and individuals whom we have never met. It will change the way we govern our collective security and how we manage our personal safety. And it may lead us to ask questions about how we organize ourselves politically at the local, national, and international levels. 

Today, each person needs to fear an exponentially higher number of people and entities than only a decade ago. The threats to your personal security now include not merely governments and corporations but also other individuals around the world: stalkers, identity thieves, scammers, spammers, frauds, competitors, and rivals— everyone and everything from the government of China to the NSA to Luis Mijangos. You can be attacked from anywhere— and by nearly anyone.

Wittes, Benjamin; Blum, Gabriella (2015-03-10). The Future of Violence: Robots and Germs, Hackers and Drones—Confronting A New Age of Threat (pp. 5-7). Basic Books. Kindle Edition.
I think it's time to start drinking.


Something I recently posted on my Facebook page:
Question for all of my many Facebook friends: Who owns facts about you (or rumors, or outright lies)? Can everything -- tangible and otherwise -- be "owned"? We all post "data" about ourselves here on FB. Do we give up "ownership" by doing so? Facebook (and all social media) is data-mining the crap out of all of us all the time, and selling the results to the highest bidders 24/7. Have we, via the ToS, given them blanket "license"?

It's a more difficult question than it might appear.

I would love to hear your opinions. What do we even mean by the word "own"? I usually first look to my trusty Black's Law Dictionary:

Owner. One who has the right to possess, use, and to convey something; a proprietor. See Ownership.

Ownership. The collection of rights allowing one to use and enjoy property, including the right to convey it to others. Ownership implies the right to possess a thing, regardless of any actual or constructive control. Ownership rights are general, permanent, and inheritable. Cf. POSSESSION; TITLE.

“Possession is the de facto exercise of a claim; ownership is the de jure recognition of one. A thing is owned by me when my claim to it is maintained by the will of the state as expressed in the law; it is possessed by me, when my claim to it is maintained by my own self-assertive will. Ownership is the guarantee of the law; possession is the guarantee of the facts. It is well to have both forms if possible; and indeed they normally coexist.”

John Salmond, Jurisprudence 311 (Glanville L. Williams ed., 10th edition 1947).

“Ownership does not always mean absolute dominion. The more an owner, for his advantage, opens up his property for use by the public in general, the more do his rights become circumscribed by the statutory and constitutional powers of those who use it.” Marsh v. Alabama, 326 U.S. 501, 506, 66 S. Ct. 276, 278 (1946) (Black, J.).

Garner, Black’s Law Dictionary, 7th Ed.


That is not dispositively helpful to me; it begs some trailing questions. Even Black's nowhere includes the phrase "exclusive right(s)."

Mostly I'm interested in this stuff for the implications going to my work in medical information technology (who owns your medical data?), but it goes well beyond that. Taylor Swift, for example, not only copyrights her songs, but she (through her lawyers) trademarks specific phrases within her lyrics, down to very fine detail. Use the phrase "Lucky 13"? You owe Taylor money. And they will come after you if you're doing anything commercial with the phrase.

I am 69 years old. 5'10", 180 lbs, blue eyes, left-handed. Married (35 years), two surviving kids. Who owns those facts? Are they simply "public domain"? Anyone can use them for whatever purpose they wish?

What about this: I go to a bar, have a couple of drinks, and some gumshoe swipes my glass just after I leave and has it assayed for my genome. Maybe there's a paternity suit in the wings. Who "owns" my genetic data?

What is legitimately "private"?

I can give you all kinds of difficult scenarios.

I would love to hear your thoughts. I respect all of your opinions.

Been thinking about this stuff a lot lately on my blog.

More to come...

A quick jaunt into The Stupid

What can one say? Is this a great country, or what? I know there's a semantic difference between "stupid" and "ignorant," but the two often overlap, and you have to wonder here.

Saturday, March 21, 2015

Meaningful Use Stage 3 proposed rule is now out

Stage 3 meaningful use proposed rule and certification criteria released
Aim to 'support the path to nationwide interoperability'

WASHINGTON | March 20, 2015, Healthcare IT News

The new Stage 3 meaningful use rules proposed by the Centers for Medicare & Medicaid Services seek to give providers more flexibility, simplify the program, drive interoperability among electronic health records and put the focus on improved patient outcomes.
  • The Stage 3 proposed rule can be read here.
  • New 2015 Edition IT certification criteria can be seen here.
CMS says the Stage 3 rules are meant to drive better-quality, more cost-effective and coordinated care by improving the way providers are paid and – crucially – bolstering better information sharing.

"The flow of information is fundamental to achieving a health system that delivers better care, smarter spending  and healthier people," said HHS Secretary Sylvia M. Burwell in a press statement. "The steps we are taking today will help to create more transparency on cost and quality information, bring electronic health information to inform care and decision making, and support population health."

CMS touts the proposed rule's "flexibility," and points to simplified requirements for providers that keep a focus on advanced use of EHRs – and scrap box-checking requirements that are no longer relevant.

Stage 3 "does three things: It helps simplify the meaningful use program, advances the use of health IT toward our vision for improving health delivery, and further aligns the program with other quality and value programs,” said Patrick Conway, MD, acting principal deputy administrator and chief medical officer at CMS, in a statement.

He added that, "in an effort to make reporting easier for health care providers, we will be proposing a new meaningful use reporting deadline soon."

Meanwhile, the Office of the National Coordinator for Health IT has proposed a new 2015 Edition Health IT Certification Criteria rule that aligns with the data exchange goals put forth in its Nationwide Interoperability Roadmap. The proposed rule includes new and updated IT functionality and provisions that support care improvement, cost reduction, and patient safety across the health system.

"The certification criteria we have proposed in the 2015 Edition will help achieve that vision through provisions that consider the range of health IT users and uses across the care continuum, including those focused on interoperable standards, data portability, improved transparency, privacy and security capabilities, and increased oversight," said National Coordinator Karen DeSalvo, MD, in a statement...
431 pages in the 2015 Certification Criteria. Some counts of keywords I find of immediate interest, randomly:
"Interoperability" -- 71
"API(s)" -- 65
"standards" -- 454
"meaningful use" -- 29
"innovation" -- 8
"market(s)" -- 5
"patient safety" -- 35
"workflow(s)" -- 17
"usability" -- 14
"certification" -- 1,000+
Counts of the same keywords in the 301 page MU3 Proposed Rule:
"Interoperability" -- 6
"API(s)" -- 75
"standards" -- 56
"meaningful use" -- 380
"innovation" -- 6
"market(s)" -- 9
"patient safety" -- 10
"workflow(s)" -- 18
"usability" -- 0
"certification" -- 78
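For anyone wanting to replicate these tallies, a minimal sketch in Python (the filename below is hypothetical, and the PDFs would first need to be extracted to plain text, e.g. with a tool like pdftotext):

```python
import re

def keyword_counts(text, keywords):
    """Case-insensitive counts of each keyword, singular or plural."""
    counts = {}
    for kw in keywords:
        # \b keeps "market" from matching inside, say, "marketing"
        pattern = r"\b" + re.escape(kw) + r"s?\b"
        counts[kw] = len(re.findall(pattern, text, flags=re.IGNORECASE))
    return counts

# Usage sketch -- "mu3-proposed-rule.txt" is a hypothetical filename:
# text = open("mu3-proposed-rule.txt").read()
# for kw, n in keyword_counts(text, ["interoperability", "API",
#                                    "meaningful use", "usability"]).items():
#     print(kw, "--", n)
```

Crude, to be sure (it won't catch hyphenation across PDF line breaks, for one), but good enough for the ballpark figures above.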
A lot to study. From the 2015 Cert document:
I. Executive Summary

A. Purpose of Regulatory Action

Building on past rulemakings, this proposed rule further identifies how health IT certification can support the establishment of an interoperable nationwide health information infrastructure. It reflects stakeholder feedback received through various outreach initiatives, including the regulatory process, and is designed to broadly support the health care continuum through the use of certified health IT. To achieve this goal, this rule proposes to:

  • Improve interoperability for specific purposes by adopting new and updated vocabulary and content standards for the structured recording and exchange of health information, including a Common Clinical Data Set composed primarily of data expressed using adopted standards; and rigorously testing an identified content exchange standard (Consolidated Clinical Document Architecture (C-CDA));
  • Facilitate the accessibility and exchange of data by including enhanced data portability, transitions of care, and application programming interface (API) capabilities in the 2015 Edition Base EHR definition;
  • Establish a framework that makes the ONC Health IT Certification Program open and accessible to more types of health IT, health IT that supports a variety of care and practice settings, various HHS programs, and public and private interests;
  • Support the Medicare and Medicaid EHR Incentive Programs (EHR Incentive Programs) through the adoption of a set of certification criteria that align with proposals for Stage 3;
  • Address health disparities by providing certification: to standards for the collection of social, psychological, and behavioral data; for the exchange of sensitive health information (Data Segmentation for Privacy); and for the accessibility of health IT;
  • Ensure all health IT presented for certification possess the relevant privacy and security capabilities;
  • Improve patient safety by: applying enhanced user-centered design principles to health IT, enhancing patient matching, requiring relevant patient information to be exchanged (e.g., Unique Device Identifiers), improving the surveillance of certified health IT, and making more information about certified products publicly available and accessible;
  • Increase the reliability and transparency of certified health IT through surveillance and disclosure requirements; and
  • Provide health IT developers with more flexibility and opportunities for certification that support both interoperability and innovation.
From the MU3 Proposed Rule Executive Summary:
In this proposed rule, we specify the policies that would be applicable for Stage 3 of the Medicare and Medicaid EHR Incentive Programs. Under Stage 3, we are proposing a set of requirements that EPs, eligible hospitals, and CAHs must achieve in order to meet meaningful use, qualify for incentive payments under the Medicare and Medicaid EHR Incentive Programs, and avoid downward payment adjustments under Medicare. These Stage 3 requirements focus on the advanced use of certified EHR technology (CEHRT) to promote health information exchange and improved outcomes for patients.
Stage 3 of meaningful use is expected to be the final stage and would incorporate portions of the prior stages into its requirements. In addition, following a proposed optional year in 2017, beginning in 2018 all providers would report on the same definition of meaningful use at the Stage 3 level regardless of their prior participation, moving all participants in the EHR Incentive Programs to a single stage of meaningful use in 2018. The incorporation of the requirements into one stage for all providers is intended to respond to stakeholder input regarding the complexity of the program, the success of certain measures which are part of the meaningful use program to date, and the need to set a long-term, sustainable foundation based on a consolidated set of key advanced use objectives for the Medicare and Medicaid EHR Incentive Programs.

In addition, we propose changes to the EHR reporting period, timelines, and structure of the Medicare and Medicaid EHR Incentive Programs. We believe these changes would provide a flexible, clear framework to reduce provider burden, streamline reporting, and ensure future sustainability of the Medicare and Medicaid EHR Incentive Programs. These changes together lay a foundation for our broader efforts to support interoperability and quality initiatives focused on improving patient outcomes.
A lot to study.

Mixed Industry Response to Stage 3 Meaningful Use Rule

The proposed Stage 3 Meaningful Use rule released March 20 received early mixed reviews from two stakeholder associations.

The rule, focusing on advanced use of electronic health records with fewer objectives, is meant to provide more flexibility and simplify requirements for providers, according to the Centers for Medicare and Medicaid Services. It specifies the criteria that eligible professionals, eligible hospitals, and critical access hospitals must meet to qualify for Medicare and Medicaid EHR incentive payments and avoid financial penalties under Medicare for Stage 3 of the EHR Incentive Programs.

The rule proposes changes to the reporting period, timelines, and structure of the program to provide a “flexible, clear framework to reduce provider burden, streamline reporting, and ensure future sustainability of the Medicare and Medicaid EHR Incentive Programs,” according to CMS.

Specifically, the rule would “continue to encourage electronic submission of clinical quality measure (CQM) data for all providers where feasible in 2017, propose to require the electronic submission of CQMs where feasible in 2018, and establish requirements to transition the program to a single stage for meaningful use.” In addition, the Stage 3 proposed rule “would also change the EHR reporting period so that all providers would report under a full calendar year timeline with a limited exception under the Medicaid EHR Incentive Program for providers demonstrating meaningful use for the first time.”

In a written statement, the College of Healthcare Information Management Executives said it is closely evaluating the CMS Meaningful Use rule. Based on an initial review, CHIME said the organization is “pleased” to see flexibility built into the Stage 3 proposed objectives...

However, with eligible hospitals and professionals continuing to struggle to attest to Stage 2 MU and with Stage 3 MU not slated to begin until 2017, it’s difficult for some providers to look beyond their current challenges with the program.

Not surprisingly, the American Hospital Association was quick to condemn the proposed rule which the organization says demonstrates that CMS “continues to create policies for the future without fixing the problems the program faces today,” according to a written AHA statement.

“In January, CMS promised to provide much-needed flexibility for the 2015 reporting year, which is almost half over,” states AHA. “Instead, CMS released Stage 3 rules that pile additional requirements onto providers.  It is difficult to understand the rush to raise the bar yet again, when only 35 percent of hospitals and a small fraction of physicians have met the Stage 2 requirements.”...
Health IT stakeholders march on DC


In its first stage, the meaningful use program delivered billions of dollars to technology companies that specialize in manufacturing EHR software, and ensured that clinicians of all types are no longer wasting time and resources on individual flesh and blood patients, but instead are meticulously collecting computable data for the sharing economy. This preliminary phase saw an order of magnitude increase in the number of small EHR companies and public/private, not-for-profit, certification and accreditation enterprises, along with a sharp decrease in the number of small medical practices. The second phase of the program weeded out most new entrants into the EHR software market, solidifying the gains for large technology vendors. Physicians became disenchanted, lost interest and lost joy in their profession. Participation in the program plummeted, posing a real threat to desired outcomes. Not to worry though, there is enacted regulation and legislation pending to crack a bigger and better whip on dissenters.
The brand new third phase of the meaningful use program sports a “keep your eyes on the prize” attitude and is forging ahead towards the finish line, bravely oblivious to the difficulties experienced in previous stages. Of course, six years into the program, one would expect to see some results indicating that all this money we are spending is moving the needle towards meeting the stated “do no evil” targets. There are no such preliminary results, and we are told that it is too soon to ask, because we won’t be able to see real improvements until the entire program, which is getting bigger and more expensive with each passing day, is completed. In the meantime, we are advised to entertain ourselves with an interminable stream of roadmaps, peppered with gaudy infographics supported by toddler level cartoonish videos, and continue to pay our taxes, leaving the thinking and planning to smarter people.
I will not waste your time with point by point analysis of the new meaningful use regulations, because I am certain the “industry” will produce the customary collateral, and because it is basically more of the same...
If a clinician has 12 minutes to see a patient, be empathetic, document the entire visit with sufficient granularity to justify an ICD-10 code, achieve 140 quality measures, never commit malpractice, and broadly communicate among the care team, it's not clear how the provider has time to perform a "clinical information reconciliation" that includes not only medications and allergies, but also problem lists 80 percent of the time.

Maybe we need to reduce patient volumes to 10 per day? Maybe we need more scribes or team-based care? And who is going to pay for all that increased effort in an era with declining reimbursements/payment reform?

- John Halamka, "The good, the bad and the ugly of Stage 3 MU"

Meaningful use stage 3: Making mHealth tools a necessity?
March 27, 2015 | Eric Wicklund - Editor, mHealthNews
mHealth is coming of age, thanks to meaningful use Stage 3.

That's the consensus of thought one week after the release of the proposed guidelines by the Centers for Medicare and Medicaid Services. The 300+ page document, in fact, makes clear for the first time that mHealth technologies – from secure messaging platforms to mobile devices – play a specific role in bringing the provider up to speed with today's healthcare landscape.

"MU 3, welcome to the 21st century," Ricky Bloomfield, MD, Duke Medicine's director of mobile technology strategy – who's working with Apple's HealthKit platform – recently wrote in his blog, "The Mobile Doc."

But some are also wondering whether legislating mHealth – or putting overworked providers under more pressure to change their workflows – will make them any more willing to adopt it.

Patient-generated data had long been considered by the Office of the National Coordinator for Health IT in this third round of incentives for healthcare providers. And the proposed rule, issued March 20, expands the avenues by which providers can converse with patients to include apps, portals and direct messaging. It also makes provisions for provider use of patient-generated data – from apps and wearables, for example.

Some experts are wondering, though, whether the ONC is being too specific in pushing mHealth on the provider community...
I'm all for "mHealth," but this could end up being wildly out of control and unsuccessful.

More to come...

Wednesday, March 18, 2015

March 2015 Senate committee Interoperababble hearing

I'm not sure how long the video will be up on the committee website. They have pdf files of the prepared witness testimonies on the site. There's a short edited hearing excerpt on YouTube here:

I watched the entire hour and 45 minutes. I've emailed to inquire as to whether there will be a transcript. It was a good hearing, as these things go. Very collegially bipartisan. Lots of concerns were aired about the progress and status of the Meaningful Use program, specifically around the continuing lack of effective and widespread "interoperability"/data exchange. (What I have coined as "Interoperababble®.")


As reported by iHealthBeat:
Senate Committee Addresses Meaningful Use, Interoperability

During a Senate Health, Education, Labor and Pensions Committee hearing on Tuesday, senators and stakeholders discussed issues regarding the meaningful use program and barriers to achieving interoperability, Healthcare Informatics reports.

Under the 2009 economic stimulus package, providers who demonstrate meaningful use of certified electronic health records can qualify for Medicaid and Medicare incentive payments (Perna, Healthcare Informatics, 3/17).

Hearing Details on Meaningful Use
In an opening statement, Senate committee Chair Lamar Alexander (R-Tenn.) said that "evidence suggests" that while the meaningful use program has spent $30 billion, it has so far failed to deliver on its promises to "improve care, improve coordination and reduce costs."

Specifically, Alexander noted that eligible professionals and hospitals have struggled with the program to the point where CMS has been forced to delay or update its requirements three times. He added, "Half of physicians have not met the requirements of the program and are now facing penalties"...
Yeah, OK, "so far failed to deliver..."

From my March 6th post:
For one thing, I have noted before what I call "Health IT Policy ADHD." Major legislation gets passed and funded, and when we don't get immediate, dazzling results, we go sour on it, lamenting its "failure," and calling for its demise. HITECH is not that old. There have really only been four years of full-bore boots-on-the-ground operation. REC contracts were let in 2010, and the RECs spent most of their first year getting their sea legs under them and scurrying about hustling skeptical clinical participants.


Gotta love it. From Healthcare IT News.
Epic trades jabs with CommonWell Health Alliance
Epic's 'rhetoric is a slap in the face'

Comments by Epic's head of interoperability in a Senate HELP Committee Tuesday have triggered members of the CommonWell Health Alliance, the subject of some of the comments, to fire back at the Verona, Wisconsin-based EHR giant.

Peter DeVault, Epic's director of interoperability, spoke before the Senate committee on the topic of interoperability and Epic's role in moving it forward. In the questions portion of the hearing, Senator Tammy Baldwin, D-Wisconsin, shifted the conversation, asking DeVault candidly: Why isn't Epic a part of CommonWell?...
The two-year old CommonWell has 1,000 physicians live on it, he said, compared to 100,000 physicians on Epic's Care Everywhere. Healthcare IT News reached out to CommonWell officials for the most up-to-date number but did not receive a response by publication time.
DeVault's comments didn't sit well with one of CommonWell's founding members (and one of Epic's chief competitors), Cerner.
His "rhetoric is a slap in the face to many parties working to advance interoperability," according to a statement released by Cerner officials shortly after the committee hearing. "It was discouraging to hear more potshots and false statements when it's clear there is real work to be done. We're committed to CommonWell as a practical, market-led way to achieve meaningful interoperability."
There's been tension between Epic and CommonWell ever since the latter group's launch at the 2013 HIMSS Annual Conference & Exhibition.
At that HIMSS13 announcement, athenahealth CEO Jonathan Bush emphasized that anyone was invited to CommonWell – even a vendor of "epic proportions."...

More, from Modern Healthcare:
'Doc fix' bill would overhaul health IT policy, too
By Darius Tahir  | March 19, 2015

The bill introduced Thursday to replace Medicare's sustainable growth-rate formula for physician pay would also significantly alter federal policy on health information technology...

Lawmakers have ... used the legislation to take aim at obstacles to realizing the benefits of IT in healthcare, particularly the lack of interoperability, or data-sharing, between electronic health records.

The SGR bill establishes a July 2016 deadline for HHS to develop metrics to quantify progress toward more data-sharing among hospitals and other providers. HHS would have to account for the progress by December 2018...
This bears watching. A long road yet to an actual law, though.

More to come...

Monday, March 16, 2015

"Your use of the Services constitutes your agreement to the Privacy Policy"

A new app service is touted as "The OpenTable for Healthcare."

"It's Free!" 
 OK, but, wait, there's more...

One of my Facebook pals posted this stuff to my wall. It links to a PBS story.
App’s terms of service give away your SSN, medical history

Do you know what you’re agreeing to when you click “I agree” on a website’s terms of service form?

In all likelihood, the answer is no. To read just the privacy statement from every different website they visit in a year, Americans would have to dedicate more than 30 eight-hour work days to the mind-numbing task, according to one study. And the privacy policy is only one part of a website’s terms of service.

Yet by signing terms of service, users may cede control of their intellectual property, agree to be used as research subjects and allow companies to collect and distribute their personal information, including, perhaps, medical information.

NewsHour Weekend Anchor and Senior Correspondent Hari Sreenivasan was troubled recently when he received an email from ZocDoc, a popular medical care scheduling service, describing the company’s updated terms of use...
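As a sanity check, the arithmetic behind that "30 eight-hour work days" figure roughly holds up. A quick back-of-the-envelope calculation (the inputs here are my own assumptions, not numbers taken from the study):

```python
# Assumed inputs: ~1,460 distinct sites visited per year (about four
# a day) and ~10 minutes to read each site's privacy policy.
sites_per_year = 1460
minutes_per_policy = 10

total_hours = sites_per_year * minutes_per_policy / 60
work_days = total_hours / 8  # eight-hour work days

print(round(total_hours), round(work_days))  # roughly 243 hours, 30 days
```

Tweak the assumptions however you like; you still land in the neighborhood of a month of full-time reading nobody is going to do.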
My reaction on Facebook.

Recall the privacy concerns aired in my prior post "Wearables update: The iWatch"?

Also, tangentially, a concern I tweeted while covering the Health 2.0 WinterTech Conference back in January:

ZocDoc: "It's Free!" OK, then, what's the business model? VC "built to flip"? Trafficking in patients' data, like Facebook and Twitter and all of these other "free" apps do with consumers' personal data?

Do you really want to grant Amazon's Jeff Bezos (a ZocDoc investor) access to your medical data?

Just asking.


OK, going back to the app I cited while covering WinterTech 2015.

From the "Notice of Privacy Practices" on their websites:

How is Patient Privacy Protected?
As the healthcare providers providing online medical services through Doctor on Demand (the “Healthcare Providers”, “us”, “we”, “our”), we understand that information about you and your health is personal. Because of this, we strive to maintain the confidentiality of your health information. We continuously seek to safeguard that information through administrative, physical and technical means, and otherwise abide by applicable federal and state guidelines.

How do we use and disclose health information?
We use and disclose your health information for the normal business activities that the law sees as falling in the categories of treatment, payment and health care operations. Below we provide examples of those activities, although not every use or disclosure falling within each category is listed:
  • Treatment – We keep a record of the health information you provide us. This record may include your test results, diagnoses, medications, your response to medications or other therapies, and information we learn about your medical condition through the online services. We may disclose this information so that other doctors, nurses, and entities such as laboratories can meet your healthcare needs.
  • Payment – We document the services and supplies you receive when we are providing care to you so that you, your insurance company or another third party can pay us. We may tell your health plan about upcoming treatment or services that require prior approval by your health plan.
  • Health Care Operations – Health information is used to improve the services we provide, to train staff and students, for business management, quality improvement, and for customer service. For example, we may use your health information to review our treatment and services and to evaluate the performance of our staff in caring for you.
We may also use your health information to:
  • Comply with federal, state or local laws that require disclosure.
  • Assist in public health activities such as tracking diseases or medical devices.
  • Inform authorities to protect victims of abuse or neglect.
  • Comply with Federal and state health oversight activities such as fraud investigations.
  • Respond to law enforcement officials or to judicial orders, subpoenas or other process.
  • Inform coroners, medical examiners and funeral directors of information necessary for them to fulfill their duties.
  • Facilitate organ and tissue donation or procurement.
  • Conduct research following internal review protocols to ensure the balancing of privacy and research needs.
  • Avert a serious threat to health or safety.
  • Assist in specialized government functions such as national security, intelligence and protective services.
  • Inform military and veteran authorities if you are an armed forces member (active or reserve).
  • Inform a correctional institution if you are an inmate.
  • Inform workers’ compensation carriers or your employer if you are injured at work.
  • Recommend treatment alternatives.
  • Tell you about health-related products and services.
  • Communicate within our organization for treatment, payment, or health care operations.
  • Communicate with other providers, health plans, or their related entities for their treatment or payment activities, or health care operations activities relating to quality assessment or licensing.
  • Provide information to other third parties with whom we do business, such as a record storage provider. However, you should know that in these situations, we require third parties to provide us with assurances that they will safeguard your information.
"Treatment," "Payment," and "Health Care Operations" are simply HIPAA Covered Entity language. The final bullet point begs some questions.
"Provide information to other third parties with whom we do business, such as a record storage provider. However, you should know that in these situations, we require third parties to provide us with assurances that they will safeguard your information."
What comprises "assurances that they will safeguard your information"?

Simple verbal or written "assurances"? Or legally-defensible documentation showing that their 3rd parties are in full compliance with HIPAA as it now pertains to BA's? That's what I'd want to see.


I hope the OCR and OIG will be doing their jobs with respect to these rapidly proliferating health app developer BAs that will be trafficking in ePHI. I intend to raise the issue.

See my 2012 post "45.CFR.164.3, 45.CFR.164.5, and 42.CFR.2." Again, HIPAA compliance is not for policy dilettantes. And, the requisite HIPAA Privacy and Security Officers are not simply hanging around Home Depot parking lots looking for day work.

More to come...