The President will ask the 2015 Congress for a couple hundred million dollars to spur applied research into genomic-based "precision medicine." Below, from the current issue of The New Yorker:
The Problem with Precision Medicine
BY CYNTHIA GRABER
Last Friday, in a speech at the White House, President Obama unveiled what he called his Precision Medicine Initiative, a two-hundred-and-fifteen-million-dollar plan to collect genetic information from a million American volunteers in order to further the development of personalized, genetics-based medical treatments. Obama called precision medicine “one of the greatest opportunities for new medical breakthroughs that we have ever seen,” saying that it promised to deliver “the right treatments at the right time, every time, to the right person.” So far, however, the excitement surrounding personalized medicine has outpaced the science. DNA testing has become increasingly useful in the detection and treatment of various conditions, including cancer, intellectual developmental delays, birth defects, and diseases of unknown origin, and the cost of genetic analyses has dropped even as the speed with which their results are delivered has risen. Nevertheless, for most people, genetic medicine is not yet delivering customized care. As scientists continue to draw connections between DNA data and health outcomes, the problem of interpretability continues to grow. Many doctors are simply not qualified to make sense of genetic tests, or to communicate the results accurately to their patients.

Indeed. And, in addition to the chronic paucity of "omics"-skilled clinical analysts, what additional (unfunded) Health IT burden will devolve to the physician once the myriad "omics" data land in her EHR?
David Miller, a geneticist at Boston Children’s Hospital, ran into that problem last fall, when a couple brought their five-year-old daughter to see him. The girl had poor coördination and was short for her age, and she was prone to infections. Her previous physician had ordered a DNA test to determine whether her physical developmental delays were tied to a known genetic condition. If they were, perhaps the results could suggest a course of care. When the test came back, it showed a missing fragment of code on chromosome 22—the telltale marker for something called DiGeorge syndrome. The prognosis that the physician gave the girl’s parents was dire: common symptoms of DiGeorge include learning and growth delays and heart defects, and patients are at an increased risk of developing psychiatric disorders such as schizophrenia. There is no known cure. When Miller reëxamined the test results, however, he noticed that the deletion on chromosome 22 was not in the same location as the one that causes DiGeorge; it was likely an insignificant genetic blip. The girl did not have the syndrome, and, her parents were relieved to hear, there was no need to monitor her heart and mental health.
Mistakes such as the one that Miller caught are unfortunately common, according to the medical geneticists and genetic counselors with whom I’ve spoken. (Medical geneticists are physicians who have gone through medical school and then trained in genetics; genetic counselors have obtained a specialized master’s degree.) And those mistakes can cause greater harm than merely stoking the anxieties of a sick person or her parents...
Part of the dearth of genetics expertise among physicians stems from the fact that many of those currently practicing went to medical school before the human genome was sequenced. Mary Norton, a clinical geneticist in the field of high-risk obstetrics at the University of California, San Francisco, told me that when she took her board exams, about twenty years ago, she had to memorize all of the genes that had been identified and associated with diseases. At the time, there were fewer than a dozen; now a single panel might test for a hundred. “It’s very complicated, especially for generalists, who have a million other things on their minds besides genetics,” she told me...
In theory, doctors could turn to specialists to fill in their gaps in knowledge, but, depending on where they live, they may be hard pressed to find someone qualified. According to the American College of Medical Genetics and Genomics (A.C.M.G.), there aren’t enough trained medical geneticists to fill all the jobs currently on offer. As a result, according to Rajkovic, doctors in need of an education in genetic tests receive instruction from the testing companies themselves—the same companies that, as he notes in his DiGeorge case study, tend to push new products without sufficient evidence of their efficacy. “In this race to offer more value, they are jumping the gun, in my opinion,” Rajkovic told me. Many geneticists also pointed out that companies’ marketing materials make it seem as though the tests are infallible. As the testing firm Sequenom puts it in one advertisement: “Positive or negative results. Never maybe.”
General-care physicians seem to understand that their lack of training is a hindrance. Last fall, a survey appeared in the journal Genetics in Medicine that examined data from thirty-eight other studies; a majority of respondents expressed reservations about not fully understanding test results or not being able to devote the time necessary to discuss testing options and outcomes with patients...
Speaking of therapeutics, The New Yorker is on a roll with this issue.
Annals of Medicine, FEBRUARY 9, 2015 ISSUE

Very interesting, lengthy article. Highly recommended.
The Trip Treatment
Research into psychedelics, shut down for decades, is now yielding exciting results.
BY MICHAEL POLLAN
On an April Monday in 2010, Patrick Mettes, a fifty-four-year-old television news director being treated for a cancer of the bile ducts, read an article on the front page of the Times that would change his death. His diagnosis had come three years earlier, shortly after his wife, Lisa, noticed that the whites of his eyes had turned yellow. By 2010, the cancer had spread to Patrick’s lungs and he was buckling under the weight of a debilitating chemotherapy regimen and the growing fear that he might not survive. The article, headlined “HALLUCINOGENS HAVE DOCTORS TUNING IN AGAIN,” mentioned clinical trials at several universities, including N.Y.U., in which psilocybin—the active ingredient in so-called magic mushrooms—was being administered to cancer patients in an effort to relieve their anxiety and “existential distress.” One of the researchers was quoted as saying that, under the influence of the hallucinogen, “individuals transcend their primary identification with their bodies and experience ego-free states . . . and return with a new perspective and profound acceptance.” Patrick had never taken a psychedelic drug, but he immediately wanted to volunteer. Lisa was against the idea. “I didn’t want there to be an easy way out,” she recently told me. “I wanted him to fight.”
Patrick made the call anyway and, after filling out some forms and answering a long list of questions, was accepted into the trial. Since hallucinogens can sometimes bring to the surface latent psychological problems, researchers try to weed out volunteers at high risk by asking questions about drug use and whether there is a family history of schizophrenia or bipolar disorder. After the screening, Mettes was assigned to a therapist named Anthony Bossis, a bearded, bearish psychologist in his mid-fifties, with a specialty in palliative care. Bossis is a co-principal investigator for the N.Y.U. trial.
After four meetings with Bossis, Mettes was scheduled for two dosings—one of them an “active” placebo (in this case, a high dose of niacin, which can produce a tingling sensation), and the other a pill containing the psilocybin. Both sessions, Mettes was told, would take place in a room decorated to look more like a living room than like a medical office, with a comfortable couch, landscape paintings on the wall, and, on the shelves, books of art and mythology, along with various aboriginal and spiritual tchotchkes, including a Buddha and a glazed ceramic mushroom. During each session, which would last the better part of a day, Mettes would lie on the couch wearing an eye mask and listening through headphones to a carefully curated playlist—Brian Eno, Philip Glass, Pat Metheny, Ravi Shankar. Bossis and a second therapist would be there throughout, saying little but being available to help should he run into any trouble...
I met Bossis last year in the N.Y.U. treatment room, along with his colleague Stephen Ross, an associate professor of psychiatry at N.Y.U.’s medical school, who directs the ongoing psilocybin trials. Ross, who is in his forties, was dressed in a suit and could pass for a banker. He is also the director of the substance-abuse division at Bellevue, and he told me that he had known little about psychedelics—drugs that produce radical changes in consciousness, including hallucinations—until a colleague happened to mention that, in the nineteen-sixties, LSD had been used successfully to treat alcoholics. Ross did some research and was astounded at what he found.
“I felt a little like an archeologist unearthing a completely buried body of knowledge,” he said. Beginning in the nineteen-fifties, psychedelics had been used to treat a wide variety of conditions, including alcoholism and end-of-life anxiety. The American Psychiatric Association held meetings centered on LSD. “Some of the best minds in psychiatry had seriously studied these compounds in therapeutic models, with government funding,” Ross said.
Between 1953 and 1973, the federal government spent four million dollars to fund a hundred and sixteen studies of LSD, involving more than seventeen hundred subjects. (These figures don’t include classified research.) Through the mid-nineteen-sixties, psilocybin and LSD were legal and remarkably easy to obtain. Sandoz, the Swiss chemical company where, in 1938, Albert Hofmann first synthesized LSD, gave away large quantities of Delysid—LSD—to any researcher who requested it, in the hope that someone would discover a marketable application. Psychedelics were tested on alcoholics, people struggling with obsessive-compulsive disorder, depressives, autistic children, schizophrenics, terminal cancer patients, and convicts, as well as on perfectly healthy artists and scientists (to study creativity) and divinity students (to study spirituality). The results reported were frequently positive. But many of the studies were, by modern standards, poorly designed and seldom well controlled, if at all. When there were controls, it was difficult to blind the researchers—that is, hide from them which volunteers had taken the actual drug. (This remains a problem.)
By the mid-nineteen-sixties, LSD had escaped from the laboratory and swept through the counterculture. In 1970, Richard Nixon signed the Controlled Substances Act and put most psychedelics on Schedule 1, prohibiting their use for any purpose. Research soon came to a halt, and what had been learned was all but erased from the field of psychiatry. “By the time I got to medical school, no one even talked about it,” Ross said...
“Ineffability” is a hallmark of the mystical experience. Many struggle to describe the bizarre events going on in their minds during a guided psychedelic journey without sounding like either a New Age guru or a lunatic. The available vocabulary isn’t always up to the task of recounting an experience that seemingly can take someone out of body, across vast stretches of time and space, and include face-to-face encounters with divinities and demons and previews of their own death...

Yeah. Reminds me of something I posted on another of my blogs: a speculation on the "afterlife."
I wrote my first song 45 years ago, in 1969. Mushroom-assisted. OK to admit that kind of stuff now, I guess.
Also noteworthy in my periodicals, apropos of "wearable" health tech. From The Atlantic:
What My Hearing Aid Taught Me About the Future of Wearables

Another fine read. Well worth your time.
As human-enhancing technology becomes tinier and more advanced, the price of progress is complexity.
I was into wearables before there was Google Glass, Apple Watch, or the Moto 360. I was into them before cheap devices told you how much you had walked, run, slept, or eaten. In fact, I’ve been into them for so long now that I’m not quite sure when it started. I think it was around when I was 5, in 1986.
The wearables I started wearing as a kid and still wear today are hearing aids—or, as my audiologist euphemistically calls them, "amplification devices." Although many will never need hearing aids, today’s tech firms are making it likely that, someday soon, tiny computers will become extensions of your body, just as they have been part of mine for nearly 30 years. Thanks to that experience, I feel as though I’ve had a sneak peek into our wearable future—and I can make some predictions about what it will look like.
To be fair, hearing aids are quite different from the current array of consumer wearables. Hearing aids are medical devices designed to make up for a physical impairment. By contrast, consumer wearables like the Apple Watch are luxury items that let us read text messages and measure our fitness. This distinction has legal significance: The FDA tightly regulates any device that tries to either diagnose or treat a medical condition. That means certain features are unlikely to ever exist in a consumer wearable, unless Tim Cook wants to sell watches that require a doctor’s prescription.
But despite initial appearances, both medical and consumer wearables share a few important goals.
Broadly speaking, both types of wearables aim to fill gaps in human capacity. As Sara Hendren aptly put it, "all technology is assistive technology." While medical devices fill gaps created by disability or illness, consumer wearables fill gaps created by being human. For example, evolution hasn’t given us brain wi-fi, yet.
Both kinds of wearables also need to justify being attached to our bodies. This seems pretty obvious for hearing aids, but it is just as true for consumer devices. A wearable that serves as only a slightly more convenient screen for your phone is hardly reason for the average person to spend hundreds of dollars. Instead, wearables need to offer a feature that works best when in close contact with your body, like measuring heart rate or offering haptic feedback.
Also, both types of wearables need to embed themselves seamlessly into our experiences. If a wearable obstructs your experience of the real world, or is a distraction, it’s likely to end up on a shelf instead of your wrist. That’s not to say that they don’t take getting used to—even after a lifetime of wearing hearing aids, it still takes me several weeks to adjust to a new pair. But after that period, a well-made wearable should seem like a seamless extension of our bodies.
In my current role at the Berkman Center for Internet & Society at Harvard University, I’m lucky to be able to study something I care deeply about: technology’s impact on our lives. I’m sure my interest partly arises from how I’ve depended on technology for as long as I can remember. I don’t know with certainty how consumer wearables will develop, but what I do know is how much hearing aids have changed over the last 30 years. And I have some insight into what sensory-enhancing wearables—like hearing aids, and unlike data-recording wearables like pedometers—could someday become. Over the next few years, I expect that we will see four trends, rich in both opportunity and peril, shape the evolution of these wearables from toys into tools...
The price of progress ... is complexity. Older hearing aids had limited customization, altered sounds in very basic and predictable ways, failed in obvious ways, and didn’t collect data. Now things are different. The endless customization available in new aids creates more opportunities for mistakes. The complex algorithms make it harder to diagnose problems. The total substitution of experience stifles attempts to identify errors. And increasing data collection means hearing aids may soon have to grapple with thorny issues of privacy.
The same holds true for consumer wearables. If they follow the path of hearing aids, future generations of wearables will be more immersive, more complex, more difficult to troubleshoot, and more pervasive in their data collection. As long as we see wearables as toys or luxury goods, it is easy to write off these challenges. But there is a real opportunity for wearables to improve the lives of many in substantial ways just as they’ve improved my life since 1986. To realize those improvements, we cannot ignore these trends, and we must take wearables seriously as the indispensable tools they will soon become.
Apropos, from ALEXANDRA DRANE at THCB:
Most of us are pretty fundamentally lonely. Lost in a world of rapidly increasing communication mechanisms that with all their connecting are still, somehow, leaving us feeling incredibly alone. Alone in a world where it looks like everybody else is doing just fine (‘Check out how happy I am in this Facebook photo!!’) – and we’re the only ones kind of failing at everything.
Let’s compare those two lists – what matters to me as a human, and what matters to the healthcare system. What matters to me is my financial stress, my caregiver stress, my relationship stress, my job stress….and what matters to the healthcare system is checking the boxes on ‘quality’ metrics designed solely around a traditional definition of health.

Yeah. Interesting observations below.
DAVID SIMON: [U]ltimately, capitalism has not delivered on the promise to be a measurement of anything other than money, of profit. And if profit is your only metric, man, what are you building? Where does the environment fit into that? Where does human potential and you know, for anything other than having some money in your hand, you know, where does, where do people stand when they have health needs or when they make a mistake in life? You know, it was said a long time ago you judge a society by its hospitals and its prisons. By that standard we're, you know, we have a lot to be ashamed of...
More to come...