Wednesday, April 4, 2018

"The unsinkable rubber duckies of pseudoscience"


An excellent book I ran across cited at my requisite priority daily stop, Science Based Medicine.


From the preface:
We find ourselves living increasingly in a “post-truth” world, one in which emotions and opinions count for more than well-established findings when it comes to evaluating assertions. In much of contemporary Western culture, such catchphrases as “Don’t confuse me with the facts,” “Everyone is entitled to my opinion,” and “Trust your gut” capture a troubling reality, namely, that many citizens do not— and in some cases, apparently cannot— adequately distinguish what is true from what they wish to be true. This overreliance on the “affect heuristic,” the tendency to gauge the truth value of a proposition based on our emotional reactions to it (Slovic, Finucane, Peters, and MacGregor, 2007), frequently leads us to accept dubious assertions that warm the cockles of our hearts, and to reject well-supported assertions that rub us the wrong way. We are all prone to this error, but one hallmark of an educated person is the capacity to recognize and compensate for it, at least to some degree. 

We also live increasingly in an age in which intuition is prized above logical analysis. Even some prominent science writers (Horgan, 2005) and prominent scientists (Diamond, 2017) have maintained that “common sense” should be accorded more weight in the adjudication of scientific claims than it presently receives. Such endorsements of everyday intuition are surprising, not to mention ahistorical, given overwhelming evidence that commonsense reasoning often contributes to grossly mistaken conclusions (Chabris and Simons, 2011; Kahneman, 2011; Lilienfeld, 2010). As one example, the history of medicine is a much-needed reminder that excessive reliance on gut hunches and untutored behavioral observations can cause scholars and practitioners to embrace ineffective and even harmful interventions (for a horrifying example, see Scull, 2007). Indeed, according to many medical historians, prior to 1890 the lion’s share of widely administered physical interventions were worthless and in some cases iatrogenic (Grove and Meehl, 1996). 
These trends are deeply concerning in today’s age of virtually nonstop data flow via social media, email, cable television, and the like. We live not merely in an information age, but in a misinformation age. In 1859 the author and preacher C. H. Spurgeon famously wrote that “A lie will go round the world while truth is pulling its boots on” (in a curious irony, this quote has itself been widely misattributed to Mark Twain). If Spurgeon’s dictum was true in 1859, it is all the more true in today’s world of instantaneous information transfer across the world. Not surprisingly, the levels of many pseudoscientific beliefs among the general population are high, and may well be increasing…
...The present book is a superbly edited presentation of valuable lessons regarding the application of scientific thinking to the evaluation of potentially pseudoscientific and otherwise questionable claims. In the section to follow, I attempt to distill these lessons into ten user-friendly take-home messages. It may strike some readers as odd to begin a book with a list of lessons, given that such lists usually crop up in a book’s afterword or postscript. But I hope to provide readers with a roadmap of sorts, pointing them to integrative principles to bear in mind while consuming this book’s diverse chapters. In generating these lessons, I have drawn in part on the chapters of this volume, and in part on the extant literature on cognitive biases and scientific thinking.

(1) We are all biased. Yes, that includes you and me. Some evolutionary psychologists maintain that certain biases in thinking are the products of natural selection (Haselton and Buss, 2000). For example, under conditions of uncertainty we are probably evolutionarily predisposed toward certain false positive errors (Shermer, 2011). When walking through a forest, we are generally better off assuming that a moving stick-like object is a dangerous snake rather than a twig being propelled by the wind, even though the latter possibility is considerably more likely. Better safe than sorry. Whether or not these evolutionary psychologists are correct, it seems likely that many cognitive biases are deeply ingrained in the human cognitive apparatus.

(2) We are largely unaware of our biases. Research on bias blind spot (Pronin, Lin, and Ross, 2002) demonstrates that most of us can readily identify cognitive biases in just about everyone except for one person— ourselves. As a consequence of this metabias, we often believe ourselves largely immune to serious errors in thinking that afflict others. We are not merely biased; we tend to be blind to our own biases. As a consequence, we are often overconfident of our beliefs, including our false beliefs.

(3) Science is a systematic set of safeguards against biases. Despite what most of us learned in high school, there is probably no single “scientific method”— that is, a unitary recipe for conducting science that cuts across all research domains (McComas, 1996). Instead, what we term “science” is almost certainly an exceedingly diverse, but systematic and finely honed, set of tools that humans have developed over the centuries to compensate for our species’ biases (Lilienfeld, 2010). Perhaps foremost among these biases is confirmation bias, the propensity to selectively seek out, selectively interpret, and recall evidence that supports our hypotheses, and to deny, dismiss, and distort evidence that does not (Nickerson, 1998). As social psychologists Carol Tavris and Elliott Aronson (2007) have observed, science is a method of arrogance control; it helps to keep us honest.

(4) Scientific thinking does not come naturally to the human species. As many authors have noted, scientific thinking is unnatural (McCauley, 2011). It needs to be acquired and practiced assiduously. Some authors (e.g., Gilbert, 1991) have even contended that our cognitive apparatus is a believing engine. We believe first, question later (see also Kahneman, 2011). In contrast, some eminent developmental psychologists and science educators have argued that human babies and young children are “natural-born scientists” (e.g., Gopnik, 2010). True, babies are intellectually curious, seek out patterns, and even perform miniature experiments on the world. But they are not good at sorting out which patterns are real and which are illusory. Moreover, the fashionable view that babies are natural scientists is difficult to reconcile with the fact that science emerged relatively late in human history. According to some scholars, science arose only once in the course of the human species, namely in ancient Greece, not reappearing in full-fledged form until the European enlightenment of the eighteenth century (Wolpert, 1993). Such historical realities are not easily squared with claims that science is part-and-parcel of the human cognitive apparatus.

(5) Scientific thinking is exasperatingly domain-specific. Findings in educational psychology suggest that scientific thinking skills generalize slowly, if at all, across different domains. This point probably helps to explain why it is so difficult to teach scientific thinking as a broad skill that can be applied to most or all fields (Willingham, 2007). This sobering truth probably also helps to explain why even many Nobel Prize winners and otherwise brilliant thinkers can easily fall prey to the seductive sway of pseudoscience. Consider Linus Pauling, the American biochemist and two-time Nobel Prize winner who became convinced that orthomolecular therapy, involving ingestion of massive doses of vitamin C, is an effective treatment for schizophrenia, cancer, and other serious maladies (see Lilienfeld and Lynn, 2016, and Offit, 2017, for other examples). We should remind ourselves that none of us is immune to the temptations of specious claims, particularly when they fall well outside of our domains of expertise.

(6) Pseudoscience and science lie on a spectrum. As I noted earlier, there is almost surely no bright line distinguishing pseudoscience from science. Like many pairs of interrelated concepts, such as hill versus mountain and pond versus lake, pseudoscience and science bleed into each other imperceptibly. My campus at Emory University has a modestly sized body of water that some students refer to as a large pond, others as a small lake. Who’s right? Of course, there’s no clear-cut answer. The pseudoscience-science distinction is probably similar. Still, as I have pointed out, the fact that there is no categorical distinction between pseudoscience and science does not mean that we cannot differentiate clear-cut exemplars of each concept. Just as no one would equate the size of a small pond in a local city park with the size of Lake Michigan, few of us would equate the scientific status of crystal healing with that of quantum mechanics.

(7) Pseudoscience is characterized by a set of fallible, but useful, warning signs. Some contributors to this edited volume appear to accept the view that pseudoscience is a meaningful concept, whereas others appear not to. Following the lead of the philosopher of science Larry Laudan (1983), the latter authors contend that, because the effort to demarcate pseudoscience from science has failed, there is scant substantive coherence to the pseudoscience concept. My own take, for what it is worth, is that pseudoscience is a family-resemblance concept (see also Pigliucci and Boudry, 2013) that is marked by a set of fallible, but nonetheless useful, warning signs. Such warning signs differ somewhat across authors, but often comprise an absence of self-correction, overuse of ad hoc maneuvers to immunize claims from refutation, use of scientific-sounding but vacuous language, extraordinary claims in the absence of compelling evidence, overreliance on anecdotal and testimonial assertions, avoidance of peer review, and the like (Lilienfeld, Lynn, and Lohr, 2014). Despite their superficial differences, these warning signs all reflect a failure to compensate for confirmation bias, an overarching characteristic that sets them apart from mature sciences.

(8) Pseudoscientific claims differ from erroneous claims. Intuitively, we all understand that there is a fundamental difference between fake news and false news. The latter is merely incorrect, and typically results from the media getting things wrong. In contrast, the former is deceptive, often intentionally so. Similarly, many and arguably most assertions in science are surely erroneous, but that does not render them pseudoscientific. Instead, pseudoscientific claims differ from incorrect scientific claims, and in many ways are far more pernicious, because they are deceptive. Because they appear at first blush to be scientific, they can fool us. To most untrained eyes, they appear to be the real thing, but they are not.

(9) Scientific and pseudoscientific thinking are cut from the same basic psychological cloth. In many respects, this is one of the most profound insights imparted by contemporary psychology. Heuristics— mental shortcuts or rules of thumb— are immensely valuable in everyday life; without them, we would be psychologically paralyzed. Furthermore, in most cases, heuristics lead us to approximately correct answers. For example, if three people wearing masks and wielding handguns break into a bank and tell all of us to drop to the floor, we would be well advised to rely on the representativeness heuristic, the principle that like goes with like (Tversky and Kahneman, 1974). By doing so, we would conclude that because these individuals resemble our prototype of bank robbers, they are probably bank robbers. In fact, the invocation of the heuristic in this case and others is not only wise, but usually correct. Still, when misapplied, heuristics can lead to mistaken conclusions. For example, many unsubstantiated complementary and alternative medical remedies draw on the representativeness heuristic as a rationale for their effectiveness (Nisbett, 2015). Many companies market raw brain concentrate in pill form to enhance memory and mood (Gilovich, 1991). The reasoning, apparently, is that because psychological difficulties stem from an inadequately functioning brain, “more brain matter” will somehow help the brain to work better.

(10) Skepticism differs from cynicism. Skepticism has gotten a bad rap in many quarters, largely because it is commonly confused with cynicism. The term “skeptic” derives from the Greek word “skeptikos,” meaning “to consider carefully” (Shermer, 2002). Skepticism requires us to keep an open mind to new claims but to insist on compelling evidence before granting them provisional acceptance. In this respect, skepticism differs from cynicism, which implies a knee-jerk dismissal of implausible claims before we have had the opportunity to investigate them carefully (Beyerstein, 1995). In fairness, some individuals in the “skeptical movement” have at times blurred this crucial distinction by rejecting assertions out of hand. Skeptics need to be on guard against their propensities toward disconfirmation bias, a variant of confirmation bias in which we reflexively reject assertions that challenge our preconceptions (Edwards and Smith, 1996).

If readers keep the foregoing ten lessons in mind while reading this volume, they should be well equipped to navigate their way through its stimulating chapters and their broader implications. These lessons should also remind readers that we are all susceptible to questionable claims, and that science, although hardly a panacea, is ultimately our best bulwark against our own propensities toward irrationality.

Kaufman, Allison B.; Kaufman, James C. (2018-01-26). Pseudoscience: The Conspiracy Against Science (MIT Press) (Kindle Locations xi-313). The MIT Press. Kindle Edition.
Yeah. Not exactly a new concern. Below, a book in my stacks since 1996.


Also, sitting right behind that one on the shelf (circa 1991).


One of my favs. I think I gave away my original hardbound edition. Have it in paperback now.

Ahhh... what do I know? "I am not a scientist."

Apropos of the topic broadly, I know you would enjoy this book:


An "ology" of error. I'd cited it here several years ago. Kathryn Schulz rocks!

Oh, and, one more thing.


See last year's post citing it, "Just the facts..." Also,

Another recommendation

A PRIOR CITE OF A BOOK REVIEWED AT SCIENCE BASED MED

I previously mentioned this book:


From the SBM review,
The Ethics of CAM: More Harm than Good?
A new book examines the ethics of Complementary and Alternative Medicine (CAM). Ernst and Smith demonstrate that CAM is inherently unethical and does more harm than good.


Edzard Ernst is arguably the world’s foremost expert on the claims and the evidence (or lack thereof) for Complementary and Alternative Medicine (CAM). Now he has teamed up with a medical ethicist, Kevin Smith, to co-author a new book, More Harm than Good? The Moral Maze of Complementary and Alternative Medicine. Much has been written on CAM, but this book takes a new approach. It asks if CAM is ethical, and answers with a resounding “No.”...
Goes to scientific thinking, which ethically requires, among other attributes, "competence." From the book:
1. Clinical Competence

To what extent can practitioners and promoters of CAM be relied upon to practice or recommend safe and effective treatments? This is the question that we shall consider in this chapter.

The idea that it is morally required for healthcare professionals to practice safe and effective forms of therapy— in other words, to be competent— is self-evidently correct and incontestable. The ethical reasons for practitioner competence include:

a. To avoid patients being harmed by unsafe therapies;  
b. To avoid patients being harmed through failing to benefit from the most effective therapies available;  
c. To avoid harm to patients through the promotion of a general belief in ‘alternatives’ to proven forms of medicine.
Thus, avoidance of harm is the main ethical rationale underpinning the presumption that competence is an ethical requirement. To prevent harm, there exists a moral imperative on those practicing or recommending any form of healthcare to ensure that their knowledge is thorough and up-to-date. More fundamentally, this knowledge must be scientifically and logically valid. Moreover, it can never be sufficient for healthcare practitioners to merely act in good faith: regardless of how sincerely a false medical belief is held, the agent who acts— however honestly— on such a belief is liable to become the subject of justifiable moral opprobrium. For example, consider a religiously motivated physician who insists on treating his patients by intercessory prayer. This physician is behaving in an ethically reprehensible manner, and the fact that he truly believes in prayer as the best form of therapy does not justify this practice, nor does it excuse him morally.

Regrettably, the reality is that many CAM proponents allow themselves to be deluded as to the efficacy or safety of their chosen therapy, thus putting at risk the health of those who heed their advice or receive their treatment…

1 What Does ‘Competent’ Mean?

Most CAM practitioners believe themselves to be competent. This raises some important questions: amongst CAM advocates and practitioners, what is competence taken to mean?...


Ernst, Edzard; Kevin Smith. More Harm than Good?: The Moral Maze of Complementary and Alternative Medicine (Kindle Locations 543-562). Springer International Publishing. Kindle Edition.
Prior to this material, the authors provide a very good summary of the principal schools of "ethical" reasoning.
Introduction to Medical Ethics

This introduction is aimed at those readers who are unfamiliar with the principles, frameworks and approaches used in medical ethics. Those already familiar with the basic ethical concepts used in medicine are invited to ignore this section and instead turn directly to Chapter 1.

Medical ethics is a scholarly discipline and like all academic areas contains its own language and theory. A number of formal theoretical frameworks and principles are utilised by medical ethicists in their analyses of ethical problems. However, in this book our ethical points will be based on straightforward argumentation wherever possible. We will refer to formal academic ethical theory only where necessary and always avoid impenetrable or abstruse theory.

However, since ethical frameworks and principles form the substance of professional ethical discourse, it will be useful for the reader to have a broad understanding of these. And, while avoiding undue reliance on formal ethical theory, we will where appropriate utilise theoretical approaches to help analyse specific ethical issues that arise in CAM.

Throughout this book, our ethical considerations of the issues raised by CAM will be based on an ethical approach known as utilitarianism. This ethical framework, which seeks to evaluate the consequences of behaviours and decisions in medicine, is explained in some depth below. Utilitarianism is frequently employed in medical ethics; however, it is not the only approach available or defendable. We think it will be valuable for readers to have some knowledge of the other major ethical approaches that are used in medicine, since this will provide context for the utilitarian approach, and help to show how the other major ethical approaches reach broadly the same basic conclusions about the ethics of CAM as arrived at by utilitarian reasoning.

Thus, we set out below the major approaches used in medical ethics. (It is worth noting that these approaches are also employed in ethics more generally, not only in the domain of medicine.)
[ibid, Kindle Locations 51-66]
Given that my 1998 grad degree is in "Ethics and Policy Studies," I found all of this intrinsically interesting and mostly a quite familiar "refresher course."

All of the foregoing amount to shots below the waterlines of "The unsinkable rubber duckies of pseudoscience." Highly recommended readings.

CODA


My own painful experience dealing with medical pseudoscience goes back to the days of my late elder daughter's illness. From my "One in Three" essay:
'Arrogant, narrow-minded, greedy, and indifferent?'
Is science the enemy? To the extremist "alternative healing" advocate, the answer is a resounding 'yes'! A disturbing refrain common to much of the radical "alternative" camp is that medical science is "just another belief system," one beholden to the economic and political powers of establishment institutions that dole out the research grants and control careers, one that actively suppresses simpler healing truths in the pursuit of profit, one committed to the belittlement and ostracism of any discerning practitioner willing to venture "outside the box" of orthodox medical and scientific paradigms…
Twenty years later, I keep having to fend off the same bullshit while dealing with Danielle's illness.
_____________

More to come...
