Search the KHIT Blog

Friday, March 31, 2023

A bad week in the U.S.

 
Yeah, it's "unprecedented." 
 
So was the 2016 presidential election of a lifelong con-man, grifter, and white collar criminal with zero experience in public office and a thermonuclear ego. And, that is what got us here.
…The indictment in New York will deepen the conviction among some loyalists that Trump is a victim of the “deep state” and even more deserving of their devotions than before. Trump, of course, does not care what bonfire he lights. “For those who have been wronged and betrayed,” he said recently, “I am your retribution.” Trump is practically promising trouble. Any judge who encounters him in court—in New York or beyond—will be hard-pressed to prevent the former President from issuing threats of violence.

So, yes, many of Trump’s most ardent supporters will be enraged by this indictment and any others. There is no telling if that will lead to trouble in the streets. But will it win Trump back the Presidency? Will his getting arrested in the middle of a national campaign really win back suburban voters and independents who voted for him in 2016 but abandoned him in 2020 for Joe Biden? How many voters will say, in effect, Yes, I was deeply embarrassed and ashamed by the spectacle of January 6th, but now having seen Donald Trump in the dock and hearing the resonant phrases “porn star,” “catch and kill,” and “hush money”—much less “incitement to insurrection” and “election fraud”—I am summoned back to the MAGA fold? Once more, the American political imagination reels and the stakes for American democracy could not be higher.
[ David Remnick ]

Right.
 
The escalating vituperative calls for lethal armed MAGA "civil war" violence are burning up the dark corners of the internet. And, also being (plausible deniability) dog-whistled by GOP members in Congress.

Arraignment is scheduled for Tuesday in NYC. It's gonna be a Level 5 shitshow. I hope no one is injured or killed, but I would hardly be surprised.

More David Remnick

Trump is, if anything, more unhinged than he was in his final days in the White House. At the recent CPAC convention his speech had a quality of wildness that made the ranting nativism of his 2016 inaugural seem as mild as the murmurings of Martin Van Buren or Warren G. Harding. As you read, remember this is the leading candidate for the Republican nomination:

The sinister forces trying to kill America have done everything they can to stop me, to silence you, and to turn this nation into a socialist dumping ground for criminals, junkies, Marxists, thugs, radicals, and dangerous refugees that no other country wants. No other country wants them. If those opposing us succeed, our once beautiful U.S.A. will be a failed country that no one will even recognize. A lawless, open borders, crime-ridden, filthy, communist nightmare. That’s what it’s going and that’s where it’s going. . . . That’s why I’m standing before you, because we are going to finish what we started. We started something that was a miracle. We’re going to complete the mission. We’re going to see this battle through to ultimate victory.
Another bad week may be drawing nigh. And, on top of all this, these tornadoes. Lordy mercy.

TRUMP AT HIS POTUS OUTSET
Addressing the CIA.
Addressing the Boy Scouts. 
 
OH, ONE MORE THING:
__________
 

Wednesday, March 29, 2023

On "Data"

This data is ... These Data ARE


The Amazon blurb:
A sweeping history of data and ~~its~~ their technical, political, and ethical ~~impact~~ impacts on our world.

From facial recognition—capable of checking people into flights or identifying undocumented residents—to automated decision systems that inform who gets loans and who receives bail, each of us moves through a world determined by data-empowered algorithms. But these technologies didn’t just appear: they are part of a history that goes back centuries, from the census enshrined in the US Constitution to the birth of eugenics in Victorian Britain to the development of Google search.

Expanding on the popular course they created at Columbia University, Chris Wiggins and Matthew L. Jones illuminate the ways in which data ~~has~~ have long been used as ~~a tool~~ tools and ~~a weapon~~ weapons in arguing for what is true, as well as a means of rearranging or defending power. They explore how data ~~was~~ were created and curated, as well as how new mathematical and computational techniques developed to contend with ~~that~~ those data serve to shape people, ideas, society, military operations, and economies. Although technology and mathematics are at its heart, the story of data ultimately concerns an unstable game among states, corporations, and people. How were new technical and scientific capabilities developed; who supported, advanced, or funded these capabilities or transitions; and how did they change who could do what, from what, and to whom?

Wiggins and Jones focus on these questions as they trace data’s historical arc, and look to the future. By understanding the trajectory of data—where ~~it~~ they ~~has~~ have been and where ~~it~~ they might yet go—Wiggins and Jones argue that we can understand how to bend ~~it~~ them to ends that we collectively choose, with intentionality and purpose.
Yeah, I know. I've pretty much lost that "data are" fight. Just shows my age and pedantic irascibility. Perhaps this was just from an Amazon copywriter.
 
In the early 1990s, in the wake of more than 5 years as a programmer and SPC analyst in an Oak Ridge radioanalytical lab prior to moving to Las Vegas, I served a tenure as Technical Editor/Writer for a digital industrial diagnostics firm in West Knoxville, TN (portable FFT analyzers and related hardware and software products). I would routinely get tech paper drafts from the engineers and promptly change all the "data is" stuff to the proper "data are." My good-ole-boy redneck boss Forrest, our crew-cut mechanical engineer VP of marketing, would then red-pen my corrections back to "data is."

"It just looks wrong."

Yeah. Too bad.

I would quietly then change his edits right back, and my final cuts went off to production.

LOL: He once took issue with some of my own copy, and forbade me from henceforth using the words "affixed" and "atop"— e.g., "With the remote sensors firmly affixed atop the turbine housings..."

"They just look faggy."—Forrest.

Groan. Whatever. And, data ARE.
 
Letter to the editor, Science Magazine, April 8, 1927

Lordy.

Will let'cha know what I find in the book.

HOW DATA HAPPENED

Came onto this book via a Jill Lepore article in The New Yorker: "The Data Delusion."
…[I]magine that all the world’s knowledge is stored, and organized, in a single vertical Steelcase filing cabinet. Maybe it’s lima-bean green. It’s got four drawers. Each drawer has one of those little paper-card labels, snug in a metal frame, just above the drawer pull. The drawers are labelled, from top to bottom, “Mysteries,” “Facts,” “Numbers,” and “Data.” Mysteries are things only God knows, like what happens when you’re dead. That’s why they’re in the top drawer, closest to Heaven. A long time ago, this drawer used to be crammed full of folders with names like “Why Stars Exist” and “When Life Begins,” but a few centuries ago, during the scientific revolution, a lot of those folders were moved into the next drawer down, “Facts,” which contains files about things humans can prove by way of observation, detection, and experiment. “Numbers,” second from the bottom, holds censuses, polls, tallies, national averages—the measurement of anything that can be counted, ever since the rise of statistics, around the end of the eighteenth century. Near the floor, the drawer marked “Data” holds knowledge that humans can’t know directly but must be extracted by a computer, or even by an artificial intelligence. It used to be empty, but it started filling up about a century ago, and now it’s so jammed full it’s hard to open.

From the outside, these four drawers look alike, but, inside, they follow different logics. The point of collecting mysteries is salvation; you learn about them by way of revelation; they’re associated with mystification and theocracy; and the discipline people use to study them is theology. The point of collecting facts is to find the truth; you learn about them by way of discernment; they’re associated with secularization and liberalism; and the disciplines you use to study them are law, the humanities, and the natural sciences. The point of collecting numbers in the form of statistics—etymologically, numbers gathered by the state—is the power of public governance; you learn about them by measurement; historically, they’re associated with the rise of the administrative state; and the disciplines you use to study them are the social sciences. The point of feeding data into computers is prediction, which is accomplished by way of pattern detection. The age of data is associated with late capitalism, authoritarianism, techno-utopianism, and a discipline known as data science, which has lately been the top of the top hat, the spit shine on the buckled shoe, the whir of the whizziest Tesla…
Read all of it.
 
BTW: "You are smarter than your data." Selected prior KHIT blog riffs.

Stay tuned...
__________
 

Friday, March 24, 2023

"God SAID it; I BELIEVE it; that SETTLES it."

Just finished another excellent read.

Of late, observers of the U.S. political landscape have been commenting more and more on the alarming ways in which Americans of different political persuasions and cultural, racial, and other identity groups seem not just to disagree on issues but also to be living in different realities. One area where this situation has significant consequences is in the way people can interpret reports of scientific consensus differently, depending on their prejudices and allegiances. Different people, for example, may hear about the science on the human causes of climate change and—sincerely—perceive either certainty, uncertainty, or outright hoax. This phenomenon undercuts public discourse on matters where public policy grounded in solid science has never been more essential.

This phenomenon is on a continuum with the way in which different people can look at those living in poverty, and see them either as victims of unfair circumstances or as people who are complicit in a culture of irresponsibility and dependency. Different people will consider a given refugee population, and see either an alien threat to our way of life or deserving potential members of our society. Different people will see a video of a police shooting; some will see justification and others will see murder.

An environment of polarization, prejudice, bias, and willful self-deception, combined with an often misleading political and media environment, is toxic for political discourse. Polarization on matters of fact is affecting progress on matters of critical public importance, such as action on climate.

Research on denial has exploded over just the last few years. This includes game-changing work from social, political, cognitive, and evolutionary psychology, as well as from sociology, communication studies, political science, history, and philosophy. My goal has been to bring this diverse work together for the reader while, I hope, convincing readers of the urgent importance of gaining a better understanding of unconscious bias and self-deception. Denial concerns all of us—both as victims and as perpetrators—and so this work is intended not just for an academic audience; it is for everyone.

Adrian Bardon. The Truth About Denial. Oxford University Press. Kindle Edition. 

The author nailed it. Great scholarly writing style, leavened with numerous LMAO moments. 

File this one under "Deliberation Science." Also, see "The Big Myth." And, "Trust The Plan." Lots to ponder.
The purpose of this book is to examine the pervasive human tendency to deny uncomfortable truths and to discuss how this tendency affects public discourse—as well as private life—on an exceedingly wide range of important topics. The phenomenon of denial, as we shall see, is dependent on motivated cognition. “Motivated cognition” refers to the “unconscious tendency of individuals to process information in a manner that suits some end or goal extrinsic to the formation of accurate beliefs.” Motivated cognition happens behind the scenes, but is closely tied to the more overt rationalization of belief, which I shall define as the process of retroactively inventing defensive justifications for holding those beliefs formed via motivated cognition. Motivated cognition is about belief formation, whereas rationalization is about maintaining and defending beliefs. Rationalization is thus a kind of second stage for motivated cognition. Unlike motivated cognition, explicit rationalization is a conscious process, though we are often not consciously aware of our motives when we engage in it. (I shall use the familiar phrase motivated reasoning—the popular use of which doesn’t generally distinguish between initial motivated cognition and the second-stage rationalization of that way of thinking—to denote the whole process wherein implicit, motivated cognition is followed by the generation of spurious reasons to maintain those sincerely held beliefs formed via motivated cognition.)

Let’s get a little clearer on exactly what “denial” does and does not include, for purposes of this discussion. It does not refer, for instance, simply to being misinformed. I wish to examine denial strictly in that sense of being “in denial” wherein the denier is exhibiting a kind of emotionally self-protective self-deception. (Denial is often misattributed to ignorance; as I shall discuss further, there is good reason to think that the real issue is motivated reasoning.) Denial, in this context, presumes some exposure to relevant—and unwelcome—facts and constitutes a kind of reaction to them. This sort of self-deception is different from mendacity, wherein one purposefully lies to others about the existence of evidence for something, or deliberately misrepresents the evidence. One might know perfectly well, for example, that one’s oil company is responsible for a toxic spill, and respond by actively and consciously engaging in a cover-up and public denial of responsibility.

Neither am I talking about “spin,” or what philosopher Harry Frankfurt has termed bullshit. The bullshitter’s intent is not to lie but, rather, to influence or to create a certain reality, and is simply indifferent as to whether his or her claims are true or false. The job of the trial attorney, the political operative, or the commercial advertiser is neither to uphold the truth nor to lie; rather, the job is to represent one’s client in the best light possible.

Being in denial is also to be distinguished from wishful thinking. What wishful thinking has in common with denial is that each fulfills an emotional need of some kind. However, with wishful thinking, there is a belief without solid evidence for a conclusion one way or the other…
[Adrian Bardon, pp. 4-5].
CONTENTS
 
Preface
Acknowledgments

1. Bias and Belief
1.1 What Is Denial?
1.2 “Hot Cognition”
1.3 Mechanisms of Motivated Reasoning
1.4 The Origins of Denial
1.5 Pathological Ideology, and Denialism as a Social Phenomenon
2. Science Denial
2.1 Climate Science Denial
2.2 Personality, System, and Status Quo
2.3 The Asymmetry Thesis
2.4 Science and Societal Change
3. Pride, Prejudice, and Political Economy
3.1 Political Economics
3.2 Poverty and the Fundamental Attribution Error
3.3 Classism and Racial Stereotyping
3.4 The Liberty Argument
3.5 Asymmetry Again
4. Religion
4.1 Reasons to Believe
4.2 Is It Even a Belief?
4.3 The Origins of Religiosity
4.4 Tenacity
4.5 The Problem(s) with Religion
4.6 The Retreat into Abstraction
Afterword: Directions for Science Communication
A.1 More Information?
A.2 Message Framing and Delivery
Index
 
You might find some of his conclusions a bit counterintuitive at first blush, but his evidence is fairly compelling. Uhhh... "Evidence?" Define "evidence." 
I have by now studied and reviewed dozens of books on the operational sciences, philosophy of science, science communication, etc. on this blog, and while the word “evidence” has shown up thousands of times thus far (I always do searches of keywords and phrases), I have yet to encounter a writer providing a definition of the term. I find that sadly telling. We mostly tend to assume that we all have the same general understanding of key words. Adrian, to his credit, at least points out that, on hot-button issues, partisans tend to deploy opposing ideological interpretations of key definitions. These tangential wafts of “No True Scotsman” fallacy just muck things up.
Stay tuned.
_____

ON DECK


This was reviewed in my latest Science Magazine—"Achieving Cognitive Liberty"
In her latest book, The Battle for Your Brain, neuroethicist and law professor Nita Farahany warns readers that neurotechnology—technology designed to monitor or manipulate the human nervous system—can “either empower or oppress us.” Farahany generously illustrates how such technologies, which range from monitoring tools such as functional magnetic resonance imaging and the electroencephalogram (EEG) to techniques that can alter brain function, such as deep brain stimulation and transcranial magnetic stimulation, are affecting our lives. But The Battle for Your Brain is, above all, a call to expand human rights to include “cognitive liberty,” a right articulated in 2004 by Wrye Sententia as our “autonomy over [our] own brain chemistry”…
 
Farahany ends her analysis by inviting readers to join the debate about the benefits and risks of various transhumanist proposals. These include postmortem brain cryopreservation, the expansion of human senses through brain–computer interfaces, brain-to-brain communication, brain-to-text messaging, and the use of brain implants to inactivate pain and suffering.

In The Battle for Your Brain, Farahany calls for “prudent vigilance and democratic deliberation” regarding the social repercussions of neurotechnology. The book is valuable reading, not only for those interested in neuroscience but also for anyone genuinely concerned about the challenges humanity will face in the near future.

I'm confident that this too will be excellent.


Hmmm… “Electrome,” anyone? See also "Data Driven."
 
UPDATE
Neurotech’s Battles Impact Our Brains’ Future
Mental sovereignty, says author Nita Farahany, is no longer a given
BTW: I finished her book today (Mar 28th). A lot to assimilate. More to come...
________
 

Monday, March 20, 2023

Life at the Cellular Level


Yeah, there's all this wild media speculation, much of it deliberately driven by Donald Trump himself. Indictment in NYC imminent? In Georgia? I guess we'll soon know. I remain skeptical that he'll ever see the inside of a jail cell, notwithstanding any eventual conviction(s) on the numerous serious violations he's accused of.


Whatever. I'm in Las Vegas this week to hang with family and friends. Below, my favorite restaurant (at D.I. and Eastern).

My new iPhone 14 Pro Max optics are killer.
 
VEGAS UPDATE
 
Went to see my pals' band at The Copa Room.
 

We have a long history. Miss these peeps.
 
Back to Baltimore in the morning.
__________
 

Friday, March 17, 2023

Silicon Valley Bank


Sniff... They were Making The World a Better Place.

Let us rewind to a July 2017 KHIT post...

First, from my fav, the insanely over-the-top, (painfully and scatologically) hilarious Silicon Valley HBO.

After watching that entire 4-season series (many episodes numerous times), I still have doubts that I can continue to cover Health IT sector events with a straight face.
______
 
I covered the Healthcare Startup Space for a number of years. Mostly, but not exclusively, Silicon Valley/Bay Area-ish.
 
Yeah, "making the world a better place." For some people, 'eh? 


UPDATE

 CURRENT READING UPDATE


The Nicholas Humphrey book came to me via a New Yorker article. Adrian Bardon's book via a 1-star negative Amazon review of another book covering similar topics. I tweeted about the Humphrey book (I'm about 60% through it at the moment).


Stay tuned...
__________
 

Tuesday, March 14, 2023

“I AM YOUR RETRIBUTION”

Donald Trump, CPAC 2023 Keynote address
 
 @2:32, "I am your retribution."
_____
Peter Wehner, writing in The Atlantic.
Revenge creates a cycle of retaliation. It “keeps wounds green, which otherwise would heal,” in the words of Francis Bacon. Vengeance is insatiable, and in any society, over the long term, it can be deeply damaging. The desire for revenge reduces the capacity for legislators to work together across the aisle. It creates conditions in which demagogues can successfully peddle conspiracy theories and call for a “national divorce.” It leads Americans to see members of their opposing party as traitors. And exacting revenge tempts people to employ immoral and illegal methods—street violence, coups, insurrections—they would not otherwise contemplate. (The defamation lawsuit against Fox News by Dominion Voting Systems revealed that a Fox producer texted Maria Bartiromo, a Fox news anchor, saying, “To be honest, our audience doesn’t want to hear about a peaceful transition.”)…

Human emotions can be dominant and even determinative in distorting and deforming people’s judgments. Individuals who honestly believe that the Bible is authoritative in their lives—who insist that they cherish Jesus’s teachings from the Sermon on the Mount (blessed are the meek, the merciful, the peacemakers, and the pure in heart; turn the other cheek; love your enemies) and Paul’s admonition to put away anger, wrath, slander, and malice and replace them with compassionate hearts, kindness, humility, meekness, patience, a spirit of forgiveness, and, above all, love, “which binds everything together in perfect harmony”—find themselves embracing political figures and a political ethic that are antithetical to these precepts. Many of those who claim in good faith that their Christian conscience required them to get passionately involved in politics have, upon doing so, discredited their Christian witness. Jesus has become a “hood ornament,” in the words of the theologian Russell Moore, in this case placed atop tribal and “culture war” politics…

The antidote to the politics of retribution is the politics of forbearance. Forbearance is something of a neglected virtue; it is generally understood to mean patience and endurance, a willingness to show mercy and tolerance, making allowances for the faults of others, even forgiving those who offend you. Forbearance doesn’t mean avoiding or artificially minimizing disagreements; it means dealing with them with integrity and a measure of grace, free of vituperation.

None of us can perfectly personify forbearance, but all of us can do a little better, reflect a bit more on what kind of human beings and citizens we want to be, and take small steps toward greater integrity. We can ask ourselves: What, in this moment, is most needed from me and those in my political community, and perhaps even my faith community? Do we need more retribution and vengeance in our politics, or more reconciliation, greater understanding, and more fidelity to truth?

The greatest embodiment of the politics of forbearance was Abraham Lincoln. With a Civil War looming, he was still able to say, in his first inaugural address, “We are not enemies, but friends. We must not be enemies. Though passion may have been strained, it must not break our bonds of affection.”

Those bonds were broken; the war came. By the time it ended, more than 700,000 lives had been lost in a nation of 31 million. But the war was necessary; Lincoln preserved the Union and freed enslaved people. And somehow, through the entire ordeal, Lincoln was free of malice. He never allowed his heart to be corroded by enmity or detestation…
Jesus has become a “hood ornament,” in the words of the theologian Russell Moore, in this case placed atop tribal and “culture war” politics.
 
Indeed.
apropos,
MY LATEST READ
 
Yeah. Again, though, "Why Do Humans Reason?"
 
To "win" the argument. Should truth happen along the way, so much the better. All goes to my recurrent "Deliberation Science" riffs.
 
__________
 

Wednesday, March 8, 2023

"We Are Electric"

My newest read.
 
Most afternoons, my wife and I ("Meee-mo & Pop") drive down to Fells Point in Baltimore to pick up Calvin, our 3-yr old grandson, at his pre-school. (Below, Calvin singing "Happy Birthday" to me on his dad's computer last month.)
 
 
We typically have BBC World Service on via our NPR station. Today, on the way back to the house, I switched over to "Fresh Air, with Terry Gross."
 
Terry's guest interviewee was Sally Adee, discussing her new book We Are Electric.
Writer Sally Adee says scientists are looking into ways to manipulate the body's natural electrical fields to try and treat wounds, depression, paralysis, and cancer.
Bought & downloaded it straight away. Will start reading after Nick comes to pick Calvin up when he gets off work.
Amazon blurb:
Science journalist Sally Adee breaks open the field of bioelectricity—the electric currents that run through our bodies and every living thing—its misunderstood history, and why new discoveries will lead to new ways around antibiotic resistance, cleared arteries, and new ways to combat cancer.

You may be familiar with the idea of our body's biome: the bacterial fauna that populate our gut and can so profoundly affect our health. In We Are Electric we cross into new scientific understanding: discovering your body's electrome.

Every cell in our bodies—bones, skin, nerves, muscle—has a voltage, like a tiny battery. It is the reason our brain can send signals to the rest of our body, how we develop in the womb, and why our body knows to heal itself from injury. When bioelectricity goes awry, illness, deformity, and cancer can result. But if we can control or correct this bioelectricity, the implications for our health are remarkable: an undo switch for cancer that could flip malignant cells back into healthy ones; the ability to regenerate cells, organs, even limbs; to slow aging and so much more. The next scientific frontier might be decrypting the bioelectric code, much the way we did the genetic code.

Yet the field is still emerging from two centuries of skepticism and entanglement with medical quackery, all stemming from an 18th-century scientific war about the nature of electricity between Luigi Galvani (father of bioelectricity, famous for shocking frogs) and Alessandro Volta (inventor of the battery).

In We Are Electric, award-winning science writer Sally Adee takes readers through the thrilling history of bioelectricity and into the future: from the Victorian medical charlatans claiming to use electricity to cure everything from paralysis to diarrhea, to the advances helped along by the giant axons of squids, and finally to the brain implants and electric drugs that await us—and the moral implications therein.

The bioelectric revolution starts here.
Stay tuned. All sounds fascinating.

Interview transcript at the Fresh Air page.

UPDATE
 
I'm well into the book. Lovely writing style. NY Times book review here. To wit:
"The electrome is an entity — with its ion-driven microvoltages all now measurable — that is, quite literally, immaterial. The divinely-minded will be tempted to conceptualize the electrome as the human soul. But Adee has no truck with such fancies. Soul or not, though bioelectricity weighs nothing it can do fantastic things. Adee knows; she has read for our benefit what seems like the entire history of bodily battery power — especially the delicious 18th-century tussle between the Signori Volta and Galvani, in the matter of the twitching of frogs’ legs. She has also slogged through all the later research papers on electricity-related cellular biology. And all of this eventually led her into the long grass of some mightily weird modern research."
Yep. Pass the cognitive popcorn.

ECT

This in particular caught my eye in the above NY Times review (his opening paragraph):


My late mother was seriously clinically bipolar across her entire life (it ran in her family). Prone to recurrent incapacitating "nervous breakdowns," she was ECT'd repeatedly. Those tx just made her angrier and angrier and angrier.

SATURDAY UPDATE: CUTTING TO THE CHASE

Finished the book. I found it riveting. The author concludes:
...The dream is that all these tools and the insights they unlock will usher in a future of interfaces that work with—and possibly improve—biology on its own terms.

For the past half century, that honor has gone to machines and engineers who promise a future of all-knowing artificial intelligence, cyborg upgrades to what some have taken to dismissing as our inferior “meat bodies,” and a transhumanist deep future in which all biological matter has upgraded to silicon. But recently, the shine has begun to come off AI as we realize just how limited silicon intelligence really is. Existing materials can’t even manage hip implants that last longer than ten years—so how are we meant to have a permanent telepathic neural device attached to our brain? The research now underway in bioelectricity suggests that, rather than grasping for silicon and electron replacements to biology, the answers to an upgraded future might lie in biology itself.

Many of the early pioneers of bioelectricity have been redeemed after being initially ignored or derided. This is true not only of Galvani but also of Harold Saxton Burr, whose predictions about cancer and development have been validated with time, just as Galvani was right about the spark of life. Burr’s individual ideas seem to have been broadly right—but in his book published in 1974, he also tied these experiments into a bigger hypothesis. He posited that the day biology investigates forces instead of only studying particles, it will undergo a conceptual leap to rival the importance of splitting the atom for physics.

But there’s a final question. What then?

When we learned about the microbiome, we learned that we could improve it by eating kimchi and lots of greens. Learning about the electrome isn’t going to yield similar self-help just yet.

Hacking our memories or overclocking ourselves into infinite productivity is still quite a way off, and I hope my book has explained sufficiently why that’s the case. And I hope I’ve convinced you that it’s actually not the right approach anyway…

A lot of ink has been spilled asking where you draw the line between medical intervention and cosmetic enhancement. People raise this question all the time about all kinds of cognitive (and other cosmetic) enhancement, but no one ever seems to come up with a good answer. That’s probably because it’s actually a question that gets more unsettling the closer you peer at it. Because of course the more people adopt any particular enhancement, the more pressure they will exert on the people around them—and on themselves!—to keep up lest they get left behind, and the more unaugmented normalcy is transformed, by process of inertia, into a deficiency. The blame isn’t on any one individual—this is a classic tragedy of the commons.

Particularly in sport, this conversation has been very germane. Discussing tDCS in sport with Outside magazine, Thomas Murray, president emeritus of the Hastings Centre bioethics research institute, told the reporter Alex Hutchinson that “once an effective technology gets adopted in a sport, it becomes tyrannical. You have to use it.” Hutchinson made the dire and completely correct observation that “if the pros start brain-zapping, don’t kid yourself that it won’t trickle down to college, high school, and even the weekend warriors.” Once you start this game, you can never stop.

So my final exhortation for you, the person who has made it all the way to the end of my book, is this: when you see someone trying to sell you this stuff, ask who will benefit. Why is someone trying to sell it to you? Is it really for you? Beyond the basic “were the trials any good?” skepticism, ask what will happen next. Is this something that will alleviate your suffering? Or will it just kick the can down the road because eventually your new normal will become the new substandard, making way for the next piece of enhancing kit? The answer to that question is very different if the intervention is a treatment for cancer versus a way to be a better hustle goblin at your workplace.

In fact, I would love to take this whole idea of the body as an inferior meat puppet to be augmented with metal and absolutely launch it into the sun. Cybernetics keeps dangling the seductive illusion that we can ascend beyond the grubby world of human biology in our cyborg future—cajoled into correct action and good health (and maximum productivity, of course) by the electrical takeover of a couple of relevant nerve terminals.

The study of the electrome shouldn’t serve these masters. Doing the research that led to this book turned my head exactly 180 degrees from this view. Rather than being a collection of inferior meat bodies, biology becomes more astounding the more you learn about it—and fractally complicated too, as the more you learn, the more you realize you don’t understand. We are electrical machines whose full dimensions we have not even yet dreamed of.

But as is evident from the MIT program, academia is waking up to the interconnectedness, and different fields are starting to talk to each other more to explore this electric future. That is where we will start to see the next great steps in bioelectricity.

The real excitement of the field hews closer to the excitement around cosmology—a better understanding of our place in the universe and in nature. Already, some of the findings are upending some conventional wisdom. I honestly can’t wait to see what else we find in the next decade.
[We Are Electric, pp. 292-296]
One helluva writer and science historian.

Interesting callout from the above.


My niece April’s husband, Jeff Nyquist (Neuropsychology PhD, Vanderbilt), is the founder and chief science officer of neurotrainer.com.


I have alerted him to Sally Adee’s book, and would really like to talk over some of this stuff with him.

Below: Cool Radiolab interview with Sally.

 
UPSHOT IMPLICATIONS
 
 
From my current read. Below, apropos of some quick "electrome science" Google digging:

Introduction
In addition to their 3 dimensions in space (x, y, z axes) and their time dimension, living systems also have a self-generated “electrical dimension” and an “immaterial” one that is inherent to their ability to handle (immaterial) information. When dealing with this immaterial dimension of “Life” the term “soul” is widely used, in particular with respect to the transition from “still alive” to “just dead” in humans, and when facing the intriguing question whether or not “the soul” persists one way or another after death. With respect to “consciousness,” a most intriguing key aspect of Life’s immaterial dimension, the consensus seems to be growing that it is a ubiquitous property of all living systems, from bacteria to animals, plants and fungi. But consensus is missing on whether the immaterial dimension of Life ultimately has a biophysical basis or not.

Today, the “material sciences” and the “non-material ones” as they are sometimes referred to, hold very different opinions on these challenging questions. Some protagonists of the latter even claim that “Biology is beyond physical sciences.” Such a statement meets with great skepticism and even blunt rejection among experimentalists. Caetano-Anolles states that such opinions should not be published in scientific journals. But rejection of confrontation does not advance insight and rapprochement.

For both the exact sciences and the humanities the key problem is: “Can something that is commonly thought to be truly immaterial (incorporeal and immaterial in the definition of “Soul” in the Encyclopedia Britannica), thus that has no mass in the meaning of classical physics, exist?” In a first reaction students of the exact sciences may be inclined to say: “No, impossible.” This is reflected in the scientific literature: one will not encounter “soul” with the cited meaning in textbooks of western biology or medicine unless when figuratively speaking. In psychology which, in origin, was the study of the “psyche” or “soul,” not “soul” but attitude and behavior represent its essence.

If one searches the vast literature on e.g. consciousness, soul, thoughts etc. it soon becomes apparent that many authors give the impression to be either unaware of, or at least do not attribute any importance to the well documented fact that all cells have an immaterial self-generated electrical dimension, and that it is exactly this electrical activity that disappears at death along with thoughts and consciousness. Concurrently, one also gets confronted with the fact that the exact sciences seem to be reluctant to forward possible ideas for a biophysical underpinning of Life’s immaterial dimension. Is there no common ground between the exact sciences and the humanities, or could it be that we have all overlooked the possibility that what we call “soul” is simply an aspect of this largely undervalued self-generated cellular electrical activity?

The analysis of what exactly changes at the very moment a living system passes from “still alive” to “just not alive anymore” may help to (partially) define the immaterial dimension of Life with a vocabulary from the exact sciences. It may also help to answer the question whether or not only humans but all living systems/communicating compartments, bacteria inclusive have “a similar immaterial dimension, electrical in nature.”

In this paper I will try to demystify the idea that understanding the immaterial dimension of Life is beyond the capacity of the human brain, and that therefore one should not even spend efforts in lifting the veil. Some of its aspects are normal properties of information handling in sender-receiver systems that need self-generated electric/ionic currents for their functioning. Yet, I fully agree that the full picture, namely the full understanding of what thoughts and consciousness etc. are will not be possible earlier than the biophysical and biochemical principles governing the cognitive memory system will have been uncovered. In all honesty, I do not expect this to happen in the near future.

An unambiguous definition of “Life” as the conceptual framework for “body and mind”
Living versus non-living or inanimate matter: Widely used but scientifically wrong terminology


This dichotomous wording dates from long before the chemical nature of “matter” became known. It is so deeply rooted in all languages, and it sounds so logical that everybody assumes that nothing is wrong with it. But at the present time we know that classical matter = atoms, and that the atoms present in living matter are not different from their counterparts in non-living matter. In other words, the atoms in living matter do not have a special feature as to make them “animated.” The difference living – non-living is not situated in the atoms themselves but in the way the entities in which they end up “behave.” When combined into entities organized as sender-receiver compartments, the combinations of atoms can engage, under proper conditions, in communication and problem-solving activities. Combinations of atoms that do not form sender-receiver entities cannot by themselves engage in communication activities (see later). In short: what is classically called “living matter” can communicate, while “non-living matter” never can on its own. Atoms alone do not make the difference, but the activities to which they are instrumental do. For pragmatic reasons, I accept that the living - non-living wording continues to be used for some time…
19 pg. Open Access paper (pdf).
 
Lots to ponder. Hmmm... David Eagleman, anyone?

For the moment, I’m having a bit of semantic whiplash difficulty with the foregoing title phrase “the biophysical essence of the immaterial dimension of Life.”
__________
 

Tuesday, March 7, 2023

Mythologies update, cryptocurrency edition II

CoffeeZilla rocks!

Recall this crap? And, this? And, this?
 
Celsius CEO Alex Mashinsky at 2:40 in the above video:
“OK, how many people want to earn more money on their money—yield?” And we said “eight billion people!” “Of course I want to earn more money on my money.”
Eight billion? Roughly the current human world population—35-40% of whom have no access to the internet at all. Of the remainder with "access," much of that population lives within autocratic states where internet connectivity and transactions are highly surveilled and tightly controlled. The notion of worldwide unfettered (yet somehow "secure") private digital currency commerce is fatuous on its face.
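The implausibility is easy to quantify. A minimal sketch, using only the figures in the paragraph above (the 35–40% offline share is this post's estimate, not an official statistic), of the gap between Mashinsky's "eight billion" and any realistic addressable population:

```python
# Sanity-check the "eight billion people" claim against the post's own
# figures: 35-40% of the world has no internet access at all.
WORLD_POP = 8.0e9
OFFLINE_LOW, OFFLINE_HIGH = 0.35, 0.40

online_high = WORLD_POP * (1 - OFFLINE_LOW)   # most generous ceiling
online_low = WORLD_POP * (1 - OFFLINE_HIGH)   # least generous ceiling

print(f"Online at all: {online_low/1e9:.1f}-{online_high/1e9:.1f} billion")
print(f"Shortfall vs. claim: at least {(WORLD_POP - online_high)/1e9:.1f} billion people")
```

And that ceiling of roughly five billion is before subtracting the populations living under surveilled, tightly controlled internet regimes.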

More “Big Myth” stuff.
_________

Wednesday, March 1, 2023

The history of the Myth America Pageant

 
When I first opened this book after downloading it, Kindle estimated my reading time at "18 hrs 59 minutes." 824 pages of analytic historical whup-ass, elegantly written, consistently coherent.
 
And, the authors, while forthright regarding their points of view, are consistently charitable with respect to myriad contrary doctrinal assertions (while also calling BS where logic, facts, and evidence warrant doing so).
 
Gonna be a while reviewing this one. Stay tuned. 
 
UPDATE: from a summary by George Hosea:
The Big Myth: How American Business Taught Us to Loathe Government and Love the Free Market, by Naomi Oreskes and Erik M. Conway, explores the historical and cultural roots of the popular belief in the United States that government is the enemy of free enterprise, and that the free market is the solution to all social and economic problems. The authors contend that this notion is not the natural product of a free society but the result of a purposeful effort by commercial interests to change public opinion and legislation in their favor. [It] traces the origins of this effort to the 1930s, when business executives were alarmed at the advent of the New Deal and the rising influence of labor unions. They worried that government interference in the economy would restrict their earnings and their influence over the workers.

To address this danger, industry leaders made a systematic effort to promote the advantages of the free market and demonize government involvement as an infringement on individual liberty. They supported think tanks, media outlets, and political campaigns that promoted this message and worked to sway public opinion. Throughout time, this effort proved so effective that it has become a deeply ingrained element of American society and politics.

Oreskes and Conway claim that this conviction in the free market and skepticism of government has had severe effects on American society, including increased inequality, a weakening of social safety nets, and a degradation of public goods like infrastructure and education. They argue that it is necessary to review these assumptions and to propose alternative forms of economic organization that promote social welfare and the common good.

To accomplish this, the authors believe that it is vital to confront conventional beliefs about the free market and government, observing that the free market is not a natural, self-regulating system but rather a human creation that needs government intervention to work efficiently. Oreskes and Conway also note that government is not an outside force that limits individual liberty, but is rather a reflection of the collective will of the people, and that its job is to guarantee that the advantages of economic activity are shared equitably.

The authors underline the necessity of understanding the historical environment in which these ideas took root, contending that the contemporary mistrust of government is not an inherent product of human nature but rather the result of particular historical events and political campaigns…

The Big Myth is a thought-provoking and informative exploration of the causes and effects of the overwhelming belief in the free market and skepticism of government in the United States ... Oreskes and Conway trace the roots of contemporary confidence in the free market back to the Gilded Age of the late 19th century, when corporate executives exploited their money and power to influence public opinion and legislation in their favor. [The campaign was] effective in part because it was able to co-opt the vocabulary of populism and position economic concerns as identical to the interests of ordinary people…
Good enough to get us underway. I cleaned up a few grammatical items, but his overall take squares with the basic issues presented in the book.
__________
 
For openers, there is no such thing as "The Free Market."
If there were only one man in the world, he would have a lot of problems, but none of them would be legal ones. Add a second inhabitant, and we have the possibility of conflict. Both of us try to pick the same apple from the same branch. I track the deer I wounded only to find that you have killed it, butchered it, and are in the process of cooking and eating it.

The obvious solution is violence. It is not a very good solution; if we employ it, our little world may shrink back down to one person, or perhaps none. A better solution, one that all known human societies have found, is a system of legal rules explicit or implicit, some reasonably peaceful way of determining, when desires conflict, who gets to do what and what happens if he doesn’t.

The legal rules that we are most familiar with are laws created by legislatures and enforced by courts and police. But even in our society much of the law is the creation not of legislatures but of judges, embedded in past precedents that determine how future cases will be decided; much enforcement of law is by private parties such as tort victims and their lawyers rather than by police; and substantial bodies of legal rules take the form not of laws, but of private norms, privately enforced.


Friedman, David D. Law's Order: What Economics Has to Do with Law and Why It Matters (Princeton University Press), p. 3. Kindle Edition.

"Private enterprise," "private markets," yeah, of course. But human transactional affairs get "governed" one way or another. Spare me the conflation. Moreover,

A sweeping historical read.

I am now 77, a first-wave Baby Boomer born in Feb. 1946. My parents were Depression-era kids. My Dad and all four of his brothers served in WWII, as did my mother's adult brothers. The bulk of this book spans that period (Pop was born in 1916). I am quite familiar with the socio-political-economic and cultural trends of the era.

My personal awareness of the broader world began roughly with the Presidential tenure of Dwight Eisenhower (I recall the "I Like Ike" buttons). My parents were hardcore (albeit relatively passive) Goldwater Republicans. Patriotic "conservatives" with a fairly reflexive disdain for anything smacking of "socialism" or “worse.”

In light of our recent "MAGA" looniness, I irascibly posted this on Twitter last year:
 

It's easy to casually conclude this extreme "free market GOOD / government BAD" polarization is of relatively recent vintage, but the heated all-or-nothing rhetoric goes back a long time (even if things are further exacerbated by current-day digital social media). The Big Myth lays the long history out in great detail.

It's a history larded with cynicism, ill will, bad faith, and a cornucopia of logical/rhetorical fallacies: Magical Thinking, False Dichotomies, Appeal to Authority/Tradition, the Slippery Slope, Perfectionism Fallacy, Begging the Question, ad hominem attacks, correlation/causation conflation, and so on. The ever-handy toolkit of the propagandist.
 
The Rogues' Gallery: von Mises, Hayek, Ayn Rand, Milton Friedman (with his egregious mischaracterizations of Adam Smith), Goldwater, Robert Bork, Ronald Reagan, and a supporting cast of lesser "free market uber alles" ideologues (along with their partisan "think tanks"), all given a thorough critical evaluation here. So too is the "sacred, indivisible 3-legged stool" of The Free Market, Liberty, and Democracy (the latter in particular grossly mischaracterized), posited by neoliberals as the ONLY sociopolitical edifice holding off the Merciless Totalitarian Hordes (and, oh yeah, before I forget, we now gotta toss white Christian Nationalism into the Sacred Indivisible Liturgy).
...A key part of the manufacturers’ propaganda campaign was the myth of the Tripod of Freedom, the claim that America was founded on three basic, interdependent principles: representative democracy, political freedom, and free enterprise. This was a fabricated claim. Free enterprise appears in neither the Declaration of Independence nor the Constitution, and the nineteenth-century American economy was laced with government involvement in the marketplace. But NAM spent millions to convince the American people of the truth of the Tripod of Freedom, and to persuade Americans that the villain in the story of the Great Depression was not “Big Business,” but “Big Government.” [Oreskes & Conway, The Big Myth, p. 16].
Nothing much has materially changed.
 
CUTTING TO THE CHASE
CHAPTER 15
The High Cost of the “Free” Market


…In 2019, in a meeting that now seems clairvoyant, experts at the Center for Health Security at Johns Hopkins University addressed “preparedness for a high-impact respiratory pathogen pandemic.” Among their recommendations: countries should improve their core public health competencies; draw up national action plans, with strategies to make decisions quickly when needed and prepare for supply interruptions; and develop the capacity for “surge manufacturing in crisis.” Obviously, their advice was ignored.

Over the past thirty years, scientists’ counsel on a wide range of issues—from pandemic preparedness to climate change—has been widely discounted and sometimes rejected outright. A major reason is the influence of the thinking that insists on limiting the power and reach of the federal government and relying on markets to solve our problems. Most damagingly, the market-oriented framework of recent decades has resisted any facts—scientific, historical, sociological, or otherwise—that suggest a need for a strong, centralized, or otherwise coordinated governmental response.

In some countries, concentrated central power may be a threat to liberty, but the United States is not one of them, in part because the country was set up with that concern in mind. The conservative preoccupation with constraining government power has left us with a federal government too weak and too divided to handle big problems like Covid-19 and climate change. Even as the pandemic raged, millions of Americans refused to get vaccinated in large part because of distrust of “the government,” and the lion’s share of those Americans were political conservatives.

The steps necessary to avoid the worst effects of an emergent disease—stockpiling supplies, educating people about hand-washing and social distancing, developing accurate tests and implementing them equitably, and sustaining the research infrastructure that can kick in to develop a vaccine—are not readily undertaken by the private sector. There’s not much of a business case for stockpiling a billion face masks. Nor can we rely on the private sector to step up when a new virus emerges, because by then it is too late. The “just in time” supply model that dominates in business is efficient for many purposes, but it does not work in the face of a pandemic.

For any problem that has a scientific, medical, or technological component, the challenge is not simply to mobilize resources when they are needed, but to have them ready in advance. It takes a year or more to build a laboratory; it takes a decade to train a cadre of scientists and engineers. We could no more muster on demand the needed expertise and infrastructure to fight a pandemic than we could suddenly raise a professional military, replete with aircraft carriers and their air wings, within weeks of an attack. Nearly all conservatives acknowledge the need for military preparedness, yet they have been loath to allow that government is needed to address a wide range of problems—and not just scientific ones—that markets can’t or won’t solve on their own…
[The Big Myth, pp. 539-541]
In 2019, in a meeting that now seems clairvoyant, experts at the Center for Health Security at Johns Hopkins University addressed “preparedness for a high-impact respiratory pathogen pandemic.”

Yeah. From my blog in April 2020.

Box 6: What Would the 1918 Influenza Pandemic Look Like Today?
In the worst pandemic in recorded history, the 1918 influenza pandemic, the novel virus infected approximately one-third of the global population over a period of 2 years, ultimately leading to 50 to 100 million deaths world-wide. One might imagine that the death rate would be lower today due to the advent of modern medical equipment and procedures that did not exist in 1918, but the global population is now approximately 4 times greater than in 1918. This growth, however, is disproportionately higher in low- and middle-income countries, often the ones with developing health systems. Many—predominantly in Africa, Southeast Asia, Latin America, and the Middle East—have experienced population growths of 1,000% or more since 1918. Crowded urban areas provide prime conditions for the spread of respiratory diseases, and urbanization is increasing globally, including the emergence of 47 “mega-cities” (populations over 10 million). By comparison, London was the world’s largest city in 1918, at approximately 5 million people. Additionally, global travel has increased by orders of magnitude compared to 1918. Even then, shipping and population movement (including World War I) played an important role in global spread of the disease, but today, humans can fly anywhere in the world in less than 1 incubation period, meaning that global transmission can be expected to be even faster.

In 1918, the global case fatality ratio is estimated to have been 2.5%, but it was considerably greater in low- and middle-income countries, with some estimates exceeding 10%. Today, some high-income countries would be expected to fare much better because of modern health care, but the case fatality in countries with limited access to healthcare could be as bad as or worse than 1918. Simple arithmetic would suggest the possibility of 100 to 400 million deaths if a 1918-like pandemic were to occur today, but unprepared or under-resourced health systems could further exacerbate disease transmission through nosocomial spread and an inability to promptly diagnose and render care, a particular concern for developing health systems. During the 2003 SARS epidemic, 72% and 55% of presumed and confirmed cases in Toronto and Taiwan, respectively, occurred as a result of healthcare transmission. A similar nosocomial outbreak in which healthcare facilities became amplifiers of the epidemic, this time of MERS, happened in South Korea in 2015.
[Page 47 of 83]
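The report's "simple arithmetic" is worth making explicit. A back-of-envelope sketch using only the figures in the quoted passage (the scaling logic is my reading of that arithmetic, not the report's model), it lands in the same hundreds-of-millions ballpark as the quoted 100-to-400-million figure:

```python
# Back-of-envelope: a 1918-scale pandemic at today's population.
# All inputs are from the quoted Johns Hopkins passage.
POP_TODAY = 7.8e9        # ~4x the 1918 world population
ATTACK_RATE = 1 / 3      # 1918: ~one-third of the world infected
CFR_LOW, CFR_HIGH = 0.025, 0.10  # 1918 global CFR ~2.5%; >10% in some regions

infected = POP_TODAY * ATTACK_RATE
deaths_low = infected * CFR_LOW
deaths_high = infected * CFR_HIGH

print(f"Infected: {infected/1e9:.1f} billion")
print(f"Deaths: {deaths_low/1e6:.0f}-{deaths_high/1e6:.0f} million")
```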

Yes, “obviously, their advice was ignored.”

Hat tip to Dr. Naomi Oreskes.
 
SOCIOECONOMIC DRAMA: FROM A 2012 POST OF MINE
 
I think this tangentially fits the Big Myth topic.
 

There is a long-established, conceptually simple, cogent cognitive psych model of dysfunctional interpersonal behavior known as "Script Theory." It emanated decades ago from Eric Berne's work on "Transactional Analysis." Within this model is the Rescuer-Victim-Persecutor concept:
  • I, the benevolent intervenor, arrogate to myself the right to and imperative of "rescuing" you, the "Victim" from your plight(s);
  • You, the "Victim," irritated by (or just apathetic toward) my putatively altruistic and unsolicited ministrations, react with insufficient gratitude and attitude/behavior change;
  • which then give me the right to demonize and persecute you.
Think about it.

The supportive psych literature is rather voluminous.


I pretty much buy it. Occam's Razor simplicity and all that. Think about the drama you repeatedly witness (or participate in) within your family and social circles. 


Shorter Claude Steiner: to the extent that you live a "scripted" life, you are not free.
 
1971
Of late I can't help but observe that our society in the aggregate is now in the "Persecutor" phase of socioeconomic dynamics, in the wake of our recent disappointments. Ironic, given all the current hyperbolic talk about "freedom."

It's probably cyclical. We tend to oscillate between maxima and minima of concerns over "social justice."


Unless you've been off incommunicado in a cave of late, you've seen it.

  • Presidential Candidate Ron Paul gets loud, angry cheers during a GOP primary "debate" wherein he summarily shrugs off the moral implications of allowing the destitute to die at the ER curbside (and, he's a physician, no less).
  • The fatuous writings of the late Ayn Rand (raging against "Moochers" and "Looters") have risen to new popularity.
  • A nationally known AM radio host loudly demeans a woman as a "slut" and a "prostitute" because of her advocacy for contraceptive rights.
  • The long-term jobless are described as "lazy." It is argued, among other things, that they be subjected to drug testing as a condition of eligibility for unemployment compensation. Ron Paul's senator son Rand claims in late 2013 that extended unemployment insurance is a "disservice" to the jobless.
I could do a very long bullet list.

But, you get the idea. The "failures of liberalism" give us convenient license to blame the unfortunate, poor, and inept.

The adversity POV:
Ich, Du, Sie (I, You, He)

  • I innocently suffered a misfortune.
  • You should have done more to avoid calamity.
  • He is a parasite, a Moocher.
We tend to attribute our successes to acumen and initiative, while disclaiming responsibility for our misfortunes, which are proffered to be the result of bad luck or the machinations of more powerful adversarial others.
 
'Eh?
 
BTW, you might enjoy my 2008 recounting of my 5 year stint in subprime risk management. See "Tranche Warfare."

 
 
ALSO—"CRYPTO," ANYONE?
 
Gresham's Dynamic on steroids.
 
RECAPPING

__________