"More Disruption Please" - athenahealth CEO Jonathan Bush
Happy New Year. I guess.
In 2016 we certainly had domestic political disruption the likes of which I've never before seen across my 70 years. I remain mentally enervated in the wake of the fractious, garishly low-road 2016 political campaign, and rather aghast (though hardly surprised) at the outcome. Newly empowered hard-right Republicans, under the Presidency of the garrulous, shamelessly contradictory Donald J. Trump (and his even more radical, nihilist "alt-right" supporters), are positively champing at the bit to begin breaking things -- uh, disrupting the status quo across the breadth of domestic and international policy fronts. From "repealing ObamaCare" to more zealous (and collateral-damage-indifferent) prosecution of the "war on terror" (and ramped-up aggressive militarization more broadly) to mass deportations to blunt-axe, broad-brush regulatory and civil rights rollbacks to drug policy to trade policy to privatization attempts spanning a gamut of arenas inclusive of Medicare, Medicaid, the VA, Social Security, and education, to say "everything's on the table" is to put it mildly. We're gonna need a bigger table.
Then again, maybe most of these concerns will prove largely unfounded, given the formidable inertial weight of incumbent politics -- i.e., "the Swamp" that is unlikely to be drained notwithstanding the Trump campaign rhetoric.
My wife and I will be paying particular attention to GOP attempts to eviscerate Medicare and Social Security. She retires in June, and we will thereafter be dependent on Medicare (plus our remaining HSA stash) for our health coverage, and on our Social Security benefits and our IRAs for income. Feeling just a bit of anxiety over being at the mercy of the disruptive political winds going forward.
BOBBYG'S NEVER-ENDING BOOK REVIEWS
My latest read, on -- well -- "disruption." Read this Friday on BART and between loads of doggie laundry amid my weekly Muttville.org volunteer shift.
This book examines the ever more rapidly changing nexus between digitally empowered citizens and their governments, both here in the U.S. and around the planet.
Disruption has become one of Silicon Valley’s most popular, if cloying, buzzwords. One is hard pressed to find a startup that does not describe itself as a disruptive technology, or a company founder who is reluctant to take on the establishment. The concept has also come to stand for a form of libertarianism deeply rooted in the technology sector, a sweeping ideology that goes well beyond the precept that technology can engage social problems to the belief that free market technology-entrepreneurialism should be left unhindered by the state...
...[W]ell-established companies are ahead in developing new technologies that meet the needs of established customers, but they cannot see beyond the worldview that made them successful. This blind spot allows new companies to innovate on the margins. Disruptive technologies first find a niche audience, and once their value is proven, they widen their market, taking down the establishment. In short, hierarchical institutions with entrenched practices, interests, and consumers are bad at anticipating and catering to new markets and are therefore vulnerable to nimble innovators...
Government has all the burdens of established corporations: institutionalized structures and norms that lead to lethargy, waste, inefficiency, and a lack of innovation. But their purpose is different from that of corporations, which have a mission to maximize value for their shareholders. In the capitalist model, we hope that the collective impact of the private sector benefits everyone to some extent. In the public sector, however, the very mandate is to serve everyone. Disruption theory explains the failure of institutions to innovate and their risks of collapse, not the social consequence of that failure. The Kodak workers who lost jobs, or towns where the steel mills closed, are not the core focus of business theory. And herein lies the problem for the state.
Disruptive innovation—from Anonymous, to cryptocurrencies like Bitcoin, to grassroots mapping of natural disasters—is challenging many core functions of the international system, functions once controlled by states and international institutions.
For now, the challenge posed by disruptive innovation does not mean the end of the state, but it does suggest that the state is in decline, exposing laws, ethics, norms of behavior, and hierarchical structures that emerged amid an older set of technologies as constraints. Put another way, the state is losing its status as the pre-eminent mechanism for collective action. Where it used to be that the state had a virtual monopoly on the ability to shape the behavior of large numbers of people, this is no longer the case. Enabled by digital technology, disruptive innovators are now able to influence the behavior of large numbers of people without many of the societal constraints that have developed around state action. These constraints, which disruption theory treats as weaknesses, have historically been strengths of democratic societies: They hold government accountable and ensure that it operates within the rule of law and within the bounds of prevailing moral and ethical norms. There are of course varying degrees of success within this framework, but the idea of collective representation via institutional governance is what has separated modern democratic societies from anarchy...
Owen, Taylor (2015-03-02). Disruptive Power: The Crisis of the State in the Digital Age (Oxford Studies in Digital Politics) (pp. 6-10). Oxford University Press. Kindle Edition.
The hapless Theranos, anyone?
I found Disruptive Power a compelling read. Finished it in one day. Chapter 8 is particularly interesting. Recall that controversial "libertarian" digital investor, billionaire, and Donald Trump supporter Peter Thiel (co-founder of the "big data" company Palantir) is now a close advisor to the incoming President (he was present at the recent Trump Tower "digital summit" attended by most of the top Silicon Valley CEOs).
The Violence of Algorithms

Chapter 8 goes on to list myriad activities of private corporate digital initiatives in which companies are essentially "working both sides of the street": at once enabling innovative "digital democracy" efforts and quietly lending their cutting-edge technical capabilities to numerous states -- many of them hostile to U.S. (and private citizens') interests.
In December 2010, I attended a training session in Tysons Corner, Virginia, just outside Washington, DC, for an intelligence analytics software program called Palantir. Co-founded by Peter Thiel, a libertarian Silicon Valley billionaire from PayPal and Facebook, Palantir is a slick tool kit of data visualization and analytics used by the NSA, FBI, CIA, and other US national security and policing institutions. As far as I could tell, I was the only civilian in the course, which I took to explore Palantir’s potential for use in academic research.
Palantir is designed to pull together as much data as possible, then tag it and try to make sense of it. For example, all of the data about a military area of operation, including base maps, daily intelligence reports, mission reports, and the massive amounts of surveillance data now being collected could be viewed and analyzed for patterns in one platform. The vision being sold is one of total comprehension, of making sense of a messy operating environment flooded with data. The company has a Silicon Valley mentality: War is hell. Palantir can cut through the fog.
The Palantir trainer took us through a demonstration “investigation.” Each trainee got a workstation with two screens and various datasets: a list of known insurgents, daily intelligence reports, satellite surveillance data, and detailed city maps. We uploaded these into Palantir, one by one, and each new dataset showed us a new analytic capability of the program. With more data came greater clarity— which is not what usually happens when an analyst is presented with vast streams of data.
In our final exercise, we added information about the itinerary of a suspected insurgent, and Palantir correlated the location and time of one meeting with information it had about the movements of a known bombmaker. In “real life,” the next step would be a military operation: the launch of a drone strike, the deployment of a Special Forces team. Palantir had shown us how an analyst could process disparate data sources efficiently to target the use of violence. It was an impressive demonstration, and probably an easy sell for the government analysts taking the course.
But I left Tysons Corner with plenty of questions. The data we input and tagged included typos and other mistakes, as well as our unconscious biases. When we marked an individual as a suspect, that data was pulled into the Palantir database as a discrete piece of information, to be viewed and analyzed by anyone with access to the system, decontextualized from the rationale behind our assessment. Palantir’s algorithms—the conclusions and recommendations that make its system “useful”—carry the biases and errors of the people who wrote them. For example, the suspected insurgent might have turned up in multiple intelligence reports, one calling him a possible threat and another that provided a more nuanced assessment of him. When the suspected insurgent is then cross-referenced with a known bombmaker, you can bet which analysis was prioritized. Such questions have not slowed down Palantir, which developed a billion-dollar valuation faster than any other American company before it, largely due to its government security contracts. In 2014 Palantir’s value was between $5 billion and $8 billion.
And analysts who use it have no shortage of data to feed into the system. All around us sensors are collecting data at a scale and with a precision that in many cases is nearing real-time total surveillance... [ibid, pp. 168-170].
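The correlation step in Owen's demo account (matching the time and place of a suspect's itinerary against a known bombmaker's recorded movements) boils down to a join over tagged event records. A minimal Python sketch, with all names, records, and the one-hour window invented purely for illustration (this is emphatically not Palantir's actual API or algorithm):

```python
# Hypothetical sketch of spatio-temporal correlation across two datasets.
# Every record, name, and threshold below is invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Sighting:
    person: str      # who was observed
    place: str       # where
    when: datetime   # and when
    source: str      # which dataset the record came from

def correlate(a, b, window=timedelta(hours=1)):
    """Return pairs of sightings at the same place within `window` of each other."""
    return [(x, y) for x in a for y in b
            if x.place == y.place and abs(x.when - y.when) <= window]

# Invented sample records standing in for "itinerary" and "surveillance" data.
itinerary = [Sighting("suspect", "market", datetime(2010, 12, 1, 14, 0), "itinerary")]
movements = [Sighting("bombmaker", "market", datetime(2010, 12, 1, 14, 30), "surveillance"),
             Sighting("bombmaker", "depot", datetime(2010, 12, 2, 9, 0), "surveillance")]

for s, m in correlate(itinerary, movements):
    print(f"{s.person} and {m.person} both at {s.place} within an hour")
```

Note that the join is only as good as its tags: a typo in a place name or a hastily applied "suspect" label propagates straight into a "match," which is exactly the concern Owen raises about analyst bias baked into the data.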
Given that AI/Machine Learning advances are at the forefront of DigiTech these days (e.g., see my posts "The Great A.I. Awakening? Health care implications?" and "What might Artificial Intelligence bring to humanity?"), I find Taylor Owen's book quite instructive as we think about where things may head this year. Well worth your time.
BTW: I came by the Owen book in the wake of searching the word "agnotology" ("the study of culturally induced ignorance or doubt, particularly the publication of inaccurate or misleading scientific data") after reading this article: "AN INTERVIEW WITH MICHAEL BETANCOURT, AUTHOR OF AGNOTOLOGY & CRISIS IN DIGITAL CAPITALISM."
...The issue with digital production is not really about hand labor or digital labor, it’s about how digital production generates an illusion of separation between effects and means—the disconnection between what you do with a computer system, such as download a file from an Internet server, and the resources and physical supports required to make that download possible. This idea appears as the belief that the digital ends “scarcity,” that it eliminates costs and makes everything equally available to everyone. While we as a society consciously know these things cannot be true, at the same time, the behaviors that the “aura of the digital” describes all proceed as if they were true: it is not simply an issue of consumption; it is also (perhaps even more so) a description of expectations for how economic and social policy should be formulated.
The element of labor in all this is only the smallest part of what’s happening with digital capitalism: this general rejection of anything built in the physical world, including all the laws, regulations, protections, and social conventions that make society function. Anything that impedes the expansion of digitally implemented capitalist protocols is conceived as either quaint or an irrelevant vestige that impedes economic “innovation and growth.” This conception is a fundamental element of how these transformations are justified by the defenders of this new economy, an issue that periodically receives acknowledgement in the news; journalist Paul Carr’s discussion from 2012 is typical:
The pro-Disruption argument goes like this: In a digitally connected age, there’s absolutely no need for public carriage laws (or hotel laws, or food safety laws, or… or…) because the market will quickly move to drive out bad actors...
Carr’s summary is to the point; the idea that digital technology negates the need for established protections ignores the harm that happens while waiting for these market-based “corrections.” It is a demonstration of how the aura of the digital eliminates physical impacts from consideration, but his description captures the nature of digital capitalism’s refusal of established social restraints: existing laws are simply an impediment to the expansion of digital technology. Belief in the transcendent aspects of this implementation means there is a blindness to the historical lessons and battles fought to gain the protections that are now simply being ignored. The superficial objectivity of the computer systems that are supposed to somehow replace established protections are simply machinic function—the uniform imposition of whatever ideology informs the design; machines are never impartial, they reify the beliefs they are built to enact. The rhetoric around Bitcoin, the sharing economy, social networks, digital distribution of media, etc. all reflect this process and the (usually implicit) demand that laws be replaced by unregulated, entirely new digital systems where the market will police itself without the need for oversight...
Consider how the so-called “sharing economy” operates: a few software companies introduce digital systems to facilitate some type of transaction. These middlemen connect customers with other providers while taking a large share of the transaction fees. But these software companies do not employ the providers—they are simply a medium, making the transaction possible. So the costs associated with whatever service is being provided—whether it’s a hotel room, taxi ride, or anything else—primarily fall on the provider. These are physical costs that are not the concern of the middleman software company, but they are costs for the people who actually do the work. The sharing economy is thus a parasitic exploitation where the physical costs associated with the business are not a part of the business model at all—nor does the business itself directly address them. This elision is typical of how the aura of the digital hides these concerns with physical resources and costs: CheapHotels, Uber, Airbnb, all of these companies reveal the same underlying process where the physical costs, legal restrictions, and social impacts all disappear from consideration.

Interesting interview.
The agnotological element is implicit in this entire process: it is what makes these disappearances from consideration seem not only normal but appropriate. Agnotology is a general term for this type of artificially produced ignorance—it is the inability to recognize that the sharing economy or social networking or any of the various big data companies—makes factual statements become controversial, and invites counter arguments about basic information statements. With critiques of sharing economy, for example, companies such as Uber or Lyft, the answer is simply that the company allows people to make use of what they already own to turn a profit from their possessions, a claim that makes these companies sound like they are some type of global rummage sale when they are not...
As I observed earlier, the "disruptions" are likely to cut both ways. And, on the downside, the resultant social/civic wounds might well be very serious.
UPDATE
Looks like this book may have some topical relevance to the foregoing, as it goes to "public accountability."
Amazon blurb: "Investigative journalism holds democracies and individuals accountable to the public. But important stories are going untold as news outlets shy away from the expense of watchdog reporting. Computational journalism, using digital records and data-mining algorithms, promises to lower the cost and increase demand among readers, James Hamilton shows."

Looks interesting. Not sure I'll buy it. A bit pricey for the Kindle edition.
You might also recall my 2015 reporting on similar books such as "Spying on Democracy."
1/9 Update: I reached out to the author, Dr. Hamilton at Stanford. He's generously sending me a comp hardcopy edition of "Democracy's Detectives." I will be reviewing it soon (probably over at one of my other blogs), along with this one I'm nearly finished with.
Democratic governments the world over are increasingly paralyzed, unable to act on many key issues that threaten the economic and environmental stability of their countries and the world. They often enact policies that seem to run against their own interests, quashing or directly contradicting well-known evidence. Ideology and rhetoric guide policy discussions, often with a brazenly willful denial of facts. Even elected officials seem willing to defy laws, often paying negligible prices. And the civil society we once knew now seems divided and angry, defiantly embracing unreason. Everyone, we are told, has his or her own experience of reality, and history is written by the victors. What could be happening?
At the same time, science and technology have come to affect every aspect of life on the planet...

Otto, Shawn Lawrence (2016-06-07). The War on Science: Who's Waging It, Why It Matters, What We Can Do About It (Kindle Locations 159-165). Milkweed Editions. Kindle Edition.

An excellent, bracing, albeit depressing read, particularly in light of stuff like this:
Is Anyone Actually a Scientist?

SPEAKING OF BOBBYG'S BOOK REVIEWS...
Forget the terrible “I’m not a scientist” schtick. Trump’s comments on climate change suggest no one is a scientist.
Recall my April 2016 look at Dan Lyons's hilarious book "Disrupted."
UPDATE: ANOTHER LITTLE "DISRUPTION" THINGY
"[I]n Silicon Valley, you can’t merely make a better typewriter and sell that at a profit. No, you have to DISRUPT. You have to REINVENT. Well, at least you need the appearance of that, while you squeeze eyeballs until they pop out enough advertising dollars to give the VCs that 10x return."

From the post "Venture capital is going to murder Medium."
I've used the Medium platform episodically. I find it a relatively weak authoring tool. The "Theranos" of independent blogging/online "journalism"?
TECH ERRATA
A simple guide to CRISPR, one of the biggest science stories of 2016

Another biggie to watch this year.
Relatedly, STATnews chimes in on 2017 speculation.
10 stories that are key to understanding 2017

Below, a nice Silicon Valley rumination:
In 2016, the tech industry forgot about people

Yeah. Read her entire article. Well done.
Oh, the humanity
by Lauren Goode @LaurenGoode
It’s been more than three years since I moved to Silicon Valley, and so far everything they say is true: it’s a place driven by optimism, hope, and high-priced electric vehicles. It’s a place where innovation thrives, and where failure is not only forgiven, but sometimes even lauded as part of a larger narrative. Those are the upsides.
The downsides can be less obvious, but 2016 brought them to light in cruel and unusual ways, as Fortune writer Erin Griffith points out in a recent article. The tech industry often operates under the belief that the world’s ails can be cured or at the very least sanitized with just the right kind of tech, from inconsequential things (can’t find a date? Just swipe) to matters of convenience (a drone will deliver that for you) to the literal difference between life and death (don’t worry, big data will help you live to 120). These are all human problems. And yet 2016 was the year tech seemed to forget about the humans: the way we work, what makes us tick, and that we’re all, actually, fragile and complex beings...
REMINDER: DON'T FORGET
Register here. I will again be there.
____________
More to come...