Tuesday, September 6, 2016

In the wake of Labor Day, thinking about jobs, education, healthcare, and tech


Labor Day parade, circa early 1900s. Times have changed.

I hope everyone had an enjoyable Labor Day weekend.

Among my many topical titles relevant to the overlapping health care and health tech spaces, I continue to read a lot of "futurism" books, e.g.:


Finished Brett King's rhapsodic take on all incipient things tech last week. The Amazon blurb:
The Internet and smartphone are just the latest in a 250-year-long cycle of disruption that has continuously changed the way we live, the way we work and the way we interact. The coming Augmented Age, however, promises a level of disruption, behavioural shifts and changes that are unparalleled. While consumers today are camping outside of an Apple store waiting to be one of the first to score a new Apple Watch or iPhone, the next generation of wearables will be able to predict if we’re likely to have a heart attack and recommend a course of action. We watch news of Google’s self-driving cars, but don’t likely realise this means progressive cities will have to ban human drivers in the next decade because us humans are too risky. Following on from the Industrial or machine age, the space age and the digital age, the Augmented Age will be based on four key disruptive themes—Artificial Intelligence, Experience Design, Smart Infrastructure, and HealthTech. Historically the previous ‘ages’ brought significant disruption and changes, but on a net basis jobs were created, wealth was enhanced, and the health and security of society improved. What will the Augmented Age bring? Will robots take our jobs, and AIs subsume us as inferior intelligences, or will this usher in a new age of abundance?

Augmented is a book on future history, but more than that, it is a story about how you will live your life in a world that will change more in the next 20 years than it has in the last 250 years. Are you ready to adapt? Because if history proves anything, you don't have much of a choice.
A good read, if a tad too optimistic in a number of areas. Though I did like that he repeatedly cited another optimist -- Wired co-founder and original thinker Kevin Kelly, whom I've cited before. See, e.g., my May post "Anything that CAN be tracked WILL be tracked." Inevitable Tech Forces That Will Shape Our Future.

The Amazon blurb for Alec Ross's very nice "The Industries of the Future."
Leading innovation expert Alec Ross explains what’s next for the world: the advances and stumbling blocks that will emerge in the next ten years, and how we can navigate them.

While Alec Ross was working as Senior Advisor for Innovation to the Secretary of State, he traveled to forty-one countries, exploring the latest advances coming out of every continent. From startup hubs in Kenya to R&D labs in South Korea, Ross has seen what the future holds.

In The Industries of the Future, Ross shows us what changes are coming in the next ten years, highlighting the best opportunities for progress and explaining why countries thrive or sputter. He examines the specific fields that will most shape our economic future, including robotics, cybersecurity, the commercialization of genomics, the next step for big data, and the coming impact of digital technology on money and markets.

In each of these realms, Ross addresses the toughest questions: How will we adapt to the changing nature of work? Is the prospect of cyberwar sparking the next arms race? How can the world’s rising nations hope to match Silicon Valley in creating their own innovation hotspots? And what can today’s parents do to prepare their children for tomorrow?

Ross blends storytelling and economic analysis to give a vivid and informed perspective on how sweeping global trends are affecting the ways we live. Incorporating the insights of leaders ranging from tech moguls to defense experts, The Industries of the Future takes the intimidating, complex topics that many of us know to be important and boils them down into clear, plainspoken language. This is an essential book for understanding how the world works—now and tomorrow—and a must-read for businesspeople in every sector, from every country.
Also a good read.

I continue to reflect at length on the implications of my accruing recent book stash, which spans a look back from the origins of the universe and biological evolution, through the various "disruptive" socioeconomic inflection eras, to speculations on where humanity is headed in a world of exponentially accelerating cultural and technological evolution.


To these, one is advised to leaven one's thinking with the likes of Paul Mason's bracing "Postcapitalism" and Joel Kotkin's excellent "The New Class Conflict."


In general, the concern goes to the question "what if half or more of employment just goes away, permanently, replaced by AI and robotics?"
In this regard, see the often-cited Oxford Martin study on the future of employment and computerization. Brett King et al. cite it in Augmented.

I highly recommend, again, that you also take the time to read and reflect on "Four Futures," now an online essay at Jacobin, soon to be a book release.
Much of the literature on post-capitalist economies is preoccupied with the problem of managing labor in the absence of capitalist bosses. However, I will begin by assuming that problem away, in order to better illuminate other aspects of the issue. This can be done simply by extrapolating capitalism’s tendency toward ever-increasing automation, which makes production ever-more efficient while simultaneously challenging the system’s ability to create jobs, and therefore to sustain demand for what is produced. This theme has been resurgent of late in bourgeois thought: in September 2011, Slate’s Farhad Manjoo wrote a long series on “The Robot Invasion,” and shortly thereafter two MIT economists published Race Against the Machine, an e-book in which they argued that automation was rapidly overtaking many of the areas that until recently served as the capitalist economy’s biggest motors of job creation. From fully automatic car factories to computers that can diagnose medical conditions, robotization is overtaking not only manufacturing, but much of the service sector as well.

Taken to its logical extreme, this dynamic brings us to the point where the economy does not require human labor at all. This does not automatically bring about the end of work or of wage labor, as has been falsely predicted over and over in response to new technological developments. But it does mean that human societies will increasingly face the possibility of freeing people from involuntary labor. Whether we take that opportunity, and how we do so, will depend on two major factors, one material and one social...
So, where are we today? What kind of educational pursuits can you even undertake to immunize yourself against employment obsolescence? How will you pay for adequate, effective health insurance or health care if you simply cannot find any work? See my 2015 post "The Robot will see you now -- assuming you can pay."

See also "Are you ready for the jobs revolution?"
Welcome to the age of uncertainty. Never has it been harder to predict what our working futures hold and arm ourselves accordingly, as technological advances, an ageing workforce and the rise of the sharing economy cause new jobs to appear as fast as others vanish.

Traditional career paths are on the way out, say the experts, with jobs-for-life replaced by a new expectation among younger workers that they might hold 11 or more jobs throughout their lives, according to Karin Klein of Bloomberg Beta.


This is the new gig economy, and it’s gaining ground. In its wake has come a culture of short-termism, an on-demand approach to work which is spreading through professions and occupations, causing a rise in freelance work and part-time jobs...
THE "1099 ECONOMY"

IRS Form 1099. The "self-employment" / "independent contractor" annual income reporting equivalent of the employee W-2.
The 1099 is nothing new. In 1978 I formed my last band, a five-member R&B band in Knoxville, prior to finally going to college at Tennessee and getting out of the music business. I set up a Sub-S corporation via which to administer the business side of things. I paid myself a small salary, and paid my four sidemen equal splits as 1099 "contractors."

We disbanded in early 1980 after two years of scrapping it out on the road averaging about 2,000 miles a month with no real progress in sight.

I legally "laid myself off" and drew my unemployment. In September of that year, at the age of 34, I began my Freshman year at UTK.
Fast-forward to current times. An excerpt from a Salon.com article by writer Steven Hill of the New America Foundation:
After working for many years in the Washington, D.C.–based think-tank world, the program that I directed lost most of its funding and was shut down shortly thereafter. All my employees, myself included, were laid off. I was promoting my latest book that had been published a few months before, so I surfed that wave for many more months. For a while, all seemed normal and natural, but without realizing it I had stepped off the safe and secure boat of having what is known as a “good job,” with a steady paycheck, secure employment and a comprehensive safety net, into the cold, deep waters of being a freelance journalist.

Suddenly I was responsible for paying for my own health care, arranging for my own IRAs and saving for my own retirement. I also had to pay the employer’s half of the Social Security payroll tax, as well as Medicare — nearly an extra 8 percent deducted from my income. The costs for my health-care premiums zoomed out of sight, since I was no longer part of a large health-care pool that could negotiate favorable rates...


In short, I had to juggle, juggle, juggle, while simultaneously running uphill — my life had been upended in ways that I had never anticipated. And I began discovering that I was not alone. Many other friends and colleagues — including Pulitzer Prize-winning journalists, professionals and intellectuals, as well as many friends in pink-, white- and blue-collar jobs — also had become 1099 workers, tumbleweeds adrift in the labor market. They found themselves increasingly faced with similar challenges, each in his or her own profession, industry or trade. In short, we had entered the world of what is known as “precarious” work, most of us wholly unprepared.

Not to worry. The sharing economy visionaries — who like Dr. Pangloss in Voltaire’s Candide always see “the best of all possible worlds” — had a plan in place for us. We could “monetize” our assets — rent out our house, our car, our labor, our driveway, our spare drill and other personal possessions — using any number of brokerage websites and mobile apps like TaskRabbit, Airbnb, SnapGoods, the ride-sharing companies Uber and Lyft, and more.

This is the new economy: contracted, freelanced, “shared,” automated, Uber-ized, “1099-ed.” In essence, the purveyors of the new economy are forging an economic system in which those with money will be able to use faceless, anonymous interactions via brokerage websites and mobile apps to hire those without money by forcing an online bidding war to see who will charge the least for their labor, or to rent out their home, their car, or other personal property.

Websites like Uber, Elance-Upwork, TaskRabbit, Airbnb and others are taking the Amazon/eBay model the next logical step. They benefit from an aura that seems to combine convenience with a patina of revolution; convenience as revolution. The idea of a “sharing” economy sounds so groovy — environmentally correct, politically neutral, anti-consumerist and all of it wrapped in the warm, fuzzy vocabulary of “sharing.” The vision has a utopian spin that is incredibly seductive in a world where both government and big business have let us down by leading us into the biggest economic crash since the Great Depression.


But the “sharing” economy’s app- and Web-based technologies have made it so incredibly easy to hire freelancers, temps, contractors and part-timers, why won’t every employer eventually lay off all its regular, W-2 employees and hire 1099 workers? Any business owner would be foolish not to, as he or she watches their competitors shave their labor costs by 30 percent (by escaping having to pay for an employee’s safety net and other benefits).

Outsourcing to these 1099 workers has become the preferred method for America’s business leaders to cut costs and maximize profits. One new economy booster clarified employers’ new strategy: “Companies today want a workforce they can switch on and off as needed” — like one can turn off a faucet or a radio.

Indeed, the so-called “new” economy looks an awful lot like the old, pre-New Deal economy – with “jobs” amounting to a series of low-paid micro-gigs and piece work, offering little empowerment for average workers, families or communities. We’re losing decades of progress, apparently for no other reason than because these on-demand companies conduct their business over the Internet and apps, somehow that makes them “special.” Technology has been granted a privileged and indulged place where the usual rules, laws and policies often are not applied.

If that practice becomes too widespread, you can say goodbye to the good jobs that have supported American families, goodbye to the middle class and say adios to the way of life that made the United States the leading power of the world.
The cover of his book:


I've not bought and read it yet. I get the picture.

In Brett King's book "Augmented," the uncritical lauding of the "gig economy" in general and Uber in particular got a bit tiresome after a while (keyword searching "Uber" alone yielded 65 hits).
Gigging, Job-hopping and Cloud-based Employment 
It was recently reported by Mintel that almost a quarter of Millennials would like to start their own businesses, and nearly one in five planned to do so in the next 12 months. In markets like the United States or Australia where the cost of college education is becoming either unattainable or a poor investment for large swathes of the population, many of this generation are choosing instead to be educated by online platforms, hackathons, internships, start-ups and experimentation rather than through traditional college approaches. With this alternative approach to education, this tech-savvy generation is increasingly demanding flexibility with employment. A total of 66 per cent of Millennials would be willing to wear technology to help them do their jobs. In fact, 40 to 45 per cent of Gen Y regularly use their personal smartphones and download apps specifically for work purposes (as opposed to 18 to 24 per cent of older generations). 

In the United Kingdom, 85 per cent of Gen Y graduates think that freelance or independent working will become a more common and accepted way to succeed in the job market over the next five years. In fact, freelancing is becoming so common amongst Millennials that they’ve even come up with their own term for it— gigging. As in “I’ve got a gig at Google.” Others call them “permanent freelancers” or “permalancers”. Increasingly, this type of work is done at home, at a shared workspace or even at a Starbucks. There are even websites dedicated to helping giggers find coffee shops that can be used as workspaces. It’s hardly surprising, then, that almost half of Millennials surveyed in the United Kingdom and the United States show a strong preference for this sort of working lifestyle. 

The full-time job is historically an anomaly. Prior to the industrial age, it didn’t really exist. Early industrialists, who needed to have workers on a production line at the same time for efficiency, are most likely responsible for creating the concept of a structured work week. Consequently, for the last 100 years, the 40-hour-a-week job has been the centrepiece of work life simply because there was no better way for people to gather in one place at the same time to connect, collaborate and produce. 

Now technology is changing the very nature of work. Millennials will be the first modern generation to work in multiple “micro-careers” at the same time, leaving the traditional full-time job or working week behind. “Work” is more likely to behave like a marketplace in the cloud than behind a desk at a traditional corporation. While a central skill set or career anchor will be entirely probable, most will be entrepreneurs, and many will have their side gigs. For instance, Uber, Lyft and Sidecar are platforms that give people a way to leverage their cars and time to make money. TaskRabbit is a market for odd jobs. Airbnb lets you rent out any extra rooms in your home. Etsy is a market for the handmade knick-knacks or 3D print designs that you make at home. DesignCrowd, 99designs and CrowdSPRING all offer freelance design resources that bid logos and other designs for your dollars. 

Before long, technology will allow instant marketing of your skill set, the auctioning of gigs and expertise, and the ability to be paid for your work in near real time or as deliverables are finished.

King, Brett; Lark, Andy; Lightman, Alex; Rangaswami, JP (2016-05-15). Augmented: Life in The Smart Lane (Kindle Locations 1062-1089). Marshall Cavendish International (Asia) Pte Ltd. Kindle Edition.
"...instant marketing of your skill set, the auctioning of gigs and expertise, and the ability to be paid for your work in near real time or as deliverables are finished."

Yeah. Just don't do any work for Donald Trump without getting paid in advance (and good luck with that demand).

UPDATE: hat tip to my FB friend Ann Neumann for this link: "Teachers Are Working for Uber Just to Keep a Foothold in the Middle Class."

BTW: King glosses over the now-moribund Theranos with equal ease:
Microfluidics is going to radically change the way we think about health care and diagnosis. Think about pretty much any condition you might need to be tested for today— high cholesterol, diabetes, kidney or liver problems, iron deficiency, heart problems, sexually transmitted infections, anaemia, hepatitis, HIV and various viruses— and they are typically tested for either via blood being drawn or via urinalysis. With blood tests, it typically involves drawing a significant amount of blood to get accurate tests. Here is where technology advancements will dramatically change these tests. 

While facing some controversy recently, the Silicon Valley start-up Theranos was one of the first to really tackle this problem. Theranos has developed blood tests that can help detect dozens of medical conditions based on just a drop or two of blood drawn with a pinprick from your finger. At Walgreens pharmacies across the United States, you simply show a pharmacist your ID and a doctor’s note, and you can have your blood drawn right there. From that one sample, several tests can be run, often at a fraction of the cost of a typical pathology test. A typical lab test for cholesterol, for example, can cost $50 or more in the United States. The same Theranos test at Walgreens costs around US$3. [ibid, Kindle Locations 2577-2586]. 
PM UPDATE

Just in. Saw this in my iPhone.

EXCLUSIVE: HOW ELIZABETH HOLMES’S HOUSE OF CARDS CAME TUMBLING DOWN

In a searing investigation into the once lauded biotech start-up Theranos, Nick Bilton discovers that its precocious founder defied medical experts—even her own chief scientist—about the veracity of its now discredited blood-testing technology. She built a corporation based on secrecy in the hope that she could still pull it off. Then, it all fell apart.
It's worse than even I had thought. A number of Silicon Valley VC haircuts draw nigh.
In Silicon Valley, every company has an origin story—a fable, often slightly embellished, that humanizes its mission for the purpose of winning over investors, the press, and, if it ever gets to that point, customers, too. These origin stories can provide a unique, and uniquely powerful, lubricant in the Valley. After all, while Silicon Valley is responsible for some truly astounding companies, its business dealings can also replicate one big confidence game in which entrepreneurs, venture capitalists, and the tech media pretend to vet one another while, in reality, functioning as cogs in a machine that is designed to not question anything—and buoy one another all along the way.
It generally works like this: the venture capitalists (who are mostly white men) don’t really know what they’re doing with any certainty—it’s impossible, after all, to truly predict the next big thing—so they bet a little bit on every company that they can with the hope that one of them hits it big. The entrepreneurs (also mostly white men) often work on a lot of meaningless stuff, like using code to deliver frozen yogurt more expeditiously or apps that let you say “Yo!” (and only “Yo!”) to your friends. The entrepreneurs generally glorify their efforts by saying that their innovation could change the world, which tends to appease the venture capitalists, because they can also pretend they’re not there only to make money. And this also helps seduce the tech press (also largely comprised of white men), which is often ready to play a game of access in exchange for a few more page views of their story about the company that is trying to change the world by getting frozen yogurt to customers more expeditiously. The financial rewards speak for themselves. Silicon Valley, which is 50 square miles, has created more wealth than any place in human history. In the end, it isn’t in anyone’s interest to call bullshit...
__

I have little doubt that we are potentially on the cusp of truly "transformative," widespread deployment of health tech, finally passing by the chronic "Free Beer Tomorrow" problem. In a few weeks I'll be covering the 10th Annual Health 2.0 Conference, where I'm sure to be enthralled by the latest in Health Tech whiz-bang.


Brett King on nascent health tech promise:
In much the way GPS or navigation software can predict how traffic is going to impact your journey or travel time, over the next decade health sensors married with AI and algorithms will be able to sense developing cardiovascular disease, impending strokes, GI tract issues, liver function impairment or acute renal failure and even recommend or administer direct treatment that prevents a critical event while you seek more direct medical assistance. 

Insurance companies that offer health and life insurance are starting to understand that these tools will dramatically reduce their risk in underwriting policies, as well as help policyholders (that’s us) manage their health better in concert with medical professionals. Insurance will no longer be about assessing your potential risk of heart disease, as much as it will be about monitoring your lifestyle and biometric data so that the risk of heart disease can be managed. The paper application forms that you fill out for insurance policies today will be pretty much useless compared with the data insurers can get from these types of sensor arrays. Besides, an application form won’t be able to help you actively manage diet, physical activity, etc., to reduce the ongoing risk of heart disease. It’s why organisations like John Hancock, the US insurance giant, are already giving discounts to policyholders who wear fitness trackers.

With so much data being uploaded to the interwebs, every second of every day, we have already gone well beyond the point where humans can effectively analyse the volume and breadth of data the world collects without the use of other computers. This will also dramatically change the way we view diagnosis... [ibid, Kindle Locations 1402-1416].
All of our mobile, personalized (wearable and ingestible) health sensor data will be effortlessly uploaded into our docs' EHRs via APIs, where Watson and the rest of AI will largely take it from there, right? The Health Care Productivity Treadmill? Problem solved. More time for clinical "empathy"? Check. "Omics"-driven personalized medicine everywhere? Check.

Right.

NEXT: THE FUTURE OF EDUCATION

Recently in our snailmail.


The latest data indicate that the average cost of attending a public college for an undergraduate degree is just shy of $30,000 a year, and roughly double that for private colleges. Add to that the cost of graduate education (including med school), and you might easily amass several hundred thousand dollars' worth of debt that will be difficult to eliminate, even assuming a steady job thereafter.
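A back-of-the-envelope sketch of that arithmetic, using the rough per-year figures above plus an assumed 6% loan rate and the standard 10-year repayment term (the rate and term are my illustrative assumptions, not reported data):

```python
# Back-of-the-envelope college debt arithmetic. Per-year figures are the
# rough estimates cited above (~$30K/yr public, ~$60K/yr private); the
# loan rate and term are illustrative assumptions.

def monthly_payment(principal, annual_rate, years):
    """Standard amortized loan payment: P*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

undergrad_public = 30_000 * 4         # ~$120,000 public undergrad sticker price
undergrad_private = 60_000 * 4        # ~$240,000 private undergrad
med_school = 60_000 * 4               # assumed comparable to private tuition

print(f"Private undergrad + med school: ${undergrad_private + med_school:,}")
print(f"$200,000 at 6% over 10 years: ${monthly_payment(200_000, 0.06, 10):,.0f}/month")
```

On those assumptions, even a "mere" $200,000 of debt runs north of $2,200 a month for a decade, which makes "difficult to eliminate" concrete.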
I count myself extremely lucky. I didn't accrue very much undergrad debt at all (paid off in short order), and I paid cash for my Master's.
Particularly in today's tech sector, a Master's degree (often augmented by one or more "professional/technical society" certifications) is becoming the floor norm for employment. Onerous student debt in pursuit of all of this has become a major issue, and is sure to become even more acute if employment opportunities contract significantly, as some quarters predict.

Beyond conventional two- and four-year (and graduate) degree programs, we also have a proliferation of "accelerated professional degree" and certificate programs. e.g., recall my post "A Master's in Health Policy and Law? UCSF/Hastings."

Recall also my reporting on Stanford's recent "certificate" offering in "Genomics/Personalized Medicine."
One concern I continue to have with many of these offerings goes to their relatively lax "admissions requirements" (e.g., a 3.0 GPA, a "letter of interest," and cumulative "life experience"). A principal criterion often seems to be possession of a credit card with a valid expiration date and a sufficient credit line. After I inquired about the UCSF/Hastings program (notwithstanding that I'd made it quite clear I was simply going to report on it for my KHIT blog), I was promptly handed off to an "Admissions Counselor" (read: outsourced boiler-room sales rep), who proceeded to badger me mercilessly with repeated phone calls and emails warning me of the looming "enrollment deadline." I finally had to get pissy with her to shut her down.
None of this stuff comes cheap. My CQE cert (ASQ Certified Quality Engineer) was a huge relative bargain at $180 (ASQ member exam price) when I first sat for the exam in 1992 (it's now $498 for non-members, $348 for members, still a bargain). And every Tom, Dick, & Harry tech society (e.g., HIMSS, AHIMA, IAPP, etc.) administers a breadth of similar certification vetting exams via which you can add more non-degree alphabet soup letters after your name.

We also now see a proliferation of online "MOOC" offerings (Massive Open Online Courses). Think Khan Academy, IHI Open School, etc., along with a huge patchwork of every imaginable ad hoc and institutional educational offering on YouTube (legit and scams alike).

Well, you could become a self-taught "SME" (Subject Matter Expert) relatively inexpensively via your own initiative given sufficient interest and effort. But, how would anyone come to know of your chops?

More on that shortly.

In reflecting on my numerous recent reads on genomics, it occurs to me that to truly become a SME in "omics" science I would need to back up and study/re-study the relevant basic sciences in detail, e.g.,
  • Physics 
  • Biophysics
  • General Chemistry
  • Organic Chemistry
  • Inorganic Chemistry
  • Physical Chemistry
  • Biochemistry
  • Radiochemistry
  • Biology
  • Evolutionary biology
  • Anthropology
  • Paleontology
  • Calculus
  • Statistics
etc. While I've had physics, calculus, and the gamut of stats in undergrad school (and advanced health care stats in grad school), I'd have to start at the beginning with most of the rest. High school chemistry and biology don't count.

VETTING THE "SME"

From the Wiki:
A subject-matter expert (SME) or domain expert is a person who is an authority in a particular area or topic. The term domain expert is frequently used in expert systems software development, and there the term always refers to the domain other than the software domain. A domain expert is a person with special knowledge or skills in a particular area of endeavour. (An accountant is an expert in the domain of accountancy, for example.) The development of accounting software requires knowledge in two different domains: accounting and software. Some of the development workers may be experts in one domain and not the other. A SME should also have basic knowledge of other technical subjects.

Function
In general, the term is used when developing materials (a book, an examination, a manual, etc.) about a topic, and expertise on the topic is needed by the personnel developing the material. For example, tests are often created by a team of psychometricians and a team of subject-matter experts. The psychometricians understand how to engineer a test while the subject-matter experts understand the actual content of the exam. Books, manuals, and technical documentation are developed by technical writers and instructional designers in conjunction with SMEs. Technical communicators interview SMEs to extract information and convert it into a form suitable for the audience. SMEs are often required to sign off on the documents or training developed, checking it for technical accuracy. SMEs are also necessary for the development of training materials.
Yeah. I was a partner in an admission test-prep "exam-cram" A/V business in the mid-80's to early 90's (wrote about that experience here). My partner Mike (Michael K. Smith, PhD) remains in that line of work today, at TestPrepExperts.com. A great guy.

We put up posters with tear-off response cards all over campus.
I shot that pic in my studio control room and rendered the poster layout in Corel Draw.




Amid my studies of psychology and statistics I took "psychological tests and measures" while at UTK. I served as one of Mike's principal undergrad research assistants as he finished his psych Doctorate (focused on the psychology of "math anxiety"). Consequently I know a thing or two about valid psychometric assay construction (which I would get to use, if informally, some years later in my role as an adjunct university faculty member).

The thought occurred: what if you could "MOOC" and then "comp" your way to a degree or a professional / technical cert via testing? At a tiny fraction of today's absurd cost of college or "professional certificate" programs?

Apropos, I am reminded of the legendary con man Frank Abagnale (the "Catch Me If You Can" guy), who passed the Louisiana bar exam at the age of 19, never having attended either college or law school.
As I started my undergrad tenure at UTK, I took the CLEP ("College-Level Examination Program") English exam. The national 99th percentile at the time was a score of 720. I blew it out, scoring a 765. Got me out of an entire year of Freshman English. Well worth it to me.
'eh? And, yeah, I know that, more broadly, there are other salient attributes factoring into job success beyond those revealed by strictly subject-matter achievement exam results. But such measures are solely what get you through school. You don't get vetted for EQ or motor skills, etc., as part of graduation requirements.

SMEvetting.com

Startup idea. An "Uber For Your Mind"? LOL. Lots of underemployed 1099 "gig economy" SMEs out there these days. Think about it.
__

BREAKING NEWS UPDATE

Large, For-Profit ITT Tech Is Shutting Down All Of Its Campuses

The fall semester has just begun on most college campuses, but tens of thousands of students in 38 states were told Tuesday that, instead, their college is closing its doors.

In a press release, ITT Educational Services announced it would close all campuses of its ITT Technical Institutes. The for-profit college system has become a household name over the past half-century. The company blamed the shutdown on the U.S. Department of Education, which had stepped up oversight of the school and recently imposed tough financial sanctions...
I've never cared for this for-profit trade school thing. The entire business model is predicated on federally backed student loans. Excessive costs, low graduation rates, low job placement rates, etc. Two words: "Corporate Welfare."

Regarding current politics of for-profit education:
A For-Profit College Company Paid Bill Clinton Millions of Dollars. Is That a Scandal?

Laureate Education is the world's largest for-profit college operator, a behemoth that enrolls more than 1 million students, online and on campus, at 87 schools across 28 different countries. And, as the Washington Post recounts in a long feature this week, the company paid former President Bill Clinton some $17.6 million over five years to act as its “honorary chancellor”—a job that mostly involved letting the company refer to him as its “honorary chancellor.” Given the for-profit education sector’s rotten track record here in the United States, this naturally raises some questions, such as: Exactly what kind of an enterprise is Laureate? Is there any sign it was trying to buy favors from Hillary Clinton while she was secretary of state? Is there some hint of a scandal here, or any other reason to be annoyed by this particular thread in the Clintons’ complicated web of financial and philanthropic ties?

The short answer is that, no, there is no real scandal in this story, but yes, there are some reasons to be irritated—at least if you prefer your future presidents to enter office relatively free of personal financial ties to troublesome industries...
__

EDUCATION ERRATUM

Apropos of the foregoing, just posted at Medium:
Is online learning the future of education?

In the past, if you wanted to get a qualification, or even simply learn something new, you would sign up for a course at a bricks-and-mortar institution, pay any relevant fees, and then physically attend class. That was until the online learning revolution started...
Polling cited in the article indicates that a plurality of respondents don't think online learning is as effective as the "chalk talk" classroom method (47.68% vs 40.56%, with 11.76% unsure). Again, until we have widespread methodologically sound post-curriculum vetting in place, we're not really gonna know.

SEPT 16TH UPDATE
Medical schools have competition from other teaching tools. It’s time they acknowledge it
By ANU ATLURU, SEPTEMBER 16, 2016

Learning and retaining medical knowledge is a greater challenge for students today than ever before. Some see this as an opportunity. Others see that business opportunity as a threat to established medical education.


In 1950, the doubling time of medical knowledge was estimated to be 50 years. By 2010 that had shrunk to 3.5 years and by 2020 it is projected to be just 73 days. With each passing year, medical students are required to learn more material than did their predecessors. Further adding to the burden is the increasing importance of standardized examinations, particularly the formidable United States Medical Licensing Exam (USMLE) Step 1. An eight-hour-long endeavor, Step 1 tests mastery of the basic sciences, also known as the preclinical curriculum. It is widely considered to be the single most important examination for placement into residency programs after medical school.

Are schools providing the tools to help medical students isolate and retain essential information, and so excel on licensing exams? If students’ study habits are any indication, the answer seems to be no. Rather, the sheer volume and complexity of tested material coupled with the high stakes of these exams has allowed third parties to flourish and capture students’ time, attention, and money...
'eh?
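Those quoted doubling times imply eye-watering growth rates. A quick sketch (assuming simple exponential growth, which is my framing, not the article's model):

```python
# Annual multiplication factor of medical knowledge implied by the
# doubling times quoted above, under a naive exponential model.

def annual_growth_factor(doubling_time_years):
    """If a quantity doubles every T years, it grows by 2**(1/T) per year."""
    return 2 ** (1 / doubling_time_years)

for label, t_years in [("1950 (50-year doubling)", 50),
                       ("2010 (3.5-year doubling)", 3.5),
                       ("2020 (73-day doubling)", 73 / 365)]:
    print(f"{label}: x{annual_growth_factor(t_years):.2f} per year")
```

At a 73-day doubling time, the stock of knowledge multiplies 32-fold every year. Small wonder the third-party exam-prep vendors are flourishing.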
____________

More to come...
