
Monday, November 27, 2017

From Pythagoras to petabytes

A few days back, I was surfing through a number of news articles recounting Elon Musk's latest freakout over The Existential Menace of AI. See, e.g., "AI is highly likely to destroy humans, Elon Musk warns."


A commenter beneath one of the articles (TechCrunch's "WTF is AI?") briefly cited this book.


OK, Elon's pearl-clutching will have to wait.

Garth's book is just $4.61 on Kindle (and worth every penny). A must-read, IMO. Spanning, well, "Pythagoras to petabytes." (I'd have chosen "yottabytes," but it doesn't have the cutesy alliterative post title ring.)
IT’S EASY TO FORGET that the digital age, and so the existence of computer programmers, still only spans a single working life, and one of those lives is mine. In my career as a commercial computer programmer I’ve experienced most of the changes that have occurred in the programming world in all their frequent craziness. This is my attempt to preserve that odyssey for the historical record...
For myself and many others, commercial computer programming has been a rewarding career since at least the late 1960s and during that time programming has seen massive changes in the way programmers work. The evolution of many different programming models and methodologies, the involvement of many extraordinary characters, endless ‘religious wars’ about the correct way to program and which standards to use. It would be a great shame if a new generation of programmers were unaware of the fascinating history of their profession, or hobby, and so I came to write this book...
Four of the overall themes in the book can be identified by searching for #UI (User Interface), #AI (Artificial Intelligence), #SOA (Service Oriented Architecture) and #PZ (Program componenti-Zation) and the Table of Contents is detailed enough to guide dipping in/dipping out…
___
In Years BC (Before Computers)
Sometimes it seems like computers and the digital revolution sprang out of nowhere but of course they didn’t. They were the result of at least two and a half millennia of evolutionary development which can be traced all the way back to the ancient Greek philosopher Pythagoras. When I graduated from university in 1968 I had a degree in philosophy but I somehow managed to be awarded it without having absorbed much ancient Greek philosophy or the history and philosophy of science. There were just too many well documented distractions in the sixties. It was only later that I came to realise how critical they were in creating the context for digital computers and with them the new career of commercial computer programmer.

So we need to briefly cover some history in order to set the scene for my account though only the necessary facts will be covered. We’ll only be scratching the surface so don’t panic but if you prefer to skip this Prologue that’s OK. By the way I use BC and AD for dates and not BCE and CE, there is no particular reason, it’s just habit.

550 – 500 BC
During this period Pythagoras the ancient Greek philosopher was active in a Greek colony in southern Italy. No written record of his thinking survives from the time but he is known to have influenced Plato and Aristotle and through them all of Western thinking. He is rumored to have been the first person to ever call himself a philosopher…

Eaglesfield, Garth. The Programmer's Odyssey: A Journey Through The Digital Age (Kindle Locations 94-129). Pronoun. Kindle Edition.
Garth and I are roughly the same age (both born in 1946). He came to computing about 12 years before I did. My initiation came tangentially in my 30s, in the course of my undergrad research (I was studying Psychology and Statistics at Tennessee; we were investigating "math anxiety" among college students taking stats courses). We wrote SAS and SPSS code via primitive line editors (e.g., edlin and vi), with laboriously entered inline research results data, prefaced it all with JCL (Job Control Language) headers, and submitted the jobs to the DEC/VAX over a 300 baud dialup modem (after which we'd schlep over to the computer room to fetch the 132-column greenbar printouts once they were available).

My, my, how times have changed.

After graduation in 1985, I got my first white collar gig in January 1986, writing code in a radiation laboratory in Oak Ridge (pdf). In the years after that, I worked principally as an analyst, using mostly SAS and Stata, in health care (pdf) and bank risk management.

I found this particularly interesting in Garth's book:
On The Road To Bell Labs
I felt that I needed to get Unix and C on to my CV and where better to do that than at its birthplace where it all started? Bell Labs was in New Jersey on the other side of the Hudson river and would involve some travel but I knew that other contractors from Manhattan worked there so I reasoned that there must be some way of coping. The Holy Grail of the various Bell Labs, which were scattered around New Jersey, was the Murray Hills lab, it was the real Unix/C holy of holies. But it was hard to get into and so I eventually got a contract at the Whippany lab. It was closer to Manhattan and was more of a traditional telephone company engineering lab but they used Unix and C so that would get it on to my CV. The project I worked on was the software for a diagnostic device used in network line testing…

At the Labs I was given a copy of the now legendary ‘White Book’ written by Kernighan and Ritchie accompanied by a Unix manual and I set about absorbing them both.

Entering your C source code for a program was done with the ubiquitous Unix editor vi from a dumb terminal. When vi was developed at Bell Labs the quest for portability had been continued by using ordinary characters for edit commands so that it could be used with any dumb terminal’s keyboard. Those keyboards of course did not have all the control keys that modern keyboards come with such as arrow keys, Home, End, Page Up, Page Down, and so on. As with everything in the Unix world the vi key combinations ranged from simple and intuitive to complex and barely comprehensible. It operated in 2 modes, insert and command modes, and in command mode, for instance, it used the normal keyboard characters ‘h’ ‘l’ ‘k’ and ‘j’ as substitutes for the absent left, right, up and down arrow keys.

Compared to the VMS operating system and to high level programming languages like COBOL, Coral 66 and Pascal, Unix and C certainly represented a much lower level, closer to the hardware and more like assembler coding. There were relatively few reserved words in the C language with the main ones being for data classes (int, char, double, float), data types (signed, unsigned, short, long), looping verbs (for, while, continue, break), decision verbs (if, else, switch, case) and control verbs (goto, return). C took the componentization of computer programs several steps further through the use of functions, a specific form of subroutine. Functions returned a single value that might then be part of an equation...
(ibid, Kindle Locations 1677-1706).
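Since the passage above notes that a C function "returned a single value that might then be part of an equation," here's a minimal sketch of my own (not from Garth's book, just a hypothetical illustration) of that componentization idea: one function, one return value, usable anywhere an expression can appear.

#include <stdio.h>

/* A C function is a subroutine that returns exactly one value.
   This one converts a Celsius temperature to Fahrenheit. */
double c_to_f(double celsius)
{
    return celsius * 9.0 / 5.0 + 32.0;
}

int main(void)
{
    /* The single returned value can be folded directly into a
       larger arithmetic expression, as the book describes. */
    double avg = (c_to_f(20.0) + c_to_f(25.0)) / 2.0;

    printf("Average: %.1f F\n", avg);
    return 0;
}

Nothing exotic, and nothing the K&R "White Book" didn't cover in its opening chapters, but it's the whole idea in a dozen lines.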
My late Dad came back from WWII (minus a leg), mustered out of the military, and went to work for Bell Labs, first at Murray Hill, and subsequently at the Whippany location. He worked in semiconductor R&D his entire career at Bell Labs (the only civilian job he ever had), until he took his tax-free early VA disability pension in 1972 and dragged my Ma off to the humid swamps of Palm Bay, Florida so he could play golf all the time.

Looks like Garth came to Bell Labs a few years after Pop had retired.

"The Programmer's Odyssey" covers a ton of ground, weaving great historical and technical detail into a fun read of a personal story. I give it an enthusiastic 5 stars.
"It would be a great shame if a new generation of programmers were unaware of the fascinating history of their profession, or hobby, and so I came to write this book."
Yeah. I am reminded of my prior post "12 weeks, 1,200 hours, and $12,000, and you're a 'Software Engineer'? Really?"

In closing,
Postscript
We have reached the end of the odyssey. But where have we arrived?

Well in some ways, like Odysseus himself, we have finally come home, right back to where we started. Some things have changed, we now have a massive multi-layered global network of increasingly smart devices, including the ultimate smart device the human being. Yet ultimately it’s all based on the same old von Neumann computer architecture and on the two digits 0 and 1, the binary system. The whole extraordinary software edifice rests on that foundation, which is really mindboggling.

Have we learned anything on our odyssey that suggests this is likely to change? Perhaps a new computer hardware architecture will finally emerge? Articles regularly appear about quantum computing devices and IBM have announced a project to research cognitive SyNAPSE chips but so far these are still very much research projects. It is also being said that the miniaturisation of silicon chips by Intel and others will reach its limit by 2020 and Moore’s Law will finally come to an end.

On our odyssey we, like Odysseus, have encountered dangerous and worrying phenomena, in particular whether any conceivable developments in computer technology will put the human race in danger. It’s hard to believe that based on existing system components a self-conscious sentient being will come into existence like HAL in 2001 and threaten our own survival. Throughout history major technological advances have tended to become models for explaining human beings, always incorrectly. In the digital age this has meant starting to see human beings as just computing machines. Will this prove to be true or false? The jury is still out.

But if we are in danger from our technology perhaps it’s most likely to come from an artificial living creature/cyborg/replicant enhanced with implanted computer chips and interfaces that we have created? Created perhaps by using advanced tools such as the already-being-discussed DNA editors? The recent TV series ‘Humans’ is an interesting attempt to confront some of these issues and two recent books have added significantly to the debate: Rise of the Machines: The Lost History of Cybernetics, by Thomas Rid (Scribe Publications), and Homo Deus: A Brief History of Tomorrow, by Yuval Noah Harari (Harvill Secker).

Questions, questions. As the philosophically inclined outlaw replicant Roy Batty so presciently remarked in Blade Runner…
(ibid, Kindle Locations 2646-2666)
'eh?


Kudos, Mr. Eaglesfield. I'd make this book required reading at "software engineer boot camp." (BTW, Garth's website is here.)

Also, Garth turned me on to this resource where he blogs:


Check out in particular the "History of Computing and Programming" page.
___

Other prior posts of mine that are relevant here?
Then there are my posts on "NLP" (Natural Language Processing).

NEXT UP

Got a new Twitter follower.

The Web's Best Content Extractor
Diffbot works without rules or training. There's no better way to extract data from web pages. See how Diffbot stacks up to other content extraction methods:
Pretty interesting. Stay tuned. We'll see.

SAVE THE DATE

They just approved my press pass. Follow hashtag #health2con and Twitter handle @Health2con.

Details
UPDATE

From "history" to "futurism" predictions. I saw this infographic on Pinterest, and chopped it up into 4 segments for viewability.


 See also "Types of AI."

UPDATE

Garth just reciprocated my foregoing tout with the kindest mention of me. See "Cometh the hour, cometh the man."

Again, not kidding about how cool his book is.
____________

More to come...
