Friday, October 2, 2015

The Story of Technology

There is even a history behind your laptop.

We’re taking a diversion from the usual menu of Sutton history articles this month. Why? Because it is dealer’s choice and it’s a good story.

A few of us in the Sutton area had the good fortune of meeting “The Computer” years ago and have watched the progress from rooms filled with equipment and the humming of dozens of cooling fans to today’s laptops, notebooks and hand-held devices with keys/buttons a fraction of the size of my fingertips.

But that was not the really early days. Let’s go way back.

There had to be a “first” computer. What was it?
This model of the Babbage Difference Engine is in the London Science Museum, built in modern times from Charles Babbage’s 1822 design. The idea behind our computers will soon be 200 years old.

One candidate was a mechanism for programming the operation of a loom (the Jacquard loom of the early 1800s). It had “instructions” “encoded” on a card, “read” by the loom, directing all that motion we see happening in a loom. That system had all the earmarks of a computer – stored instructions on the card, a mechanism to read and retrieve the instructions, and mechanical pieces to perform complex repetitive functions.

An Englishman, Charles Babbage, is called the “father of the computer,” as he was the first to envision a machine to do math calculations – in 1822, way back. Babbage was working alone when an acquaintance joined him. Ada Lovelace was the daughter of the poet Lord Byron. Ada had an interest in math and logic and found Babbage’s work fascinating.
Lord Byron’s daughter, the Countess of Lovelace, was a mathematician who likely created the first algorithm – the first computer program. Ada Lovelace was born December 15, 1815 and died in 1852. This is an 1840 portrait. Are you ready to celebrate her 200th birthday soon?


Ada created a library of Babbage’s notes and organized them into steps that could be performed by Babbage’s machine to solve mathematical problems. Those steps were an algorithm, what we later called a “program” – yes, the Countess of Lovelace was the first computer programmer.
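
Her best-known algorithm, published in her notes on Babbage’s machine, computed Bernoulli numbers. As a small tribute, here is a minimal modern sketch in Python that computes the same numbers using a standard textbook recurrence – an illustration of the idea, not a reconstruction of Lovelace’s actual method:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n as exact fractions, using the
    standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1."""
    B = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))              # solve the recurrence for B_m
    return B

print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```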

The Department of Defense uses a programming language called Ada, named in her honor and related to Pascal – I maintained a couple of programs written in Ada.

Let’s jump to 1890.

The 1880 census had gathered a lot of data. Huge teams were counting, adding, categorizing and otherwise analyzing the data from the census. There were fears that census analysis would take more than ten years and not be done before the next census.

Herman Hollerith invented a tabulating machine and better yet, a card to hold information.

Census information was encoded on the card in columns of holes representing numbers or letters. The tabulating machine read stacks of these cards, adding up the holes and quickly finding how many of each category of information had been found on all those people. Genius.
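
To make the idea concrete, here is a toy Python sketch of that kind of column tally. The column number and category values are invented for illustration; real Hollerith cards encoded categories as hole positions, not text:

```python
from collections import Counter

# Each "card" is modeled as a dict of column -> punched value.
# Column 3 here stands in for, say, an occupation field (hypothetical layout).
cards = [
    {3: "farmer"}, {3: "clerk"}, {3: "farmer"},
    {3: "teacher"}, {3: "farmer"},
]

def tabulate(cards, column):
    """Count how many cards carry each value in one column, much as
    Hollerith's machine advanced a counter dial for each sensed hole."""
    return Counter(card.get(column) for card in cards)

print(tabulate(cards, 3))  # Counter({'farmer': 3, 'clerk': 1, 'teacher': 1})
```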

Hollerith had used trays built to hold currency, so in 1887 he made the cards the size of a dollar bill; IBM standardized the card to an 80-column format in 1928.

The IBM card was ubiquitous in its day. This card illustrates the hole punches that represented letters, numbers and special characters. The standard card punch machine was the IBM 029 – oh, the memories.

Many early computers had specific purposes – the loom control system is an example. Flight control systems on airplanes are another, as is the computer in your car. You can’t do anything else with them.

Engineers developed general purpose computers starting in the 1940s to do a variety of tasks, often simultaneously. War is a great motivator for society and our mid-20th-century wars pushed computer technology a long way.

These general purpose machines were called mainframes: many cabinets of equipment filling a room with whirring fans, spinning disks and tape drives, and a whole staff of specialists to make it all work.

There were several serious competing manufacturers of computer hardware: Burroughs, NCR, Control Data, Honeywell, General Electric, RCA and of course, IBM. Inevitably, there was consolidation. (At SAC headquarters we used a Honeywell 6080 with a General Electric operating system to support planning for all aircraft and missiles in the nuclear war plan.)

The nature of software did not come easily to many. I remember trying to explain it to my father. After some false starts I used the analogy of the record player. The player was the hardware and the records were software – not good enough. A record is still a touchy-feely thing. I then tried saying that the sounds, the music was the software. Maybe better but any analogy works well until it doesn’t.

Progress to develop our computers came on many fronts. Think about calculators. Our museum has an early desktop mechanical calculator, a noisy, clunking machine with rows of buttons; a great device in its day. Digital calculators used a small processor – a computer – illustrating the transition to automated functions. Soon there were spreadsheets on general purpose computers. The same happened to all kinds of tasks that had been tedious and labor intensive. Good stuff.

Computers were bright, shiny objects for our popular culture.

One popular 1960’s TV quiz show featured a big complicated-looking thing on stage that “selected” the questions for contestants. The host would push a button, lights would flash, music played and IBM cards would be shuffled out into slots.

Mainframes were large and expensive. Even imaginative futurists were predicting only governments and large corporations would ever use these things. But every development trend led to smaller footprints, cheaper materials and manufacturing processes, and wider accessibility. Ever heard of Moore’s Law?

Gordon Moore was a co-founder of Intel, and in 1965 he observed that the density of transistors on integrated circuits was doubling every year, a pace he later revised to every two years. That meant that computer technology was getting twice as good and half as expensive every two years. Moore predicted that rate could be sustained for the next decade. It has kind of leveled off in just the last three years. That’s why your laptop exists.
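
The arithmetic is worth seeing. Here is a quick back-of-the-envelope sketch in Python, assuming the popular two-year doubling period had held from 1965 to 2015:

```python
# Back-of-the-envelope Moore's Law: doubling every two years for 50 years
# is 2**25 – roughly a 33-million-fold increase in transistor density.
def density_growth(years, doubling_period_years=2):
    return 2 ** (years / doubling_period_years)

print(f"{density_growth(2015 - 1965):,.0f}x")  # about 33,554,432x
```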

In the 1970s and early ’80s another herd of manufacturers rode Moore’s Law into a personal computer frenzy. Who can forget the Commodore 64, the Osborne 1, the TI-99/4A, Radio Shack’s TRS-80 (known as the Trash-80), and many more?

The very first personal computers came in kits. The Altair 8800 appeared as early as 1975; Apple’s first product was a kit for the Apple 1. And there was the Heathkit H-89. Now there was a machine.
Retrieved from the bottom shelf of the storm cellar, my Heathkit H-89 computer, built in 1979, shown here with the original manuals. Nostalgia is almost painful.


A clever, or devious, mail-order school in Los Angeles set up a four-part microcomputing correspondence course which qualified for the GI Bill. Many active-duty people took the course, and its fourth part brought the kit for the H-89 desktop computer. So late in 1979 I had my first desktop computer.

About the same time, the Big Guys jumped in. The Apple II and the first IBM PC were released – similar to the competition but with corporate power behind them.

Another example of the computer’s attractiveness to the popular culture was the Apple ad to introduce the Macintosh computer during the 1984 Super Bowl. It is listed among the best-ever commercials though the company followed it up with one of the worst ever at the next Super Bowl.

If Ada Lovelace was the most famous woman in the earliest period of technology, then the most famous modern day woman in the field was building her reputation about this time.

Grace Hopper was one of the first programmers of the Mark I computer at Harvard University. She created the first compiler for a computer programming language and was involved in the development of the COBOL programming language. She also popularized the term “debugging” for fixing computer problems after a moth was once removed from a relay of one of her machines.

Grace enlisted in the Navy in 1944 at the age of 37 and served for 43 years, attaining the rank of Rear Admiral. Early in her career she led a small programming team, where she developed a management philosophy based on the advice that “It is much better to apologize than it is to get permission.”

She was a public relations treasure for the Navy - I heard her speak four times – mostly the same speech.

I recommend a ten-minute video of Admiral Hopper’s appearance on David Letterman’s show: https://www.youtube.com/watch?v=1-vcErOPofQ

Our deviation from "normal" Sutton history is worth it just to introduce Amazing Grace to any who do not know about her. Do yourself a great favor and research the story of Admiral Grace Hopper, TPE (Technology Pioneer Extraordinaire).

The early mainframe computers evolved into powerful behemoths and those first personal computers evolved into small but powerful behemoths. So what is the difference?

Most users today are using desktop and laptop computers with little or no appreciation of the nature of the mainframes. I’ll illustrate with a system I worked with at a large grocery and drug store business in the ’90s.

We had a mainframe system in Dublin, California – it filled a room of 10,000 to 12,000 square feet, with a staff of dozens of operators, about 100 programmers, and a hundred or more other support folks, including me in a data security/disaster recovery group of six.

The company had more than 2,600 stores from California to New England, 43 warehouses, eight or ten major office complexes and more than 250,000 employees. Many people, probably more than ten thousand, had either computers on their desktops or terminals with no processing capability of their own. In either case, all were connected to the mainframe, where nearly all processing was conducted and all company data was stored. The mainframe handled all that work.

There were many other devices connected to the system. Warehouse forklift operators had a “terminal” on the fender of the lift, connecting them to a mainframe program that directed what merchandise was to be moved where. That’s dozens of forklifts in each of 43 warehouses, many moving 24 hours a day.

Your desktop computer can’t do that.

I hear another question out there: “Where was the internet?”

The internet was opened to general public use in the early 1990s after decades of development by major universities and the Department of Defense. And no, Al Gore did not invent the internet. But we have to honestly say that he likely had more to do with its development than most geeks working on it.

Senator Gore introduced the Supercomputer Network Study Act of 1986, which directed a flurry of activity and funded many of the efforts to develop the network. Gore’s interest in a network began when he was a House representative in the late 1970s, when he began to nag his colleagues about it – for a long time a lone voice on the topic.

I began to use the internet well before there was a world wide web. There were a few bulletin boards across the country; I subscribed to one in Cambridge, MA and one called The Well in Northern California. I was living in Omaha. It was a long-distance call (non-trivial costs then) for a dial-up connection at 300 baud. The meter was running.

The procedure was to sign on, download any of your messages (I don’t think we called it email) or search for documents you wanted to read, download them, sign off as soon as possible and read with the phone disconnected. We’d compose all our messages and line up any documents we wanted to share, dial up again, upload those messages and get off. And it was great. We were riding the advanced wave of the future and we knew it.
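
Some quick arithmetic shows why we hurried. A 300 baud modem moved roughly 30 characters per second (ten bits per character, counting start and stop bits), so even a modest document tied up the long-distance line for a good while. A back-of-the-envelope sketch in Python, with the page size an assumed figure:

```python
# Rough dial-up math: 300 baud is about 30 characters per second
# (8 data bits plus start and stop bits per character).
def transfer_minutes(num_chars, chars_per_second=30):
    return num_chars / chars_per_second / 60

# A 20-page document at an assumed ~2,000 characters per page:
print(f"{transfer_minutes(20 * 2000):.0f} minutes")  # about 22 minutes
```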

Several enterprises set up access to the internet by simply providing local phone numbers, cutting that connection cost. AOL had a number in Omaha; CompuServe and Prodigy were not far behind.

Computer and networking technology has progressed rapidly for more than 40 years. It is not a really new thing; it is mature. But I’ve shown here that the beginnings were way, way before that – almost 200 years ago.

And finally, I am irked when I hear someone say they don’t use a computer, saying or implying that it is something for younger folk. I left the west coast ten years ago, where older people had been naturally living in a high-tech world for some time. People in retirement homes were not only active email users; many had built their own web sites and were creating online content that was very good. The early blogging support sites were appearing and older people were jumping into that world too.

The inclination and willingness to participate in new technology is not an age-related thing. It is much more a geography thing.
