A History of Computing

A Short History of Computing:

About 2,500 years ago, the Indian scholar Panini described how our speech has structure. Today we call it grammar. Many of us were taught grammar in school by diagramming sentences, like the diagram below. For a long time after that, not much further thought was given to the symbols we use: words, phrases, sentences, numbers, and so on.

But we did need to manipulate numbers. At first we had no symbols for numbers at all, so we made tally marks on sticks or clay tablets; later we used letters of the alphabet as numerals. Even systems like Roman numerals were not much better than tally marks: 2018 is MMXVIII.

Finally, most people adopted the system from the Hindus that we use today, with symbols for 1 through 9 (and later 0) arranged in place-value columns, as in 123. The Arabs spread it across North Africa to Europe around the year 1200, when the famous mathematician Fibonacci championed replacing Roman numerals with Arabic numerals. [Can you imagine doing your checkbook using Roman numerals? Thank Fibi.]
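Place-value notation is what makes written arithmetic easy; with Roman numerals you need a lookup-and-subtract procedure just to write a number down. A minimal sketch in Python of that procedure (the function name `to_roman` is mine, for illustration):

```python
def to_roman(n):
    """Convert a positive integer to Roman numerals by greedy subtraction."""
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = ""
    for value, symbol in values:
        while n >= value:   # repeatedly take the largest symbol that fits
            out += symbol
            n -= value
    return out

print(to_roman(2018))  # MMXVIII
```

Notice that even this simple conversion needs a table and a loop, while place-value notation lets us add and multiply column by column.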

Our use of numbers increased, especially as we attempted to navigate and sail around the globe. But most people were illiterate, and even fewer could do arithmetic beyond counting chickens and eggs. People had trouble just adding larger numbers, so in the 1600s Blaise Pascal made a simple calculator out of handmade gears. These machines were improved, but still no multiplication and certainly no division!

Although most people could not go beyond simple numbers, a few professional mathematicians made calculations in advance and printed the results in books, which were then carried on ships for navigation. These people were called computers! Since some arithmetic was still required, and multiplication in particular produced many mistakes, errors were reduced by using logarithms, introduced by John Napier around 1614 and also printed in tables carried on board. BTW, Napier also popularized the decimal point we now use in numbers.
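Napier's trick was that logarithms turn multiplication into addition: log(a·b) = log(a) + log(b). A navigator looked up two logarithms in a printed table, added them, then looked up the antilog to get the product. A small sketch of the same idea in Python, standing in for the printed tables:

```python
import math

# Multiply 123 by 456 the way a navigator would with log tables:
# look up the two logarithms, add them, then take the antilog.
a, b = 123.0, 456.0
log_sum = math.log10(a) + math.log10(b)   # addition replaces multiplication
product = 10 ** log_sum                   # the antilog recovers the product
print(round(product))  # 56088, i.e. 123 * 456
```

Addition is far less error-prone by hand than long multiplication, which is exactly why log tables stayed aboard ships (and on slide rules) for three centuries.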

Calculators improved on Pascal’s first machine, but multiplication and division remained beyond most people. A far more ambitious machine was attempted by Charles Babbage in the 1800s. Ada Lovelace, Lord Byron’s daughter and a mathematician, wrote programs for Babbage’s machine and is considered the first computer programmer. She was also the first to realize that such a machine could manipulate any symbols, not merely numbers! Unfortunately, Babbage’s machine was never finished after the Crown cut off his funds. Ada Lovelace died in her 30s. Here is Ada:

The attempt to go beyond Pascal’s calculator continued in the 1900s. Instead of gears, relays were used for more speed [and noise!]. But soon relays were replaced by vacuum tubes [valves], and numbers could be multiplied in a few seconds - after hours of setup! Famous people were involved at this point: Alan Turing, John von Neumann, Claude Shannon, Alonzo Church, and Maurice Wilkes, as well as early hardware builders like J. Presper Eckert and John Mauchly. Here is an early circuit and Alan Turing.

By the 1930s and 1940s we begin to have what we would recognize as computers today. Crude but functional. During WWII these machines became giant, weighing tons and occupying entire buildings. Later, smaller ones were built and sold - usually to governments and large corporations. Several manufacturers produced mainframe computers from the late 1950s through the 1970s; the group was often referred to as "Snow White and the Seven Dwarfs." In that context, Snow White was IBM and the seven dwarfs were Burroughs, UNIVAC, NCR, Control Data Corporation, Honeywell, General Electric, and RCA.

I worked on the first supercomputer in the 1960s, the CDC [Control Data Corp] 6600. It was the first machine to execute over one million instructions per second (1 MIPS) - a milestone machine.

[The man in the suit above was my boss Charles Warlick, Director of the Computation Center of the University of Texas at Austin.]

Soon, however, electronics became smaller, with many circuit elements on a single chip: the integrated circuits of the 1970s. Shortly after, large-scale integration (LSI) made chips complex enough to hold a complete processor, enabling microcomputers. Here is the 6502 microprocessor used in the Apple, one of the first personal computers.

Microcomputers reduced costs so rapidly that most mainframe computers disappeared in the 1980s, and the PC - personal computer - appeared on many desktops. Miniaturization continued, and in 2007 the iPhone, the first widely popular smartphone, appeared. My supercomputer cost $8.5 million in 1966, with 30 of us to keep it running.

Today’s smartphone [2018] costs less than $1,000, is thousands of times faster than my first supercomputer, runs on batteries, and goes in your pocket - plus, it’s a phone! - and a network device and a GPS receiver!


© Gareth Harris 2018        -         Contact email: GarethHarris@mac.com         -         see also: SentimentalStargazer.com