A History of Computing


A Short History of Computing:




Language / Grammar: About 2,500 years ago, the Indian scholar Panini described how our speech had structure. Today we call it grammar. Many of us were taught grammar in school by diagramming sentences, like the diagram below. For a long time after that, not much further thought was given to symbols we use like words, phrases, sentences, numbers, etc.









Numbers: But we did need to manipulate numbers. At first we didn't even have symbols for numbers, so we used tally marks on sticks or clay tablets, and later letters of the alphabet. We had poor ways of writing numbers that were not much better than tally marks, like Roman numerals, where 2018 is MMXVIII.




Arabic / Hindu Numbers: Finally, most people adopted the system from the Hindus that we use today, with symbols for the digits 0 through 9 and place-value columns, so that 12345 means 1 ten-thousand, 2 thousands, 3 hundreds, 4 tens and 5 ones. These methods were brought from India to the west by the famous Al-Khwarizmi, a Persian scholar who lived in Baghdad about the year 800, from whose name and work we get the words algorithm and algebra. These numbers were brought to Europe about the year 1200 by the famous mathematician Fibonacci, who helped replace Roman numerals with Arabic/Hindu numbers. [Can you imagine doing your checkbook using Roman numerals? Thank Fibi.]
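
To see why place value was such a big deal, here is a small illustrative Python sketch (the helper names to_roman and place_values are invented just for this example). It writes 2018 both ways:

    # Illustrative sketch only: Roman numerals next to place-value columns.

    def to_roman(n):
        """Write a positive integer in Roman numerals."""
        table = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
                 (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
                 (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
        out = []
        for value, symbol in table:
            while n >= value:
                out.append(symbol)
                n -= value
        return "".join(out)

    def place_values(n):
        """Split a number into its columns: 2018 -> [2000, 0, 10, 8]."""
        digits = [int(d) for d in str(n)]
        return [d * 10 ** (len(digits) - i - 1) for i, d in enumerate(digits)]

    print(to_roman(2018))      # MMXVIII
    print(place_values(2018))  # [2000, 0, 10, 8]

Carrying digits column by column is mechanical enough that gears, and later circuits, could do it; there is no such simple recipe for Roman numerals.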



Calculators: Our use of numbers increased, especially when we attempted to navigate and sail around the globe. But most people were illiterate, and even fewer could do arithmetic beyond counting chickens and eggs. People even had trouble just adding bigger numbers, so Blaise Pascal, in the 1600s, made a simple calculator out of handmade gears. These were improved, but still no multiplication and certainly no division!








Bowditch - the famous book found on all ships: Although most people could not do anything beyond simple numbers, there were a few professional mathematicians who made calculations in advance and printed the results in books, which were then carried on ships for navigation. These people were called computers! Since some arithmetic was still required on board, and multiplication by hand produced many mistakes, errors were reduced by using logarithms, introduced by John Napier about 1600 and also printed in tables carried on board. BTW, Napier also popularized the decimal point we now use in numbers.
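
The trick behind those logarithm tables: since log(a × b) = log(a) + log(b), a long multiplication becomes two table lookups and one addition, which is much harder to get wrong. Here is a minimal Python sketch of the idea (illustrative only - a navigator would look the values up in a printed table instead of computing them):

    # Minimal sketch of the logarithm trick: adding logs multiplies numbers.
    import math

    a, b = 3456.0, 789.0

    log_a = math.log10(a)          # table lookup #1
    log_b = math.log10(b)          # table lookup #2

    product_via_logs = 10 ** (log_a + log_b)   # one addition, one reverse lookup

    print(a * b)             # 2726784.0
    print(product_via_logs)  # about 2726784, up to tiny rounding error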







Better Calculators: Calculators were getting better than Pascal’s first machine, but multiplication and division were still beyond most people. A far more advanced machine was attempted by Charles Babbage in the 1800s. Lady Ada Augusta Lovelace, Byron’s daughter and a mathematician, wrote programs for Babbage’s machine and is considered the first computer programmer. She was also one of the first to realize that such a machine could manipulate any symbols, not merely numbers! Unfortunately Babbage’s machine was never finished, as the Crown cut off his funds. Ada Lovelace died in her 30s. Here is Ada:








Computers!: The attempt to go beyond Pascal’s calculator continued in the 1900s. Instead of gears, relays were used for more speed [and noise!]. But soon, relays were replaced by vacuum tubes [valves], and numbers could be multiplied in a few seconds - after hours of setup!! Famous people were involved at this point: Alan Turing, John von Neumann, Claude Shannon, Alonzo Church and Maurice Wilkes, as well as early hardware builders like J. Presper Eckert and John Mauchly. Here is an early circuit and Alan Turing.



Big Computers: By the 1930s and 1940s we began to have what we would recognize as computers today. Crude but functional. During WWII, these machines became giant, weighed tons and occupied entire buildings. Later, smaller ones were built and sold commercially - usually to governments and large corporations. Several manufacturers produced mainframe computers from the late 1950s through the 1970s. That group of manufacturers was often referred to as "Snow White and the Seven Dwarfs": Snow White was IBM, and the seven dwarfs were Burroughs, UNIVAC, NCR, Control Data Corporation, Honeywell, General Electric and RCA.



Supercomputers: I worked on the first supercomputer in the 1960s, the CDC [Control Data Corp] 6600. It was the first machine to do over 1 million mathematical instructions per second [MIPS] - a milestone machine.

[The man in the suit above was my boss Charles Warlick, Director of the Computation Center of the University of Texas at Austin.]


Microcomputers: Soon, however, electronics became smaller, with many circuit elements on a single chip - integrated circuits - in the 1970s. Shortly after, large scale integrated circuits [LSI] became complex enough to hold a complete computer, called a microcomputer. Here is a magnification of the 6502 microprocessor used in the first PC - the Apple [wikipedia link]. The first 6502 chips were 168 x 183 mils (4.3 x 4.7 mm), had 4237 transistors, could perform 56 instructions and cost US $25 in 1975.


Personal Computers - PCs: Microcomputers reduced cost so rapidly that most mainframe computers disappeared in the 1980s and the PC - the personal computer - appeared on many desktops. Miniaturization continued, and by 2007 the iPhone - the first modern smartphone - appeared. My supercomputer cost $8.5 million in 1966, with 30 of us to keep it running.







iPhones: Today’s smartphone [2018] costs less than $1,000, is thousands of times faster than my first supercomputer, runs on batteries and goes in your pocket - plus, it’s a phone! - and a network device and a GPS receiver!!!




The iPhone appeared in 2007. Today is 2018. It changed the entire world - in about a decade!!!


______________________________________________

Raspberry Pi

PS 2018-06-07: Here is a complete UNIX [Linux] computer today,
about the size of a deck of cards and costing $35:
the Raspberry Pi 3B+. See:
raspberrypi.org

[Photo: Raspberry Pi 3B+, side view]



[Photo: Raspberry Pi 3B+]


© Gareth Harris 2018        -         Contact email: GarethHarris@mac.com         -         see also: SentimentalStargazer.com