ThinkingTalkingComputing

WIP work in progress 2018-03-16

Research about Thinking, Talking, Computing


Here is an excellent intro from The Atlantic [opens in a new window]

How Aristotle Created the Computer

______________________________________________

And now, my short intro:

A Short History of Computing:




About 2,500 years ago, the Indian scholar Panini described how our speech has structure. Today we call it grammar. Many of us were taught grammar in school by diagramming sentences, like the diagram below. For a long time after that, not much further thought was given to the symbols we use - words, phrases, sentences, numbers, etc.









But we did need to manipulate numbers. At first we didn't even have symbols for numbers, so we made tally marks on sticks or clay tablets, and later used letters of the alphabet for numbers. Our ways of writing numbers were not much better than tally marks - like Roman numerals, where 2018 is MMXVIII.




Finally, most people adopted a system from the Hindus that we use today, with symbols for the digits 0 through 9 arranged in columns, so that a numeral like 123 means one hundred, two tens and three ones. The Arabs spread this across North Africa to Europe by about the year 1200, as seen when the famous mathematician Fibonacci championed replacing Roman numerals with Arabic numbers. [Can you imagine doing your checkbook using Roman numerals? Thank Fibi.]
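To see why the new notation mattered, here is a tiny sketch of my own in Python [a modern convenience, obviously nothing Fibonacci had]: it converts a Roman numeral like MMXVIII back into a number, and shows what the columns of 2018 mean.

# Illustrative sketch: Roman numerals vs. place-value notation.
ROMAN = {"M": 1000, "D": 500, "C": 100, "L": 50, "X": 10, "V": 5, "I": 1}

def roman_to_int(numeral):
    """Convert a Roman numeral like 'MMXVIII' to an integer."""
    total = 0
    for i, ch in enumerate(numeral):
        value = ROMAN[ch]
        # A smaller value written before a larger one (e.g. IV) is subtracted.
        if i + 1 < len(numeral) and ROMAN[numeral[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

def place_values(digits):
    """Show what '2018' means column by column: 2*1000 + 0*100 + 1*10 + 8*1."""
    return [int(d) * 10 ** p for p, d in enumerate(reversed(digits))][::-1]

print(roman_to_int("MMXVIII"))   # 2018
print(place_values("2018"))      # [2000, 0, 10, 8]

The Roman form gives you nothing to line up; the Hindu-Arabic form is already arranged in columns, ready for carrying.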



Our use of numbers increased, especially as we attempted to navigate and sail around the globe. But most people were illiterate, and fewer still could do arithmetic beyond counting chickens and eggs. People even had trouble just adding bigger numbers, so Blaise Pascal, in the 1600s, made a simple calculator out of handmade gears. These machines were improved, but still no multiplication and certainly no division!








Although most people could not do anything beyond simple numbers, there were a few professional mathematicians who made calculations in advance and printed the results in books, which were then carried on ships. These people were called computers! Some arithmetic was still required, and multiplication by hand produced many mistakes, so errors were reduced by using logarithms, introduced by John Napier about 1600, which turn multiplication into addition - also printed in tables and carried on board. BTW, Napier also introduced the decimal point we now use in numbers.
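The trick those log tables exploited: add the logarithms of two numbers, then look the sum back up, and you have their product. A small sketch of my own in Python [the navigators used printed tables, not Python, of course]:

import math

# Multiplication becomes addition of logarithms, then one lookup back.
a, b = 384.0, 27.5

log_sum = math.log10(a) + math.log10(b)   # add the logs (easy by hand)
product = 10 ** log_sum                   # look the answer back up (the antilog)

print(product)   # about 10560.0
print(a * b)     # 10560.0, for comparison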







Calculators were getting better than Pascal's first machine, but multiplication and division were still beyond most people. A far more ambitious machine was attempted by Charles Babbage in the 1800s. Lady Ada Augusta Lovelace, Byron's daughter and a mathematician, wrote programs for Babbage's machine and is considered the first computer programmer. She was also the first to realize that such a machine could manipulate any symbols, not merely numbers!! Unfortunately Babbage's machine was never finished, as the Crown cut off his funds. Ada Lovelace died in her 30s. Here is Ada:








The attempt to go beyond Pascal's calculator continued in the 1900s. Instead of gears, relays were used for more speed [and noise!]. But soon, relays were replaced by vacuum tubes [valves] and numbers could be multiplied in a few seconds - after hours of setup!! Famous people were involved at this point: Alan Turing, John von Neumann, Claude Shannon, Alonzo Church and Maurice Wilkes, as well as early hardware builders like J. Presper Eckert and John Mauchly. Here is an early circuit and Alan Turing.



By the 1930s and 1940s we begin to have what we would recognize as computers today. Crude but functional. During WWII, these machines became giant, weighed tons and occupied entire buildings. Later some smaller ones were built and sold to the public - usually governments and large corporations. Several manufacturers produced mainframe computers from the late 1950s through the 1970s. This group of manufacturers was often referred to as "Snow White and the Seven Dwarfs": Snow White was IBM, and the seven dwarfs were Burroughs, UNIVAC, NCR, Control Data Corporation, Honeywell, General Electric and RCA.



I worked on the first supercomputer in the 1960s, the CDC [Control Data Corp] 6600. It was the first machine to execute over 1 million instructions per second (MIPS) - a milestone machine.

[The man in the suit above was my boss Charles Warlick, Director of the Computation Center of the University of Texas at Austin.]


Soon, however, electronics became smaller, with many circuit elements on a single chip - integrated circuits. By the 1970s, large-scale integrated circuits became complex enough to hold a complete computer, called a microcomputer. Here is the 6502 microprocessor used in one of the first personal computers - the Apple.


Microcomputers reduced cost so rapidly that most mainframe computers disappeared in the 1980s, and the PC - personal computer - appeared on many desktops. Miniaturization continued, and by 2007 the iPhone appeared and the modern smartphone era began. My supercomputer cost $8.5 million in 1966, with 30 of us to keep it running.







Today’s smartphone [2018] costs less than $1,000, is thousands of times faster than my first supercomputer, runs on batteries and goes in your pocket - plus, it’s a phone! - and a network device and a GPS receiver!!!




______________________________________________

NEXT:

Clues:
Closing the loop from Thinking to Talking to Computing

First, about clues: 

An example clue: In the 1930s Enrico Fermi fled fascist Italy. A few years earlier, in his Rome laboratory, an experiment had exhibited strange behavior when moved from a marble table to a wooden one. It was a clue to slow neutrons and the secret to nuclear chain reactions. It took someone as brilliant as Fermi to understand what the clue meant.


Consider:

Our universe is full of processes - procedures: 

Some processes come from nature:
stars make atoms, atoms make molecules, cells and us - you and me.

Other processes come from us: 
children make sentences, mothers bake cookies, factories make cars. 

All of us can do some procedures: 
most children learn to make sentences, some do arithmetic, then calculus.

Other procedures require more work and experience.
Starting about age 8 in my father’s machine shop after WWII, I swept floors,
then later built machines by learning from the experience of others.

From there I went to physics and then computing. 


In summary:
Nature assembles matter into
concrete machines that we can touch:

        atoms, molecules, compounds - cars, trucks, planes - trees, whales, us

Life also makes abstract machines that we cannot see or touch:
We find those mainly in our heads and in our languages and computers, although we sometimes observe them in the group behavior of plants, animals and humans.

For example, inside our heads are thoughts: abstract things often appearing as symbols, images, tokens or memes - making words, phrases, sentences, etc. Our thoughts often interact with each other and even, communicating through language, with the thoughts of others.

Sentences are thought structures - structures that we know how to build and recognize. How we learn this as children has been discussed for thousands of years. Current thinking, led by Noam Chomsky, is that the language mechanism is not learned but innate in humans, developing in early childhood into particular languages, growing just like arms and legs. Other linguistic researchers, like Dan Everett, argue instead that language was acquired gradually, beginning perhaps 1,500,000 years ago with Homo erectus.
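To make "sentences are structures we know how to build" concrete, here is a toy grammar of my own in Python - purely an illustration, not a model from Chomsky or Everett. Each rule says how a structure is assembled from smaller structures, just like the school diagrams:

import random

# A toy grammar - invented for illustration only.
GRAMMAR = {
    "S":  [["NP", "VP"]],                 # a sentence is a noun phrase + a verb phrase
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["mother"], ["child"], ["cookie"]],
    "V":  [["bakes"], ["eats"]],
}

def expand(symbol):
    """Recursively expand a symbol into words using the grammar rules."""
    if symbol not in GRAMMAR:
        return [symbol]                   # already a word
    rule = random.choice(GRAMMAR[symbol])
    return [word for part in rule for word in expand(part)]

print(" ".join(expand("S")))   # e.g. "the mother bakes the cookie"

Run it a few times and it builds different sentences from the same handful of rules - the structure is what stays constant.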

We now build computers: machines that manipulate symbols, as Ada pointed out - not merely numbers, but also letters, words and sentences, and now even sounds and images.

These symbol manipulators, our heads and our computers, contain clues - clues to thought.


______________________________________________

An aside:

How Computers Work - what’s inside:

Today [2018] computers are boxes that do procedures, not very different from the people called computers in the past.

Procedures can be implemented at different levels. We can use a simple calculator, which gives us a number that we then use outside the calculator in some other process. Or we can envision an entire factory as a computer, calculating, measuring and fabricating until a product is finished.

Let’s start with a simple calculator. By last century, mechanical calculators were desktop machines, not much different from Pascal’s calculator from the 1600s. The SCM calculator I used to keep books for my father’s machine shop performed addition, subtraction, multiplication and division on integer numbers. Sometimes it jumped up and down as it moved its carriage back and forth to perform carries from one column to the next.


The first pocket digital calculators came from HP, TI and Sinclair in the 1970s. They had only a few registers to hold numbers: usually a main accumulator, which was displayed, and sometimes one or more extra memory registers. All operations were performed by pressing keys on the front panel, usually below the display.
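Here is a rough sketch of my own in Python of how such a calculator is organized [the keys and names are my invention, just for illustration]: one visible accumulator, one memory register, and keys that operate on whatever the accumulator holds.

# Illustrative sketch of an accumulator-style pocket calculator.
class PocketCalculator:
    def __init__(self):
        self.accumulator = 0.0   # the number shown on the display
        self.memory = 0.0        # the extra "M" register

    def key(self, op, value=None):
        if op == "enter":
            self.accumulator = value
        elif op == "+":
            self.accumulator += value
        elif op == "*":
            self.accumulator *= value
        elif op == "M+":          # add the display into memory
            self.memory += self.accumulator
        elif op == "MR":          # recall memory to the display
            self.accumulator = self.memory
        return self.accumulator   # the display always shows the accumulator

calc = PocketCalculator()
calc.key("enter", 12)
print(calc.key("*", 3))   # 36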


The first computers were very similar - only a few registers and operations. The difference was that the operations were read out of memory instead of from panel buttons - memory which was now large enough to hold several numbers and several instructions as well. Most computers today are of this type, named after John von Neumann. Here is an early computer program on a plug board:
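And here is a tiny sketch of my own, in Python, of the von Neumann idea [not any real machine's instruction set]: numbers and instructions share the same memory, and a loop fetches and executes one instruction at a time.

# Instructions and data live in the same memory.
memory = [
    ("LOAD", 7),        # put 7 into the accumulator
    ("ADD", 5),         # add 5
    ("STORE", 9),       # store the result at memory address 9
    ("HALT", None),
    0, 0, 0, 0, 0, 0,   # data area (addresses 4..9)
]

accumulator = 0
pc = 0                          # program counter: address of the next instruction
while True:
    op, arg = memory[pc]        # fetch
    pc += 1
    if op == "LOAD":            # decode and execute
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "STORE":
        memory[arg] = accumulator
    elif op == "HALT":
        break

print(memory[9])   # 12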



By the 1960s, a small minicomputer had 4096 words of memory, 12 bits wide. Soon they had 16- or 32-bit words, 64K or more of memory, and could do a hundred different operations. Here is the control panel of a DEC PDP11:


As technology changed, the resources in computers increased greatly and larger programs could be executed. People gathered and shared collections of useful programs - some read cards or tape, some printed output, and some were shared math routines. A complete computer “run” consisted of loading all the needed parts together. This was so clumsy that people began to retain their setup from one run to the next. The part that was retained inside the computer came to be called the “operating system.” Operating systems became the OS/360, MVS, UNIX and Windows of today.
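As a rough illustration of my own [the routine names are invented, just to show the shape of it], here is what a complete “run” amounted to: the shared input routine, a shared math routine and the output routine all loaded together with the job itself. The part you kept loaded between runs is the seed of an operating system.

# Illustrative sketch of the pieces loaded together for one "run".
def read_cards(deck):            # shared input routine
    return [int(card) for card in deck]

def square_root(x):              # a shared math routine from the library
    return x ** 0.5

def print_results(values):       # shared output routine
    for v in values:
        print(f"{v:10.4f}")

def run_job(deck):
    """The retained part: gather the routines, run the job, print the results."""
    data = read_cards(deck)
    results = [square_root(x) for x in data]
    print_results(results)

run_job(["2", "16", "144"])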


Operating systems also became more like the way we think.

Now we are getting to the good stuff!!!


______________________________________________

NOW, about how we think and compute.  

When I think, I use something I call my attention: …

I seem to be able to direct my attention to only one thing at a time and have to switch it back and forth between things of interest. Attention seems closely coupled to what we call consciousness. Meanwhile, I am somewhat aware of other things going on around me. If I hear a loud noise, for example, my attention is immediately directed to find the source of the sound. So although I can only attend to one thing at a time, my mind is obviously doing many things at once. We call this multiprocessing.
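Here is a rough sketch of my own in Python of that analogy [an illustration only, certainly not a model of the brain]: one "attention" loop handles a single task at a time, while a background thread keeps listening and can interrupt it, the way a loud noise grabs our attention.

import threading
import queue
import time

events = queue.Queue()

def background_listener(name, delay):
    """Runs alongside attention, like hearing while we read."""
    time.sleep(delay)
    events.put(f"{name} heard something!")

# Start the "ears" listening in the background.
threading.Thread(target=background_listener, args=("ears", 0.5), daemon=True).start()

tasks = ["read a page", "write a sentence", "add two numbers"]
for task in tasks:                      # attention: one task at a time
    print("attending to:", task)
    time.sleep(0.3)
    try:
        interruption = events.get_nowait()
        print("interrupted -", interruption)   # attention switches to the noise
    except queue.Empty:
        pass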

Our simple calculators and early computers had only one processor in them. … … … 


   … more coming … 2018-03-16 12:02:10

______________________________________________

______________________________________________



© Gareth Harris 2018        -         Contact email: GarethHarris@mac.com         -         see also: SentimentalStargazer.com