Tuesday, October 26, 2010

Computers I have Known

I wrote my first computer program in 1966.  But that's not the start of computers.  And I'm a pedantic kind of guy, so let's begin at the beginning.  The word "computer" has been around for a long time, but for most of that time a "computer" was a person who performed computations.  I'm going to stick with the machine type of computer.  The first computer was a machine called ENIAC.  It was begun during World War II and finished after the war ended.  It was built by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering at the University of Pennsylvania.  It did not really resemble a modern computer.  It had no RAM and you couldn't program it in the modern sense.  But it was the foundation of all modern computers.  Others quickly took the ENIAC design and added various memory systems that eventually evolved into RAM.  They also added programming ability and all the accouterments that we associate with the modern computer.  A nice book on ENIAC is "ENIAC" by Scott McCartney.

ENIAC was a custom "one off" machine.  It was followed by several other custom "one off" machines.  But by the mid-fifties recognizably modern computers existed.  They were programmable, had memory systems, and had peripherals like printers, keyboards, and display units.  There were also production runs of a number of substantially identical machines.  The first production computer was the UNIVAC I.  All these early computers were built with vacuum tubes, commonly between 10,000 and 20,000 per machine.  Anyone who has dealt with large numbers of incandescent light bulbs knows that they fail regularly.  It was virtually impossible to keep these machines running for an extended period of time because a tube here and a tube there would fail.  They also put out terrific amounts of heat, which meant you needed a lot of electric power to run them and then a lot of air conditioning to keep them from burning up.

In the late fifties there was a big concern that the Ruskies were going to send a bunch of bombers over the North Pole equipped with nuclear weapons.  As a response to this, the U.S. Government built something called the DEW (Distant Early Warning) line and decided to equip it with computers.  Also, by this time electronic circuits made with transistors (the discrete semiconductor devices that preceded the integrated circuit) had just become available.  Transistors were much smaller and used much less power than vacuum tubes, so the Government specified that the computers be transistorized.  IBM won the bid.  But they did not have a transistorized computer available immediately.  So what they told the government (and eventually did) was that they would deliver tube computers but would replace them with compatible transistor computers within two years.  The tube computers they delivered were models 704 and 709.  The transistorized versions were models 7040 and 7090.  And that's where I come in.

The first computer I ever wrote code for was an IBM 7094, a souped up version of the 7090.  The 7094 had 32,768 "words" of 36 bit memory.  A character of text was represented in 6 bits, so you could put 6 characters in a word.  This amount of memory was equivalent to about 150K (yes!  thousands of bytes) of RAM.  6 bits allowed for only 64 different characters.  So all that was available were the digits (10 characters), the upper case letters (another 26, for a total so far of 36 characters), and an assortment of punctuation.  This machine cost millions of dollars and had far less computing power than a modern dumb cell phone.
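Those figures are easy to check from the published 7094 specs (32,768 words of 36 bits, 6-bit characters); the arithmetic works out like this:

```python
# Memory geometry of the IBM 7094, from its standard published specs.
words = 32_768            # 2**15 addressable words
bits_per_word = 36
bits_per_char = 6

total_bytes = words * bits_per_word // 8   # measured in modern 8-bit bytes
chars_per_word = bits_per_word // bits_per_char
distinct_chars = 2 ** bits_per_char        # size of the 6-bit character set

print(total_bytes)       # 147456 bytes, i.e. 144K -- "about 150K"
print(chars_per_word)    # 6 characters packed per word
print(distinct_chars)    # 64 possible symbols
```

The 64-symbol budget is exactly why the character set stopped at digits, upper case letters, and punctuation.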

The second computer I wrote code for was a Burroughs B5500.  While IBM is still in the computer business, Burroughs is not.  In the 1960s the computer industry was characterized as "IBM and the seven dwarfs."  Burroughs was one of the seven dwarfs.  Univac (see above), another dwarf, merged with the Sperry Corporation (a defense contractor) to become Sperry Univac.  That company in turn merged with Burroughs to become Unisys.  Unisys still exists, but primarily as a defense contractor.  The B5500 was a very innovative computer.  It too had 32,768 (2 raised to the 15th power) words of memory.  But, if I remember correctly, each word was 48 bits long, so it had more memory than the 7094.

My third computer was a CDC (Control Data Corporation) 6400.  CDC was another of the seven dwarfs.  Various parts of what was CDC still exist as parts of other companies, but CDC stopped making computers in about 1990.  The CDC 6400 was a slowed down version of the CDC 6600, which was designed by Seymour Cray.  The 6600 was extremely fast for its time and relatively cheap.  Cray was a genius and eventually became known as "Mr. Supercomputer".  The 6400 was the only Cray design I ever worked with.  But he went on to design the CDC 7600 and other 7000 series computers.  Shortly thereafter he went his own way, founding Cray Research.  He designed and built the Cray 1, far ahead of its time, and various follow-on machines.  A good book on Cray and supercomputers is "The Supermen" by Charles J. Murray.  His later venture, Cray Computer Corporation, went bankrupt, and the Cray name was eventually purchased by another supercomputer maker (Tera Computer Company), which now does business as Cray.

The 6400 had an even bigger word size, 60 bits, but still used 6 bits to represent a character.  Cray was a genius at designing fast floating point units.  His designs were generally floating point units with the minimum amount of additional hardware necessary to feed data to them.  The 6000 series (the 6400, the 6600, and a "dual processor" 6500) instruction set was extremely simple and orthogonal (a mathematical term).  This made it very easy to learn how to program these machines in machine or "assembly" language.  It was my contention for many years that if you knew the assembly language for any computer, I could teach you everything there was to know about 6000 series assembly language in less than a single 8-hour day.

The 6000 series and later Cray designs were peculiar in a number of ways in their representation of numbers.  A floating point number occupied a 60 bit word.  Like standard "scientific" notation, there is a "mantissa" part that tells you the value of the nonzero digits and an "exponent" that tells you where the decimal point goes.  The 6000 series computers, like other computers, were binary rather than decimal, but the idea is the same.  What made the 6000 series computers peculiar for their time was that they had a representation for infinity.  They also had a representation for unknown (literally, the value of this number is unknown; it has no particular value) and two representations of zero (a plus zero and a minus zero).  Using either the plus zero or the minus zero in a computation would give the same result.  If the result of a computation was zero, the hardware would always generate a plus zero.  But you could also perform computations on infinity and unknown.  For instance, infinity plus infinity would yield infinity.  Infinity minus infinity would yield unknown.  Unknown combined with almost anything would yield unknown.  You could also specify whether you wanted to use affine or projective infinity (you don't want to know).
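These ideas did not die with the 6000 series: modern IEEE 754 floating point, which Python floats use, ended up with a very similar set of special values, with NaN ("not a number") playing roughly the role of the 6000 series' "unknown".  The rules described above can be demonstrated directly:

```python
import math

inf = float("inf")

# Infinity plus infinity yields infinity.
assert inf + inf == inf

# Infinity minus infinity yields "unknown" -- NaN in IEEE 754 terms.
assert math.isnan(inf - inf)

# Unknown combined with almost anything yields unknown.
assert math.isnan(float("nan") + 1.0)

# Two zeros, plus and minus, that compare equal and give the same results...
assert 0.0 == -0.0

# ...and a computation whose result is zero produces the plus zero.
assert math.copysign(1.0, 1.0 - 1.0) == 1.0
```

(IEEE 754 settled on affine infinity only, so that particular choice went away.)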

The fourth computer I wrote code for was the SDS (later Xerox - same computer, they just changed the name) Sigma-5.  This was the first computer I used that had 8 bit bytes and 32 bit words.  8 bits permit 256 different symbols to be represented.  This leaves room for upper case letters, lower case letters, digits, punctuation, and lots of other stuff.  Also, 8 is a power of 2 (2 raised to the third power), which is more convenient for hardware people than 6, which is not a power of two.  A 32 bit floating point number is rather small; you only get about 6 significant digits of accuracy.  This is fine for a lot of calculations but inadequate where more precision is required.  Let's say you are adding up numbers that run to millions of dollars.  A million is 7 digits, and if you need the cents you need nine digits.  So it was possible to do "double precision" floating point computations, giving you about 12 digits of accuracy spread across two 32 bit words.  Of course, the CDC 6400 could do double precision computations using 120 bits, giving you access to astounding accuracy.
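The dollars-and-cents problem is easy to reproduce with Python's struct module, which can round-trip a value through the 32-bit IEEE single precision format (a different encoding than the Sigma-5's, but with comparable precision of roughly 6-7 significant digits):

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a Python float through 32-bit IEEE single precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

amount = 1234567.89           # dollars and cents: 9 significant digits
stored = as_float32(amount)

print(stored)                 # 1234567.875 -- the cents are already wrong
assert stored != amount
assert abs(stored - amount) > 0.01   # off by more than a penny
```

Nine significant digits simply do not fit in a 24-bit significand, which is why double precision (or decimal arithmetic, as described below for the 360) was needed for money.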

The Sigma-5 was a good bridge to my next computer, an IBM 360 Model 40.  Previously all the computers I had worked on were "scientific" computers.  They were good at engineering and scientific calculations, hence the floating point (i.e. "scientific" number) stuff.  The 360/40 was supposed to be a "general purpose" computer.  It was supposed to be able to do the scientific stuff, but it also had good support for "business" computations.  Businesses needed to get long columns of figures to add up accurately.  The numbers themselves were known with complete accuracy.  (A checking account balance off by even a penny can make a customer very angry.)  But while "business" computers can handle numbers with complete accuracy, they can't handle small fractions like millionths or really large numbers (e.g. the number of miles in a light year).  And in engineering and science, numbers are usually known only to an approximate value (the plane is going at about 500 MPH plus or minus 10 MPH, or the sample weighs 536 plus or minus 2 grams).

This 360/40 belonged to a bank.  Since the bank wanted the computer to keep its own books and those of its customers, it needed to do accounting-type calculations on dollars and cents, so they didn't even purchase the floating point unit, which was optional on the 360/40.  The 360/40 had 8 bit bytes and 32 bit words like the Sigma-5.  But it also had "packed decimal" arithmetic that was ideal for doing accounting calculations.  And it had 128KB (thousands of bytes) of "ferrite core" memory (memory made of many tiny magnetic donuts with wires strung through them).  When the IBM 360 line came out, memory pricing was very simple: it was $1 per byte.  And, if you didn't know how much memory to get, IBM recommended buying an amount that matched the price of the computer itself.  So, if you bought a million dollar computer, you should buy a million bytes of memory for about a million additional dollars.
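Packed decimal stores two decimal digits per byte, with the low nibble of the last byte holding the sign (conventionally C for plus, D for minus), so every digit is exact - no binary rounding, which is exactly what accounting needs.  A minimal sketch of the encoding (the helper names here are mine, not IBM's):

```python
def to_packed(n: int) -> bytes:
    """Encode an integer in IBM-style packed decimal (BCD digits + sign nibble)."""
    sign = 0xD if n < 0 else 0xC
    nibbles = [int(d) for d in str(abs(n))] + [sign]
    if len(nibbles) % 2:                      # pad to a whole number of bytes
        nibbles.insert(0, 0)
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def from_packed(b: bytes) -> int:
    """Decode packed decimal back to an integer."""
    nibbles = [x for byte in b for x in (byte >> 4, byte & 0xF)]
    value = int("".join(str(d) for d in nibbles[:-1]))
    return -value if nibbles[-1] == 0xD else value

assert to_packed(1234).hex() == "01234c"      # 01 23 4C
assert to_packed(-75).hex() == "075d"         # 07 5D
assert from_packed(to_packed(-987654321)) == -987654321
```

The 360's decimal instructions (Add Decimal and friends) operated on byte strings in this layout directly in memory, so dollars-and-cents totals came out exact without any floating point hardware at all.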

While the 360/40 was my first "IBM 360" family computer, it was far from my last.  I worked on a 370/155, a 3032, a 3033, a 3083, a 3081, and possibly others.  All these machines had a common architecture, so that programs written for the older machines would run on the newer ones.  And these were all million dollar machines.  And they were physically big.  The 3033, for instance, consisted of several boxes.  One of them was 32 feet long.  They were all five feet high, and the whole set took up a 35' by 35' square of floor space.  And there were many cables snaking between the various boxes under the floor.  So you had to have a false floor, raised about 12" above the real floor, so there was room to snake the cables around.  The 3033 had 16 MB (finally we graduate from thousands to millions) of RAM, and by now computers used RAM.  It was rated at about 3.2 MIPS (million instructions per second).  This would put its speed at somewhere between an Intel 386 and a 486.  It's a computer, but it is not as powerful as a normal cell phone.  I spent a lot of years working on IBM "mainframe" computers.

My next computer was an IBM PC, almost the original model.  It had an 8088 processor and 128KB of RAM.  The 8088 had a 16 bit internal architecture, but the external bus was only 8 bits wide.  Over the years I have worked my way up through the Intel line (with a few AMDs mixed in).  I went from the 8088 to a 386 to a 486 to a Pentium to whatever they are called now.  I currently own a couple of PCs.  The fanciest one has a quad core Intel processor and supports 64 bit architecture.  I have gone through DOS 1.1, DOS 2.0, DOS 3.1, 3.2, and 3.3.  I skipped DOS 4 and did DOS 5.  I then did DOS 6.0, 6.1, 6.2, and 6.22.  I have done Windows 3.0, 3.1, Windows for Workgroups, Windows 95, OSR2, 98, 98 SE, and Millennium (I only beta tested Millennium, then I dumped it).  I have done NT (in both desktop and server flavors, depending on the release) 3.51, 4.0, Windows 2000, and Windows XP.  I am currently running Windows Server 2008 on one box.  I run XP on my main desktop and have test driven the desktop version of Windows 7.  I am planning to get a new desktop box with Windows 7 on it in a few months.  I have also run various versions of Red Hat Linux.

So that's most of the computers I have known.
