Thursday, August 16, 2018

A Compact History of Computers

For a long time I have thought of myself as a computer guy.  I took my first computer class in 1966 and immediately fell in love.  I then spent more than forty years closely involved with them, first at school and then at work.  I write blog posts about what interests me.  So you would think I would have written a lot about computers.  But I have actually written less than you might think.  And it turns out I have never directly addressed this subject.

Here's a complete list of my computer and computer-adjacent posts.  (Normally I would include a link for each, but since I have so many I am just going to list the publication date.  You can easily locate all of them by using the handy "Blog Archive" at the right side of each post because it is organized by date.)  So here's the list:  12/21/2010 - Net Neutrality; 2/19/2011 - Artificial Intelligence; 7/30/2013 - Home Networking; 1/24/2014 - Windows 8.1 - try 1; 9/16/2015 - Hard Disks; a 7 part series running from 9/24/2015 to 9/30/2015 on the Internet and networking; and 5/19/2018 - Computer Chips 101.  And I have purposely left one out.  It is my first post on the subject, and the one that is most closely aligned with today's subject.  On 10/26/2010 I posted "Computers I have Known".  So that's the list.  Now to the subject at hand.

Most accounts of the history of computers credit a machine called ENIAC as the first computer.  There used to be some controversy about this but it has mostly died down.  And I think it is the correct choice.  (I'll tell you why shortly.)  But before I spend time on ENIAC let me devote a very few words to prehistory.

Perhaps the first digital computational device was the abacus and it did influence computer design.  Then a fellow named Charles Babbage designed two very interesting devices, the Difference Engine (1822) and the Analytical Engine (1837).  He never came close to getting either to work but the Analytical Engine included many of the features we now associate with computers.  But, since he was a failure, he and his work quickly fell into obscurity and had no impact on the modern history of computers.  He was eventually rediscovered after computers had been around a while and people went rooting around to see what they could find on the subject.

In the early twentieth century various mechanical calculating devices were developed and saw widespread use.  These gave some hint of what could be done but otherwise had no influence on later developments.  In the years immediately preceding the construction of ENIAC several interesting devices were built.  The Harvard Mark I is given pride of place by some.  The World War II code breaking effort at Bletchley Park in the United Kingdom spawned the creation of a number of "Colossus" machines.  But they were highly classified and so no one who worked on ENIAC or other early computers knew anything about their design or construction.  So where did ENIAC come from?

It arose out of World War II work, but not cryptography.  Artillery field pieces came in many designs.  In order for shells to land where you wanted them to, the gun had to be aimed correctly.  To do this a "firing table" had to be developed for each make and model.  If you want this type of shell to land this many yards away, then you need to set the "elevation" of the gun to this many degrees.  Once you had fired the gun at a few settings, mathematics could be used to "interpolate" the intermediate values.  But with the many makes and models of guns that saw use, and with the other variables involved, a lot of mathematical computations were necessary.  The US Army literally couldn't find enough "computers", people (usually women) who could and did perform the necessary mathematical computations, to keep up with the work.
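
To give a feel for the arithmetic involved, here is a minimal sketch (in Python, with numbers I made up purely for illustration) of the simplest version of the idea: given a few measured test firings, linearly interpolate the elevation needed for the ranges in between.  Real firing tables involved far more variables and far more sophisticated mathematics, so treat this strictly as a toy.

# A minimal sketch of the kind of interpolation a human "computer" (or ENIAC)
# performed.  The firing data points below are made up for illustration; real
# firing tables accounted for many more variables (shell type, charge, weather).

# Measured test firings for one hypothetical gun: (elevation in degrees, range in yards)
measured = [(5.0, 3000), (10.0, 5500), (15.0, 7600), (20.0, 9300)]

def elevation_for_range(target_yards):
    """Linearly interpolate the elevation needed to reach target_yards."""
    for (e1, r1), (e2, r2) in zip(measured, measured[1:]):
        if r1 <= target_yards <= r2:
            fraction = (target_yards - r1) / (r2 - r1)
            return e1 + fraction * (e2 - e1)
    raise ValueError("target is outside the measured data")

# One line of a hypothetical firing table: elevations for a range of distances.
for yards in range(4000, 9001, 1000):
    print(f"{yards:5d} yards -> {elevation_for_range(yards):5.2f} degrees")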

Two men at the University of Pennsylvania's Moore School of Electrical Engineering, Eckert and Mauchly, set out to solve this problem by building a machine that could crank these calculations out quickly and accurately.  They lost the race in the sense that ENIAC was not finished soon enough to crank out firing tables for the Army before the War ended.  But in the end that didn't matter.  People found lots of other uses for it.  One of the first tasks it completed was a set of computations used in the design of the first Atomic Bombs.

ENIAC was constructed as a set of semi-independent functional units.  There were units for mathematical functions like addition, multiplication, division, and square root.  There were "accumulator" units that could remember a number for a short period of time.  There were units that could be used to feed lists of numbers into the machine or to print results out.  And so on.  And the machine was not programmed in the modern sense.  To perform a calculation you literally wired the output of one unit into the input of another.  Only simple computations, those necessary for the calculation of firing tables, were even possible.

So the first step was to structure the problem so that it was within the capability of the machine.  Then a plan for appropriately wiring the functional units together in order to perform the necessary computations was developed.  Then the functional units were wired together using hundreds, perhaps thousands, of "patch" cables, all according to the specific plan for the current computation.  Then the whole thing was fired up.

It might take a couple of days to design the calculation, a day to wire up the ENIAC, and several hours to perform the same calculation over and over, feeding a few different numbers into each cycle, so that each cycle calculated, for instance, all the numbers needed to complete one line of the firing table for a specific gun.  ENIAC was able to perform computations at a much faster rate than computers (i.e. people) could.  That was amazingly fast at the time but glacially slow compared to modern machines.  But it was a start.

And if this doesn't sound like what you think of when you imagine a computer, you are right.  ENIAC was missing several different kinds of functional units we now expect to find in even the simplest modern computer.  But it rightly deserves its place as "the first computer" because the designs for all the modern devices we now call computers descended directly from ENIAC.

ENIAC was missing three kinds of functional units now deemed essential.  The first one is the simplest, the "bus".  "Bus" is an electrical engineering term that far predates ENIAC.  The idea is that you have a bar, wire, or set of wires that connects multiple units together.  All the units share the same bus.  And a bus design allows you to use the bus to connect any functional unit to any other functional unit.  With ENIAC a serial design was used instead.  The approximately forty functional units were laid out side by side (the size of the room dictated that they were actually laid out in the form of a large "U") and only functional units that were close to each other could be connected together.

Later computers had a bus (and often several busses) incorporated into their designs.  This allowed much more flexibility in which functional units could be connected together.  There is a disadvantage to this design idea.  If two functional units are using the bus, all other functional units must be disconnected from it.  At any single point in time all but two units are completely cut off from communication.

With the ENIAC design many pairs of functional units could be connected together at the same time.  They always stayed in communication.  But it turned out the flexibility and simplicity of the bus was more advantageous than disadvantageous.  (And designs incorporating multiple buses allow multiple parallel connections, at least in some cases.)  Switching to a bus design from the serial design was an easy change to pull off.
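
Here is a toy sketch, purely my own illustration and not a model of any real machine, of what "only two units at a time" means on a shared bus.  The unit names and the interface are invented.

# A toy sketch (not any real machine's design) of the shared-bus idea.
# On a bus, any unit can talk to any other, but only one pair at a time.

class Bus:
    def __init__(self):
        self.in_use_by = None          # the (sender, receiver) pair currently connected

    def transfer(self, sender, receiver, value):
        if self.in_use_by is not None:
            raise RuntimeError("bus busy: only one pair may be connected at a time")
        self.in_use_by = (sender, receiver)
        receiver.inbox = value          # the actual data movement
        self.in_use_by = None           # release the bus for the next pair

class Unit:
    def __init__(self, name):
        self.name = name
        self.inbox = None

bus = Bus()
adder, accumulator, printer = Unit("adder"), Unit("accumulator"), Unit("printer")

# Any unit can reach any other, in sequence, with no rewiring...
bus.transfer(adder, accumulator, 42)
bus.transfer(accumulator, printer, accumulator.inbox)
print(printer.inbox)   # -> 42

# ...whereas the ENIAC-style scheme would have needed a patch cable between each pair.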

The second type of functional unit ENIAC lacked was memory.  ENIAC did incorporate a small number of "accumulators" but these could only be used to store the intermediate results of a longer, more complex computation.  They couldn't be used for anything else and they were very expensive to build.  Computer designers recognized that memory, lots of memory, was a good idea.  But it took them a long time to find designs that worked.  At first, various "one off" approaches were tried.  Then the "mercury delay line" was invented.

A speaker pushed pulses of sound into one end of a tube filled with mercury.  A microphone at the other end picked up each pulse after it had traveled the length of the tube.  And, since under these circumstances the speed of sound is a constant, it took a predictable amount of time for a specific pulse to travel from speaker to microphone.  The more pulses you wanted to store at the same time, the slower things went.  You had to wait for all the other pulses to cycle through before you could pick off the pulse you wanted.  If this design sounds like it reeks of desperation, that's because it did.  But it was the memory technology used by Univac (see below) computers.
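
Here is a toy simulation of the "wait for your pulse to come back around" behavior that made delay line memory slow.  The sizes and the interface are made up; only the serial, recirculating nature of the storage is the point.

# A toy simulation of the recirculating idea behind a mercury delay line.
# Details (pulse counts, timings) are invented; the point is that you must
# wait for the bit you want to come around to the microphone.

from collections import deque

class DelayLine:
    def __init__(self, length):
        self.pulses = deque([0] * length)   # bits circulating through the "tube"
        self.position = 0                   # which bit is at the microphone right now

    def tick(self):
        # one pulse reaches the microphone and is immediately re-fed into the speaker
        self.pulses.rotate(-1)
        self.position = (self.position + 1) % len(self.pulses)

    def read(self, address):
        ticks = 0
        while self.position != address:     # wait for the wanted pulse to arrive
            self.tick()
            ticks += 1
        return self.pulses[0], ticks

line = DelayLine(1000)
value, waited = line.read(750)
print(f"waited {waited} ticks for the wanted bit to circulate around")

In a real delay line the recirculation was handled by electronics rather than software, but the waiting is the same.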

After a lot of work mercury delay lines were supplanted by "ferrite core" memories.  Little magnetic donuts with wires strung through their centers formed the basis of these devices.  By cleverly strobing high power signals through the correct wires a single specific bit could be "set" or "reset".  By cleverly strobing low power signals a single specific bit could be "read".  This technology was faster and it was "random access".  Any individual bit could be read or written at any time.  But it was slow and expensive compared to the modern solution.  The memory problem was only solved when integrated circuit based memory modules were developed.  They allowed large (gigabytes), fast (gigahertz), and cheap (less than $100) memories.
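
And here is an equally cartoonish sketch of why core memory counted as "random access": any individual bit can be reached directly, with no waiting for it to circulate past.  The row-and-column addressing below is a drastic simplification of the actual strobing scheme.

# A toy model of the "random access" property of core memory.  The strobing
# trick is simplified to the point of caricature: a core changes state only
# when both its row wire and its column wire are driven at the same time.

class CorePlane:
    def __init__(self, rows, cols):
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, row, col, value):
        # drive the one row wire and the one column wire that cross at (row, col);
        # only that single core sees enough combined current to change state
        self.bits[row][col] = value

    def read(self, row, col):
        # a real core read was destructive (it reset the core, which then had to
        # be rewritten); that detail is omitted here
        return self.bits[row][col]

plane = CorePlane(64, 64)          # 4096 bits, a respectable plane for the era
plane.write(10, 20, 1)
print(plane.read(10, 20))          # any bit, any time, no waiting for it to come around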

But computers with small (by current standards) but large (by ENIAC standards) amounts of memory were developed within a few years.  That left the logic unit, sometimes called the sequencer.  In ENIAC, functional units were physically connected together using patch cables.  This was a slow and error-prone process.  If the design was changed to incorporate a bus, and if each input interface and output interface of each functional unit was connected to the bus, then anything could be connected to anything.  But, as I indicated above, only two at a time.

The logic unit sequentially decided to connect this pair to the bus, then that pair to the bus, and so on.  This permitted complete flexibility (within the limits of the hardware) in terms of how the functional units were connected together.  Initially this resulted in a slower machine.  But the increased flexibility got rid of all the rewiring time and greatly reduced the planning time.  And it permitted faster, simpler designs to be used for the functional units.  In the end this simpler design resulted in faster machines.

And, as the amount of memory available grew, it was quickly determined that the wiring instructions could be stored in memory as a "program".  This required a more complex sequencer, as it had to be able to decode each instruction.  But it again sped up the process of going from problem to results.  It took only a few years for all these pieces to be designed, built, and put to good use.  And the reason for this is one of the prime motivators for this post.
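
To make the jump from patch cables to a stored program concrete, here is a bare-bones sketch of the fetch/decode/execute loop a sequencer performs.  The three-instruction machine is entirely my own invention; real early machines were considerably more involved.

# A bare-bones sketch of the stored-program idea: the "wiring instructions" live
# in the same memory as the data, and a sequencer loop fetches and decodes them
# one at a time.  The tiny instruction set below is invented.

memory = {                       # both program and data live here, side by side
    0: ("LOAD", 100),            # put the value at address 100 in the accumulator
    1: ("ADD", 101),             # add the value at address 101 to it
    2: ("STORE", 102),           # store the result at address 102
    3: ("HALT", None),
    100: 7, 101: 35, 102: 0,
}

accumulator = 0
program_counter = 0

while True:
    opcode, operand = memory[program_counter]   # fetch and decode
    program_counter += 1
    if opcode == "LOAD":
        accumulator = memory[operand]
    elif opcode == "ADD":
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "HALT":
        break

print(memory[102])   # -> 42, no patch cables required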

Once the ENIAC was built a lot of the details of its design became widely known almost immediately.  This let people focus on making one aspect of the design better.  They could just plug in the ENIAC design for the rest of their machine.  ENIAC was a revolution.  These other machines were an evolution.  And evolution can move very quickly.

The same thing happened when the Wright Brothers flew the first complete airplane in 1903.  As an example, there was a guy named Curtiss who was a whiz with engines.  The engine in the Wright plane wasn't that good.  But Curtiss could basically take the Wright design and plug his much better engine into it.  So he did.  This resulted in a lot of bad blood and lawsuits but, for the purposes of this discussion, that's beside the point.

The airplane evolved very quickly once a basic design was out there as a foundation to build on.  World War I saw the airplane evolve at warp speed.  Better engines, better wings, better propellers, better everything, were quickly found and incorporated.  The airplane of 1919 bore only a faint resemblance to the airplane of 1914.  And this was possible because different people could come up with different ideas for improving one or another facet of the overall design and then plug them into an existing design.

The same thing happened with computers.  Pretty much every part of ENIAC needed radical improvement.  But, as with airplanes, an improvement in one area could be plugged into an already existing design.  By 1951 everything was in place.  And that allowed the introduction of the first mass-produced computer, the Univac I.  Before Univac each computer was hand-built from a unique design.  But several substantially identical Univac I machines were built.

At this point "peripheral devices" started to proliferate.  The Univac relied primarily on spools of magnetic tape mounted on tape drives.  The drive could, under programmatic control, speed to a particular place and read or write a relatively large amount of data relatively quickly.  Over time other types of devices were added to the peripheral pool.  And for comparison, the Univac featured 1000 "words" of memory, each big enough to hold a 12 digit number.  And, as with all subsequent designs, both programs and data were stored side by side in this memory.

Univacs were quite expensive and fewer than 50 were ever built.  But they demonstrated the existence of a market.  Other companies quickly jumped in.  The most successful was IBM.  IBM pioneered a number of technical innovations.  They were among the first to hook a "disk drive" to a computer, for instance.  But IBM was the first company to successfully crack the marketing problem.  They were the best at selling computers.

It may seem obvious in retrospect, but computers of this era were very expensive.  Soon a lot of companies came to believe that if they didn't get one they would be left in the dust by their competitors.  But the two industries where computers could obviously do the most good were banking and insurance.

Both needed to perform vast numbers of boring and repetitive computations.  And that was just what best fit the capabilities of early computers.  Not to put too fine a point on it, but neither banks nor insurance companies employed large numbers of rocket scientists or other "tech savvy" people.  The people who actually ran these companies, senior executives and members of the board of directors, were scared stiff of computers.

IBM set out to bypass all the people in these companies who would actually be responsible for the installation and operation of computers and instead went directly to these senior executives.  They didn't bother to tout the specifications or capabilities of IBM products.  They knew these people were not capable of understanding them, nor did they much care.  What concerned them was "betting the company".

They were afraid that they would end up spending a ton of money on a computer.  Then something terrible would happen involving that computer and the company would go down in flames, all because of something that was beyond the understanding of senior management.  What IBM told these senior executives was "if you buy IBM we will take care of you.  If something goes wrong we will swoop in and fix whatever it is.  Oh, it might cost more money than you had planned on spending, but we promise you that if you go with IBM you will not be putting your company's very existence (and by implication the livelihood of these senior executives) in jeopardy".

And it worked.  In case after case the lower level people would, for instance, say "we recommend GE" or "we recommend RCA".  At the time both GE and RCA were companies as large as or larger than IBM.  And both had well established reputations for their technological prowess.  But none of the other companies (and there were several besides GE and RCA) aimed their sales pitches so squarely at senior management.  And in case after case the word came down from on high to "buy IBM anyhow".

And companies did.  By the late '60s, 80 cents of every computer dollar was going to IBM.  It wasn't that their hardware was better.  It was better in some ways and worse in some ways than the equipment offered by other companies.  But it was good enough.  A saying from the period had it that "no one ever got fired for recommending IBM".  That was true.  And the converse was also true.  People sometimes got fired or got their careers sidetracked for recommending a brand other than IBM.

It took a long time for the computer industry to recover from the total dominance that IBM held for more than a decade.  But there was one technical innovation that was rolled out by IBM and others at this time that is important to talk about.  That's microcode.

The logic unit/sequencer was by far the most complex and difficult part of a computer to design and build.  It had to take the bit pattern that represented an instruction, break it down into steps of "connect these components to the bus, now connect those components to the bus, now connect these other components to the bus", etc.  It turned out that there were a lot of considerations that went into selecting the correct sequence.  And that made this particular component extremely hard to design and build.  Then somebody (actually several somebodies) had an idea.

What had made the original ENIAC so hard to deal with?  The fact that it had to have customized hand wiring done to it each time a new problem was put to it.  Well, the spaghetti that the sequencer had to deal with seemed similarly complicated.  And if you got something wrong the computer would spit out the wrong answer.  In the ENIAC case you just stopped it, fixed the wiring, and ran the calculation over again.  But once the computer was built there was no even remotely easy way to fix problems with the sequencer.

So several people at about the same time said "why don't we create a computer to run the computer?"  It would run a single special program called "microcode".  If there was a problem, then as long as the microcode could be changed, the problem could be fixed.  And that meant that the sequencer hardware became much simpler.  A lot of the complexity could be exported to the design and coding of the "microcode" program.  And the microcode for a new computer could be emulated on an old computer.  So it could be extensively tested before anything was committed to hardware.
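
Here is a cartoon, with invented micro-operations, of what "a computer to run the computer" buys you: the instruction set the programmer sees becomes a table that can be corrected, rather than a tangle of fixed wiring.

# A cartoon of the microcode idea: each instruction the programmer sees is
# expanded, by a table that can be changed, into a list of primitive
# "connect this to the bus" steps.  The micro-operations here are invented.

MICROCODE = {
    # visible instruction -> the sequence of bus connections that implements it
    "ADD":  ["memory->bus", "bus->alu_input", "accumulator->alu_input",
             "alu_output->bus", "bus->accumulator"],
    "LOAD": ["memory->bus", "bus->accumulator"],
}

def execute(instruction):
    for micro_op in MICROCODE[instruction]:
        print("  micro-step:", micro_op)     # the simplified hardware only performs these

# If "ADD" turns out to be wired up wrong, you fix the MICROCODE table (a program),
# not the physical sequencer.
execute("ADD")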

This too sounded like it would slow things down immensely.  But it did not.  The simpler sequencer hardware could be optimized to run much faster than the old, more complicated design.  And other tricks were found to make the whole thing go fast, in just the same way that replacing the patch cable wiring of ENIAC with a bus and memory-resident programs eventually resulted in an increase in computer speed.  By the end of the '60s pretty much all computers used microcode.

Later, ways were found to house the microcode in hardware that allowed it to be updated on the fly.  This meant that microcode fixes could be rolled out well after the hardware was originally manufactured.  Some computer designs have evolved to the point where there are two levels of microcode.  There is a lower level, call it "pico-code", that allows the hardware to run multiple versions of microcode which, in turn, implement what appears to be the computer.  This three level architecture is the exception rather than the rule.

The next thing I want to talk about is Microsoft.  Bill Gates was pretty much the first person to figure out that the money was in the software, not the hardware.  When IBM rolled out its "System 360" family of computers in the mid '60s it literally gave away the software.  Their thinking was that the value was in the hardware.  And most computer companies followed IBM's lead.  Hardware was the high value profit-maker and software was a loss leader afterthought that you threw in because it was necessary.  Gates was the first person to focus on the word "necessary".

Microsoft was a software company from the get-go.  Their first product was a BASIC interpreter for the first generation of personal computers.  At the time you were expected to buy a kit of parts and assemble the computer yourself.  But almost immediately it became obvious that people were willing to pay extra for a pre-assembled computer that they could just fire up and use.  Either way, however, they still needed Microsoft's BASIC.

Microsoft branched out to other software products, most famously MS-DOS and later Windows.  And they do sell a small line of hardware: keyboards, mice, the odd tablet, etc.  But, unlike Apple, Microsoft has not seriously (some would say successfully) gotten into the hardware business.  Once PCs took off in a big way, Microsoft took off in a big way.  And many other companies have successfully followed this model.  Even Apple has outsourced the manufacture of its extensive line of hardware.  They still do, however, do their own hardware design work.  And they never outsourced their software work.

And even "it's the hardware, stupid" types like IBM have been forced to follow suit.  They were initially forced by anti-trust litigation to start selling their "System 360" software.  From this modest start they have continued to evolve away from hardware to the point where they are now almost entirely a services company.  Over the years they have sold off or shut down most but not quite all of their once very extensive hardware business.  So they do still sell some hardware but it now represents a very small part of their total revenue.

I now want to turn to a product that has been pretty much forgotten.  A man named Philippe Kahn started a company called Borland at about the time the first IBM PC was released in 1981.  In 1984 he released a product called Turbo Pascal.  You could buy the basic version for $49.95 or the deluxe version for $99.95.  It was organized around Pascal, a once popular computer language that has pretty much fallen out of favor.  I am not going to get into the differences between Pascal and the well known "C" programming language.  One is better than the other in this or that area but, overall, they are actually pretty similar.  So what did you get for your $49.95 (the version most people bought)?

You got an "integrated development" package.  You could use it to write or modify a Pascal program.  You could then literally push a button and your program would be compiled (turned from a vaguely English-like thing that people could deal with into programming instructions that the computer could deal with).  And the Pascal compiler was lightning fast, even on the PCs of this era.  (The process typically took only a few seconds.)  Then (assuming the compiler had come across no obvious errors in your program) you could push another button and run your program.

If errors were found by the compiler you were automatically popped back into the "edit" environment.  You could make changes and then immediately recompile your program.  And the package offered similar options for fixing your program after it had compiled cleanly.  If your program seemed to be misbehaving you could run it in a special "debug" mode.  This allowed you to work your way through the execution of your program step by step, a line at a time.  You could even examine the current value of the variables you had defined for your program to work with.

Once you had seen enough you could pop back to "edit" mode, make modifications, and go through the whole compile/execute/debug cycle over and over, as many times as needed to get your program working the way you wanted it to.  Then you could sell your program.  And you could sell just the executable version, which did not disclose the original Pascal "source code" of your program.

With Turbo Pascal and a PC you could go from edit to compile to debug and back to edit within minutes.  This had a profound impact on computer software.  ENIAC required smart, skilled, highly trained people to operate it.  Univac made things easier but it was still very hard.  The IBM 360 made things still easier but the cost and skill level was still very high.  And a single edit/compile/execute/debug cycle could often take all day on any of these machines.

Then there was the snobbery.  The bank I worked for in the late '60s required all of their computer programmers to have a 4 year college degree.  It was presumed that only very smart people (i.e. college educated) were up to the task.  But a whole crop of housewives, clerks, blue-collar workers, and kids were able to master Turbo Pascal and create interesting, useful, and, most importantly, valuable computer programs.

It completely democratized the whole software development process.  It turns out that the only attributes a person needed to become successful in the computer business were a knack for computers, a little training (the documentation that came with the Turbo Pascal package consisted primarily of a single not very big book), and access to now quite inexpensive and ubiquitous home computer equipment.  Not everybody is cut out to be a computer expert but a surprising number of people can master the subject.

And that's about where I would like to leave it off.  Pretty much everything that has happened since is the extension of a trend or movement started during the time period I have covered.  Computers have now gotten faster, more powerful, lighter, more compact, and more portable.  But that's just more of the same.

The hardware has gone from vacuum tubes (essentially light bulbs with extra wires in them) to discrete transistors to integrated circuits (the ubiquitous "chip") but integrated circuits were in wide use before 1980.  Even the Internet is an extension of and an enhancement to the ARPANET, a project that was begun in the late '60s.  And it turns out that people had been connecting computers to networks since well before ARPANET.

I would like to leave you with one last item, well, more of a musing.  Since the early days computer components have been divided between hardware and software.  The idea is that the actual parts a computer is assembled from are hard or, more importantly, hard to change.  Computer programs on the other hand are soft.  They are malleable and easy to change.  But it turns out that the opposite is actually true.  Hardware is easy to change and software is hard to change.

IBM pioneered the idea of an "architecture" in the early '60s when they designed the System 360 family of computers.  Before this every time you upgraded to a new computer you had to redo all the software.  It was presumed that this would not be a difficult or expensive process.  But over time it turned out to become more and more difficult and more and more expensive.

With that in mind IBM designed a family of machines that would all be capable of running the same programs.  They specified an "architecture" that all the machines would adhere to.  The usual reason people replaced computers was because, in the words of an old TV show, they needed "more power".  With the System 360 you could just replace your smaller, less powerful (and less expensive) computer with a bigger one that had more power.  IBM guaranteed you didn't have to change a thing.  All the old software would run just fine on the new hardware.  It would just run faster.

IBM spent a tremendous amount of effort on making sure that the "360 architecture" was implemented uniformly and consistently across all machines in the family.  One of their best people, a guy named Iverson, applied the computer language he had just invented (APL, if you care) to creating models of key components of the architecture that were accurate down to the bit level.  And it worked.

A few years later IBM came out with an upgrade called the "System 370" that was carefully designed to be "backward compatible" with the 360 architecture.  The new line offered additional features but things were carefully arranged so that the old programs would work just fine on the new machines.  So companies were able to easily upgrade to the new machines that, of course, featured more power, without a hitch.

And this became the model for the industry.  The hardware descendants of the System 360 no longer exist.  But software written to the 360 architecture standard (and often quite a long time ago) is still running.  I know because as I go about my daily business I see all kinds of companies running what I can positively identify as 360 architecture software.  This is made possible by microcode.

Microcode makes it possible for hardware to behave in completely unnatural ways.  The hardware that now runs these 360 architecture programs is the many-times descendant of something called the System 38.  The original System 38 bore no resemblance to the grandchildren of the System 360 machines that were in existence at the same time.  But that no longer matters.

In fact, hardware has come a long way since the '60s.  But thanks to microcode the newest hardware can be made to faithfully implement the 360 architecture rules, so the old programs still run and still behave just as their programmers intended them to.  And this is in spite of the fact that the hardware that is doing this descended from hardware that was completely incompatible with System 360 family hardware.

Intel developed the first versions of the X-86 architecture in about 1980.  The modern computer chips Intel now sells bear almost no physical resemblance to those chips of yesteryear.  Yet X-86 software still runs on them and runs correctly.  Apple started out using a particular computer chip made by Motorola.  They later changed to a newer, more powerful, and quite different Motorola chip.  Yet they managed to keep the old Apple software running.  Then they made an even more drastic change.  They changed from the Motorola chip to an Intel X-86 family chip.  But they still managed to keep that old software running and running correctly.

It turns out that any individual program, i.e. "piece of software", is fairly easy to change.  But families and suites of software quickly arose.  Many examples of this could already be found when IBM was sitting down to design the System 360 computers.  And these families and suites turned out to be very hard to change.  They behaved in ways more akin to the ways people associated with hardware.  On the other hand, people got very good at making one kind of hardware "emulate", i.e. behave exactly the same as, another quite different kind of hardware.  So hardware started behaving in ways more akin to the ways people associated with software.
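
As one final sketch, here is the emulation idea in miniature: an "old architecture" program, written once, keeps running unchanged because something entirely different underneath agrees to behave exactly like the original machine.  The instruction set and the program are invented for this illustration.

# A toy illustration of emulation: the "old architecture" program below never
# changes, while the machinery underneath it can be swapped out freely.  Both
# the instruction set and the "machine" are invented for this sketch.

old_program = [("PUSH", 6), ("PUSH", 7), ("MUL", None), ("PRINT", None)]

def run_on_emulator(program):
    """A new, totally different machine pretending to be the old one."""
    stack = []                                   # pretend the old architecture was stack-based
    for opcode, operand in program:
        if opcode == "PUSH":
            stack.append(operand)
        elif opcode == "MUL":
            stack.append(stack.pop() * stack.pop())
        elif opcode == "PRINT":
            print(stack.pop())

# The program behaves exactly as its (hypothetical) original programmers intended,
# even though nothing underneath resembles the hardware they wrote it for.
run_on_emulator(old_program)                     # -> 42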

This "hardware is soft and software is hard" thing has been going on now for at least a half a century.  But people got used to the old terminology so we still use it.  Is black now white?  Is up now down?  Apparently not.  But apparently Orange is the new Black.
