Monday, September 3, 2018

Jury Duty

This subject was suggested to me by the end of the first Manafort trial.  There is supposed to be at least a second trial and the possibility of a third trial cannot be ruled out at this time.  Nor can various appeals, pardons, and the like.  So there is lots of process left.  But I want to focus on one aspect of one part of the process.

I have a line I have trotted out a number of times recently.  It goes like this:

I have never been married and never had children.  That's why I am an expert on child rearing.
The only reasonable conclusion to be drawn from this is that, in fact, I am definitely NOT an expert on child rearing.  Instead, whatever I have to say on the subject should be taken not with a single grain but with a five pound bag of salt.  I usually follow this statement with one that goes:

If you don't find your kids a frequent source of entertainment then for you having kids was a mistake.

I am careful to trot this out only when both children and their parents seem to be having a good time.  Raising kids is both very expensive and very time consuming.  I am not going to go into all the benefits of having kids.  Others who are far better qualified, and that means pretty much everybody, have done a far better job of that than I ever could.  But I do think it helps if parents recognize that along with the usual list of benefits there is also the entertainment factor.  And so on to the subject at hand.

Since it is also true that I have never studied the subject, my credentials for opining on child rearing are exactly zero.  So what are my credentials with respect to the subject at hand?  They are better but still don't rise to the level of adequate.  Thanks to movies and TV I have an education in the law.  But it is a quite misleading one.  What I can say in my favor is that I have been called several times for jury duty, been "voir dire"ed a few times, and once served on a jury.  So with these marginal credentials in mind, here goes.

So what's a jury's job?  What is it supposed to do?  Well, in various entertainments it sits around and eventually renders a verdict.  And we are led to believe that everybody knows everything there is to know about how that verdict should be reached.  The entertainment leads us, the audience, to the conclusion that the defendant is either unambiguously guilty or unambiguously innocent.  But actual trials are messy things.  Fortunately, the courts devote significant effort to helping prospective jurors understand how to reach a verdict.

In actual trials a considerable amount of time and effort is spent educating potential jury members (before they are selected) and actual jury members (after they are selected) in what they are supposed to do and not do.  Some of this is very helpful.  Some of this is the opposite of helpful.  Let's start with the former.

"Juries are finders of fact and judges are finders of law", jurors are told.  So what does that mean?  Well, it depends on what kind of case is involved.  Cases fall into two general categories:  civil and criminal.  A civil dispute is between two parties and does not directly involve the government.  Most of the time a contract is at the heart of the dispute.  Two parties have entered into a contract and there is a dispute that arises out of this.

For example, one party may sue claiming he was not paid.  The other party may counter-claim that the first party did not do the work he said he would.  How does this all get sorted out?  The two parties go to court and a decision is rendered.  But civil cases seldom generate much interest among the press or public.  So, like the press, I am going to focus on criminal cases.

Here the case is "the state versus X".  Here the "state" may actually be a city, county, or other governmental entity.  Or it may actually be the "state" in the form of the Federal Government.  And, of course, the "X" is usually an individual but it may instead be a group or a corporation ("remember, corporations are people", Presidential candidate Mitt Romney famously (and correctly, at least when it comes to the law) informed us).  As an example, the case I referenced above was formally "U. S. versus Manafort".

The other component is that the defendant is accused of violating one or more statutes or regulations.  In Federal cases you will hear "USC yak-yak-yak".  Here, "USC" does not refer to the University of Southern California.  It is the "United States Code".  As you might expect the entirety of the USC is now quite large.  It is broken into titles.  The titles are broken into sections.  The sections are broken into subsections.  And so on.  So the actual citation of one of the charges in one of Manafort's cases is that he violated 18 U.S.C. 981(a)(1)(C).  This means he is charged with violating the law found at title 18 section 981 sub-section "a" sub-sub-section "1" sub-sub-sub-section "C" of the United States Code.

If you have access to a law library you can go to the volumes of the United States Code and pull down title 18 (which might actually run to several physical volumes).  You can then flip to section 981.  There you will find that it is broken into sub-sections labeled "a", "b", etc.  And that sub-section "a" is further broken down into sub-sub-sections "1", "2", etc.  And that sub-sub-section "1" is further broken down into sub-sub-sub-sections "A", "B", etc.  (And in some cases the sub-sub- business may descend to even more levels but you get the idea.)  And the USC consistently uses lower case letters for the sub-levels, numbers for the sub-sub levels, and so on.  This makes it easier to keep straight which layer you are talking about.
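If you are the programmer type, the layering is regular enough that a few lines of code can pick a citation apart.  Here is a toy sketch in Python (entirely my own illustration for this post; no court or law library uses anything like it):

    import re

    # Pull the title, section, and nested subdivision labels out of a
    # citation shaped like "18 U.S.C. 981(a)(1)(C)".
    CITE = re.compile(r"(\d+)\s+U\.?S\.?C\.?\s+(\d+)((?:\([A-Za-z0-9]+\))*)")

    def parse_citation(text):
        match = CITE.search(text)
        if not match:
            return None
        title, section, subdivisions = match.groups()
        # "(a)(1)(C)" -> ["a", "1", "C"]: lower case letters, then
        # numbers, then capital letters, per the layering convention.
        levels = re.findall(r"\(([A-Za-z0-9]+)\)", subdivisions)
        return {"title": int(title), "section": int(section), "levels": levels}

    print(parse_citation("18 U.S.C. 981(a)(1)(C)"))
    # {'title': 18, 'section': 981, 'levels': ['a', '1', 'C']}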

The USC is the worst in terms of length and complexity.  But state, county, and municipal laws and regulations follow the same pattern.  They are tightly organized in a system that allows the compact citation of a single specific law or regulation.  And the defendant must be charged with the violation of specific laws.  He can't be charged with being a bad person.  For instance, if a person is charged with "vagrancy", by the time the case gets to trial "vagrancy" needs to be replaced with a reference to a specific component of a specific law or regulation of the governmental body levying the charge.

So how does this relate to the "finding of fact" business?  So far all we have talked about is "the law".  Well, each law includes a definition of the crime.  The crime has "elements".  The elements are listed for each specific crime.  It is the jury's job to determine whether the facts introduced during the trial phase of the case prove all of the elements.  If one element is left unproven then it's game over for the prosecution on that specific "count" of the case.  (If a defendant is charged with multiple crimes then each specific crime becomes a "count".  Manafort was charged with 18 counts.)

And there is another element, call it the zeroth element.  That has to do with jurisdiction.  Did the crime occur within the jurisdiction of the law (and court) in which the case is being tried?  If the defendant is charged with violating a specific section of the Municipal Code of the City of Seattle but the crime actually occurred in Redmond, then the defendant must be acquitted.  Similarly, if the trial is taking place in a Tacoma court room, Tacoma courts have no authority to enforce Seattle laws, so again the defendant must be acquitted.

The judge and the attorneys for the various sides are required to go through all this with the jury.  So, for instance, the prosecutor in the case I was involved in carefully asked a witness what seems like a stupid question:  "did the crime occur in Seattle?"  The "yes" answer fulfilled the legal requirement to show that the issue of jurisdiction had been properly addressed.  And you can bet that this sort of thing never makes it into the parts of trials that show up in entertainments.

So the first responsibility of a juror is to decide whether all the elements of the crime have been proven.  If the answer is "no" to even one of them then that's it.  But if the answer is "yes" to all of the elements then there is more.  Proving all the elements of the crime is a necessary but not sufficient condition for a "guilty" verdict.  There may be mitigating circumstances.  For instance, in a murder case the defendant may claim "self defense".  "Yes, I killed him but I only did it because he tried to kill me first".  Self defense is usually considered sufficiently mitigating to justify acquittal even though all of the elements of the crime have been proven.

But here is where things get sticky.  What constitutes "sufficient" proof?  Or, more generally, what constitutes "reasonable doubt"?  In civil cases a different "standard of proof" is used.  The presumption is that the stakes are lower in these types of cases.  Also the sides are presumed to be more evenly matched in terms of potential liability.  It is not the case that one side (the state) is putting little at risk while the other side (the defendant) is putting a lot at risk (i.e. incarceration, death).  So in civil cases a "preponderance of the evidence" standard is used.  This is usually paraphrased as "more evidence in favor of one side's position than for that of the other".

The problem in criminal cases is there is no accepted standard for "beyond a reasonable doubt".  We saw this on display in the Manafort case.  In his instructions to the jury the judge talked about the meaning of "reasonable doubt" but it was all mush.  After the jury started deliberations they sent out a question to the judge asking for clarification on what constituted "reasonable doubt".  He parroted back the same mush he had given them in the first place.  In other words, he left them on their own.  He was forced to do this because of the lack of an accepted standard.  Reasonable people differ.

So let me start by discussing my approach.  The first thing I do is assume everyone is telling the truth and ask myself "is there a set of circumstances that is consistent with what everyone testified to?".  I have found that this actually works a lot better than you would think it would.

I am not naïve.  I know people misremember or outright lie and they do it a lot.  But I find this a very good starting point.  If you make this assumption you will find that there are far fewer inconsistencies and contradictions than you would think there would be.  And a very important scientific principle lies behind the whole "inconsistencies and contradictions" thing.

Scientists believe the real world is "consistent".  Things are only one way.  If one theory predicts "red" and another theory predicts "blue" then there is a problem.  One, or possibly both, of the theories is wrong.  And if a specific theory predicts "red" but observation shows the actual situation to be "blue" then that theory is wrong and something needs to be changed.

This idea of consistency works well in science and in the real world.  And it is surprisingly helpful for sorting out things like who to believe.  In a trial there may be more than one set of circumstances that is consistent with all the evidence and testimony.  So start there and see how far it gets you.  If nothing else it should result in only a few areas of dispute that need to be resolved on the way to reaching a verdict.

Next there's the issue of what evidence is appropriate to consider and what weight to give it.  For a long time eyewitness testimony was considered the best evidence.  "Circumstantial" evidence (everything else) was considered less reliable.  But things like fingerprints and DNA have turned out to be far more reliable than eyewitness testimony.  We are more and more seeing the consensus shift to valuing certain types of circumstantial evidence more highly than eyewitness testimony.

And in the Manafort case the prosecution contended that business documents like bank records, printouts of emails and text messages, and other circumstantial evidence were what the jury should primarily depend on.  The defense tried to discredit the prosecution's key witness but did not make an effort to discredit any of the documentary evidence.  The one juror who has spoken out so far said she relied primarily on the documentary evidence and paid little or no attention to the eyewitness testimony.  But there is still a need for eyewitness testimony and for jurors to make judgments about the accuracy and reliability of that testimony.

One of the bedrock elements of our society is a "jury of our peers".  What's going on with that?  Well, one of the things that it does NOT have to do with is for jurors to go out and gather evidence on their own.  Jurors are also not supposed to hold a preconceived notion of guilt or innocence.  It is not okay to decide "this guy is a stand-up guy so he couldn't possibly have done it" or "this guy is scum so he must have done it" based on how the person strikes you.  Jurors are expected to render a verdict based solely on the evidence presented in court.

Now this often presents problems.  In entertainments we often see the crime committed so we know for sure "who done it".  If not, then usually we are given hints sufficient to push us firmly into one camp or the other.  But in actual trials it is often impossible to produce enough evidence to resolve all issues.  Maybe a witness died or disappeared or was never located in the first place.  Maybe the lab work was bungled or was never performed in the first place.

In many jurisdictions, including the one I live in (but never in entertainments), the backlog at the crime lab is a year or more.  So everybody prioritizes.  Do we really need this blood analyzed to prove our case?  If the answer is "no" then the blood work is not submitted to the lab in the first place.  This leaves the jury to work through an incomplete case.  Not all the questions that they might reasonably ask are answered over the course of the proceedings.

In the Manafort case the trial took place in a "rocket docket" jurisdiction.  The judges want cases to move quickly so they push lawyers to reduce the number of issues in dispute to the minimum and to introduce the minimum amount of evidence necessary to decide each issue.  There is good reason for this.  As a result, cases in this jurisdiction often come to trial quickly.  That is not the case in many other jurisdictions.  And this business of getting to trial quickly can help the ends of justice.  But it may also mean that the prosecution (or defense, but in this case the defense presented no case) may end up not presenting evidence that could possibly have resolved the doubts of one or another juror.

In this case one juror cited "reasonable doubt" in her decision to vote to acquit on 10 of the 18 counts.  (She voted to convict on the other 8 counts.)  All eleven other jurors voted "convict" on all 18 counts.  It is possible that if the prosecution had been allowed to introduce a little more evidence that evidence could have been enough to resolve her doubt.  But we'll never know.  What we do know is that we now have a mistrial with respect to ten of the original eighteen counts.  And that may result in another trial to resolve the fate of those counts.

Getting back to my process, I start with a position of "everybody's telling the truth".  Even if that fails it reduces the number of issues considerably.  So the next question is what to do if we have a conflict in the testimony.  This is an area where the jury is supposed to apply their judgment.  Who is likely to be telling the truth and who is not?  And is the "not" person merely making a legitimate mistake or are they lying on purpose?

This is where the jury is supposed to apply their life experience rather than any specific fact or expertise.  And it may boil down to a question of whose story is more reasonable or believable.  Again, jurors are expected to apply their life experience to come to a judgment.  And this is why a "jury of peers" is important.  A shared life experience is supposed to result in better judgments.

But what to do about the missing parts?  That's a conundrum, and one for which there is no agreed on solution.  The appropriate response to missing parts, we are told, is "no response".  What does that mean?  Well, I tend to go with the Wikipedia language of "neutral interpretation".  Try to deal with it in a way that favors neither the prosecution nor the defense.  Again, this came up in the Manafort trial.

As I indicated above, Manafort's defense team chose not to put on a case at all.  And it is often the case that the defendant chooses not to testify.  The simplest and most obvious interpretation of tactics like this is that the defendant is guilty and the choice made is an attempt at damage minimization.  But that is not "no response" nor a "neutral interpretation".

And there are situations where these tactics are absolutely the right tactics for the defense to use.  Assume that the defendant is innocent and that the prosecution failed to prove one or more elements of each count the defendant is charged with.  Then the defense should win without putting on a case.  If they instead put on a case something could go wrong.  So their best choice is to do nothing.

In fact that was the situation in the trial I was a juror on.  There were two possible perpetrators and at the end of the prosecution's presentation no evidence had been introduced to show that the defendant was the actual perpetrator.  So it would be entirely reasonable to assume that the other possible perpetrator did it.  So I was mystified when the defense didn't point this out and ask for an immediate dismissal.

And, in a perfect example of a situation where the defendant shouldn't testify nor even put on a case, our defendant went on the stand and said "I did it".  He then went on to claim mitigating circumstances but we didn't buy them and he was convicted.  Had the defense not put on a case I would have voted for acquittal.

So if I had been on the Manafort jury I would have asked "did the prosecution prove all elements of all crimes charged in all counts"?  Based on what I saw in the press I would likely have said "yes".  Then I would have asked myself "is there any alternative interpretation of the evidence presented that would lead to 'not guilty'"?  Again, looking from the outside, I didn't see any (but I will come back to this below).

Finally, I would have asked myself about mitigating circumstances.  Usually the defense uses their presentation to raise any possible mitigating circumstances they can think of.  But remember they didn't put on a case.  I think it would be an appropriate "no response" or "neutral interpretation" to conclude that none of the usual mitigating circumstances were present in this case.

But I am a rationalist and a "guided by the evidence" kind of guy.  That's why I think the way I think.  But not everybody thinks the way I do.  Before I go there, though, I am going to take a digression.  There is another reason evidence might be absent.

Let's say one side does a bad thing.  Say, for instance, that the cops break into someplace without a search warrant and, as a result, find a bunch of incriminating evidence they wouldn't otherwise have known about.  Then they cover this up, somehow get a search warrant for the place they have already searched, and go back where, surprise, surprise, they find a bunch of incriminating evidence.  Obviously, they are not supposed to do this.

So let's say the judge becomes aware of all this.  Then what?  The usual response is for the judge to suppress all the evidence gathered in this manner.  Now remember that there's nothing wrong with the evidence itself, just the way the cops got their hands on it.  There is still a good societal reason to make the cops do things properly.  In other scenarios it is the defense that does something bad.  No matter which side does it the judge needs to be able to do something to rebalance the scales.

The judge has only a few crude tools with which to do this and suppressing evidence is one of them.  But for this tool to be effective jurors need to go only by what they hear in court.  The action the judge takes is supposed to weaken the side he takes it against.  But if jurors go out on their own time and snoop around they can find out about this sort of thing.  And that means that one of the few tools that a judge has to maintain an overall degree of fairness is lost.  This is an aspect of our judicial system that is much underappreciated.  Nevertheless, it is an important one.

Back to "reasonable doubt".  I keep bringing it back to scenario building.  What if this happened?  The bias is supposed to be in favorable of acquittal.  If there is a "reasonable" scenario that is consistent to the extent possible with the evidence introduced into court then the defendant should be acquitted, at least on that count.  So what's reasonable and what's unreasonable?

I've already tipped my hand on one aspect.  It's unreasonable if it contradicts otherwise uncontradicted evidence introduced at trial.  I also eliminate the magical and the supernatural from consideration.  God could have caused someone to see "this" when "that" is actually what happened.  I don't include divine intervention in what I consider reasonable.  But others might.  I also eliminate interventions that, while not originating from a divinity, do require the violation of the laws of nature as I understand them.

Finally, I expect a scenario worth consideration to be supported to some extent by evidence introduced at trial.  Suppose someone is being tried for bank robbery.  Well it is possible that a rival gang actually pulled the job but, for reasons unknown, planted evidence to point to the defendant.  It's possible.  But was any evidence introduced at trial for, say, the existence of a rival gang with a grudge?  If not, then I don't think such a scenario can be used as a basis for finding reasonable doubt.

But what about something like a "they picked on me" defense?  The Manafort defense argued that the government, as a result of the Trump-Russia investigation, chose to single Manafort out for special scrutiny.  If they hadn't done this they probably would not have noticed his crimes and, thus, not charged him.  I'm of the "if you don't want to do the time then don't do the crime" school in this case.  But I presume the holdout juror bought this argument.

And there are circumstances where I could see myself buying it.  Studies show that drug use is about the same among blacks and whites.  But blacks are something like ten times as likely to end up in jail as a result of a minor drug beef.  In the case of the blacks who were convicted the whole "time - crime" thing definitely fits.  But what is really going on is that cops go looking for blacks involved in drugs while at the same time looking the other way when it comes to whites and drugs.  More importantly to me, Manafort, at considerable expense, assembled a crack legal team.

This option of assembling a crack legal team is not available to most low level black drug offenders.  They do not have the financial resources to do so.  So I would tend to be much more willing to be swayed by a "they picked on me" defense in situations where the defendant has scant financial and other resources, and much less willing in the case of someone who is well heeled and/or well connected.

So the "they picked on me" defense is one where reasonable people can come to opposite conclusions.  Or even the same person (me) can come to different conclusions based on the situation.  And there are other arguments that are not based strictly on the facts of the case that can sometimes be compelling and sometimes not.  And that, at its core, is why there is no agreed on definition of "reasonable doubt".  I invite my readers to spend some time thinking about what definition of "reasonable doubt" works for them.  After all, they might end up on a jury at some point.

And that is one of the best things about the jury system.  I was impressed by the seriousness with which almost all prospective jurors take their duties and responsibilities.  Yet jurors are the cannon fodder of the legal system.  They are often seen more as a necessary evil than anything else.

They get a few bucks and a bus pass for taking a day (or sometimes several days) out of their lives.  They are supposed to shut up, go where they are supposed to go, do what they are told to do, and otherwise just stay out of the way.  They are supposed to know nothing about the case when they first file into the court room.  In a high profile case this eliminates anyone civically inclined enough to follow the news.  They are not supposed to know any of the judges, lawyers, defendants, etc. involved.

They are asked very personal and sometimes embarrassing questions during "voir dire", the questioning before their selection that is supposed to decide their suitability as honest, upright, and responsible citizens.  If the case goes on for more than a day they are supposed to avoid any press coverage.

They are even supposed to ignore what happened right in front of them (and was often said solely for their benefit) if the judge says "please disregard what you just heard".  Lawyers sometimes want to get certain ideas into jurors' heads but the rules of procedure don't give them an opportunity to do so.  So they "make a mistake" and say or do something they are not supposed to.  The toothpaste can't be put back into the tube.  So the judge says, in effect, "pretend the toothpaste never left the tube".  Only he uses the "please disregard" legal formulation instead.  (Manafort's defense team did this and, for the most part, got away with it.)

It's a tough job but one most people are willing and eager to undertake.  Most shirkers shirk only because they literally can't take the time off.  There is no one else to take care of the kids.  Or they would get fired if they took time off work.  In many low wage jobs they would lose a day or more of pay even if they didn't get fired.  This is a big deal for a shocking number of people as they live from paycheck to paycheck.

Yet people show up and they try their hardest to do a good job.  Even the people who screw it up by doing something differently than I would are doing their best to do right as they see it.  That is literally all you can ask of anyone.  It's also far more than we get from many of the people who are supposed to be dedicating themselves to looking out for our best interests to the exclusion of their own.

Thursday, August 16, 2018

A Compact History of Computers

For a long time I have thought of myself as a computer guy.  I took my first computer class in 1966 and immediately fell in love.  I then spent more than forty years closely involved with them, first at school and then at work.  I write blog posts about what interests me.  So you would think I would have written a lot about computers.  But I have actually written less than you might think.  And it turns out I have never directly addressed this subject.

Here's a complete list of my computer and computer-adjacent posts.  (Normally I would include a link for each but since there are so many I am just going to list the publication dates.  You can easily locate all of them by using the handy "Blog Archive" at the right side of each post because it is organized by date.)  So here's the list:  12/21/2010 - Net Neutrality; 2/19/2011 - Artificial Intelligence; 7/30/2013 - Home Networking; 1/24/2014 - Windows 8.1 - try 1; 9/16/2015 - Hard Disks; a 7 part series running from 9/24/2015 to 9/30/2015 on the Internet and networking; and 5/19/2018 - Computer Chips 101.  And I have purposely left one out.  It is my first post on the subject, and the one that is most closely aligned with this one.  On 10/26/2010 I posted "Computers I have Known".  So that's the list.  Now to the subject at hand.

Most accounts of the history of computers credit a machine called ENIAC as the first computer.  There used to be some controversy about this but it has mostly died down.  And I think it is the correct choice.  (I'll tell you why shortly.)  But before I spend time on ENIAC let me devote a very few words to prehistory.

Perhaps the first digital computational device was the abacus and it did influence computer design.  Then a fellow named Charles Babbage designed two very interesting devices, the Difference Engine (1822) and the Analytical Engine (1837).  He never came close to getting either to work but the Analytical Engine included many of the features we now associate with computers.  But, since he was a failure, he and his work quickly fell into obscurity and had no impact on the modern history of computers.  He was eventually rediscovered after computers had been around a while and people went rooting around to see what they could find on the subject.

In the early twentieth century various mechanical calculating devices were developed and saw widespread use.  These gave some hint of what could be done but otherwise had no influence on later developments.  In the years immediately preceding the construction of ENIAC several interesting devices were built.  The Harvard Mark I is given pride of place by some.  The World War II code breaking effort at Bletchley Park in the United Kingdom spawned the creation of a number of "Colossus" machines.  But they were highly classified and so no one who worked on ENIAC or other early computers knew anything about their design or construction.  So where did ENIAC come from?

It arose out of World War II work but not cryptography.  Artillery field pieces came in many designs.  In order for shells to land where you wanted them to, they had to be aimed.  To do this a "firing table" had to be developed for each make and model.  If you want this type of shell to land this many yards away then you need to set the "elevation" of the gun to this many degrees.  Once you had fired the gun with a few settings mathematics could be used to "interpolate" the intermediate values.  But with the many makes and models of guns that saw use and with the other variables involved a lot of mathematical computations were necessary.  The US Army literally couldn't find enough "computers", people (usually women) who could and did perform the necessary mathematical computations, to keep up with the work.
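To give a feel for the arithmetic involved, here is a toy version of the interpolation step in Python (the numbers are invented; real firing tables accounted for wind, shell type, powder charge, and much more):

    # Invented test firings: (range in yards, elevation in degrees).
    measured = [
        (1000, 2.1),
        (2000, 4.6),
        (4000, 10.3),
    ]

    def elevation_for(range_yards):
        """Linearly interpolate between the two bracketing test firings."""
        for (r0, e0), (r1, e1) in zip(measured, measured[1:]):
            if r0 <= range_yards <= r1:
                fraction = (range_yards - r0) / (r1 - r0)
                return e0 + fraction * (e1 - e0)
        raise ValueError("range outside the measured table")

    print(elevation_for(1500))   # 3.35, halfway between 2.1 and 4.6

Each line of a firing table required many such calculations, all done by hand, which is why the Army ran out of human "computers".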

Two University of Pennsylvania Electrical Engineering professors named Eckert and Mauchly set out to solve this problem by building a machine to crank these calculations out quickly and accurately.  They lost the race in the sense that ENIAC was not ready before the end of the War to do what it was designed to do:  crank out firing tables for the Army.  But in the end that didn't matter.  People found lots of other uses for it.  One of the first tasks it completed was a set of computations used in the design of the first Atomic Bombs.

ENIAC was constructed as a set of semi-independent functional units.  There were units for mathematical functions like addition, multiplication, division, and square root.  There were "accumulator" units that could remember a number for a short period of time.  There were units that could be used to feed lists of numbers into the machine or to print results out.  And so on.  And the machine was not programmed in the modern sense.  To perform a calculation you literally wired the output of one unit into the input of another.  Only simple computations, those necessary for the calculation of firing tables, were even possible.

So the first step was to structure the problem so that it was within the capability of the machine.  Then a plan for appropriately wiring the functional units together in order to perform the necessary computations was developed.  Then the functional units were wired together using hundreds, perhaps thousands, of "patch" cables, all according to the specific plan for the current computation.  Then the whole thing was fired up.

It might take a couple of days to design the calculation, a day to wire up the ENIAC, and several hours to repetitively perform the same calculation over and over, feeding a few different numbers into each cycle, so that each cycle calculated, for instance, all the numbers needed to complete one line of the firing table for a specific gun.  ENIAC was able to perform computations at a much faster rate than "computers" (i.e. people) could.  That was amazingly fast at the time but glacially slow compared to modern machines.  But it was a start.

And if this doesn't sound like what you think of when you imagine a computer, you are right.  ENIAC was missing several different kinds of functional units we now expect to find in even the simplest modern computer.  But it rightly deserves its place as "the first computer" because the designs for all the modern devices we now call computers descended directly from ENIAC.

ENIAC was missing three kinds of functional units now deemed essential.  The first one is the simplest, the "bus".  "Bus" is an Electrical Engineering term that far predates ENIAC.  The idea is that you have a bar, wire, or set of wires that connects multiple units together.  All the units share the same bus.  And a bus design allows you to use the bus to connect any functional unit to any other functional unit.  With ENIAC a serial design was used instead.  The approximately forty functional units were laid out side by side (the size of the room dictated that they were actually laid out in the form of a large "U") and only functional units that were close to each other could be connected together.

Later computers had a bus (and often several busses) incorporated into their designs.  This allowed much more flexibility in which functional units could be connected together.  There is a disadvantage to this design idea.  If two functional units are using the bus all other functional units must be disconnected from it.  At any single point in time all but two units are completely cut off from communication.

With the ENIAC design many pairs of functional units could be connected together at the same time.  They always stayed in communication.  But it turned out the flexibility and simplicity of the bus was more advantageous than disadvantageous.  (And designs incorporating multiple buses allow multiple parallel connections, at least in some cases.)  Switching to a bus design from the serial design was an easy change to pull off.

The second type of functional unit ENIAC lacked was memory.  ENIAC did incorporate a small number of "accumulators" but these could only be used to store the intermediate results of a longer, more complex computation.  They couldn't be used for anything else and they were very expensive to build.  Computer designers recognized that memory, lots of memory, was a good idea.  But it took them a long time to find designs that worked.  At first, various "one off" approaches were tried.  Then the "mercury delay line" was invented.

A speaker pushed pulses of sound into one end of a tube filled with mercury.  A microphone at the other end picked up each pulse after it had traveled the length of the tube.  And, since under these circumstances the speed of sound is a constant, it took a predictable amount of time for a specific pulse to travel from speaker to microphone.  The more pulses you wanted to store at the same time the slower things went.  You had to wait for all the other pulses to cycle through before you could pick off the pulse you wanted.  If this design sounds like it reeks of desperation, that's because it did.  But it was the memory technology used by Univac (see below) computers.
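For the programmer types, here is a toy model in Python of why delay line memory was slow (a gross simplification, obviously; real delay lines dealt in acoustic pulses, not lists):

    from collections import deque

    class DelayLine:
        """Bits circulate in a fixed order, like pulses in the mercury tube."""

        def __init__(self, bits):
            self.tube = deque(bits)      # pulse nearest the microphone first

        def tick(self):
            # One pulse reaches the microphone and is re-fed to the speaker.
            self.tube.append(self.tube.popleft())

        def read(self, position):
            # You cannot jump to a pulse; you wait for it to come around.
            for _ in range(position):
                self.tick()              # each tick is real elapsed time
            return self.tube[0]

    line = DelayLine([1, 0, 1, 1, 0, 0, 1, 0])
    print(line.read(5))                  # 0, after waiting five ticks

The longer the line, the more pulses it stored, and the longer the average wait.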

After a lot of work mercury delay lines were supplanted by "ferrite core" memories.  Little magnetic donuts with wires strung through their centers formed the basis of these devices.  By cleverly strobing high power signals through the correct wires a single specific bit could be "set" or "reset".  By cleverly strobing low power signals a single specific bit could be "read".  This technology was faster and it was "random access".  Any individual bit could be read or written at any time.  But it was slow and expensive compared to the modern solution.  The memory problem was only solved when integrated circuit based memory modules were developed.  They allowed large (gigabyte), fast (gigahertz), cheap (less than $100) memories.
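Here is the same sort of toy model for core memory (again much simplified; the points I am trying to illustrate are the grid addressing and the fact that real core reads were destructive, so every read had to be followed by a rewrite):

    class CorePlane:
        """A grid of magnetic donuts, one bit each."""

        def __init__(self, rows, cols):
            self.core = [[0] * cols for _ in range(rows)]

        def write(self, x, y, bit):
            # In hardware, half-current on wire x plus half-current on
            # wire y flips exactly one donut; here we just index it.
            self.core[x][y] = bit

        def read(self, x, y):
            bit = self.core[x][y]
            self.core[x][y] = 0          # sensing the flip wipes the core...
            self.write(x, y, bit)        # ...so the controller writes it back
            return bit

    plane = CorePlane(4, 4)
    plane.write(2, 3, 1)
    print(plane.read(2, 3))              # 1, with the bit quietly restored

Any bit could be reached directly, which is what "random access" buys you over waiting for pulses to cycle through a tube.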

But computers with a small (by current standards) but large (by ENIAC standards) amount of memory were developed within a few years.  That left the logic unit, sometimes called the sequencer.  In ENIAC, functional units were physically connected together using patch cables.  This was a slow and error-prone process.  If the design was changed to incorporate a bus and if each input interface and output interface of each functional unit was connected to the bus then anything could be connected to anything.  But, as I indicated above, only two at a time.

The logic unit sequentially decided to connect this pair to the bus then that pair to the bus, and so on.  This permitted complete flexibility (within the limits of the hardware) in terms of how the functional units were connected together.  Initially this resulted in a slower machine.  But the increased flexibility got rid of all the rewiring time and greatly reduced the planning time.  And it permitted faster simpler designs to be used for the functional units.  In the end this simpler design resulted in faster machines.

And, as the amount of memory available grew, it was quickly determined that the wiring instructions could be stored in memory as a "program".  This required a more complex sequencer as it had to be able to decode each instruction.  But it again speeded up the process of going from problem to results.  It took only a few years for all these pieces to be designed, built, and put to good use.  And the reason for this is one of the prime motivators for this post.

Once the ENIAC was built a lot of the details of its design became widely known almost immediately.  This let people focus on making one aspect of the design better.  They could just plug in the ENIAC design for the rest of their machine.  ENIAC was a revolution.  These other machines were an evolution.  And evolution can move very quickly.

The same thing happened when the Wright Brothers flew the first complete airplane in 1903.  As an example, there was a guy named Curtiss who was a whiz with engines.  The engine in the Wright plane wasn't that good.  But Curtiss could basically take the Wright design and plug his much better engine into it.  So he did.  This resulted in a lot of bad blood and lawsuits but, for the purposes of this discussion, that's beside the point.

The airplane evolved very quickly once a basic design was out there as a foundation to build on.  World War I saw the airplane evolve at warp speed.  Better engines, better wings, better propellers, better everything, were quickly found and incorporated.  The airplane of 1919 bore only a faint resemblance to the airplane of 1914.  And this was possible because different people could come up with different ideas for improving one or another facet of the overall design and then plug them into an existing design.

The same thing happened with computers.  Pretty much every part of ENIAC needed radical improvement.  But, as with airplanes, an improvement in one area could be plugged into an already existing design.  By 1951 everything was in place.  And that allowed the introduction of the first mass production computer, the Univac I.  Before Univac each computer was hand built from a unique design.  But several substantially identical Univac I machines were built.

At this point "peripheral devices" started to proliferate.  The Univac relied primarily on spools of magnetic tape mounted on tape drives.  The drive could under programmatic control speed to a particular place and read or write a relatively large amount of data relatively quickly.  Over time other types of devices were added to the peripheral pool.  And for comparison, the Univac featured 1000 "words" or memory, each big enough to hold a 12 digit number.  And, as with all subsequent designs, both programs and data were sored side by side in this memory.

Univacs were quite expensive and fewer than 50 were ever built.  But they demonstrated the existence of a market.  Other companies quickly jumped in.  The most successful was IBM.  IBM pioneered a number of technical innovations.  They were among the first to hook a "disk drive" to a computer, for instance.  But IBM was the first company to successfully crack the marketing problem.  They were the best at selling computers.

It may seem obvious in retrospect but computers of this era were very expensive.  Soon a lot of companies came to believe that if they didn't get one they would be left in the dust by their competitors.  But the two industries where computers could obviously do the most good were banking and insurance.

Both needed to perform vast numbers of boring and repetitive computations.  And that was just what best fit the capabilities of early computers.  Not to put too fine a point on it, but neither banks nor insurance companies employ large numbers of rocket scientists or other "tech savvy" people.  The people who actually ran these companies, senior executives and members of the board of directors, were scared stiff of computers.

IBM set out to bypass all the people in these companies who would actually be responsible for the installation and operation of computers and instead went directly to these senior executives.  They didn't bother to tout the specifications or capabilities of IBM products.  They knew these people were not capable of understanding them nor did they much care.  What concerned them was "betting the company".

They were afraid that they would end up spending a ton of money on a computer.  Then something terrible would happen involving that computer and the company would go down in flames, all because of something that was beyond the understanding of senior management.  What IBM told these senior executives was "if you buy IBM we will take care of you.  If something goes wrong we will swoop in and fix whatever it is.  Oh, it might cost more money than you had planned on spending, but we promise you that if you go with IBM you will not be putting your company's very existence (and by implication the livelihood of these senior executives) in jeopardy".

And it worked.  In case after case the lower level people would, for instance, say "we recommend GE" or "we recommend RCA".  At the time both GE and RCA were as large as or larger than IBM.  And both had well established reputations for their technological prowess.  But none of the other companies (and there were several besides GE and RCA) aimed their sales pitches so squarely at senior management.  And in case after case the word came down from on high to "buy IBM anyhow".

And companies did.  By the late '60s 80 cents of every computer dollar was going to IBM.  It wasn't that their hardware was better.  It was better in some ways and worse in some ways than the equipment offered by other companies.  But it was good enough.  A saying from the period had it that "no one ever got fired for recommending IBM".  That was true.  And the converse was also true.  People sometimes got fired or got their careers sidetracked for recommending a brand other than IBM.

It took a long time for the computer industry to recover from the total dominance that IBM held for more than a decade.  But there was one technical innovation that was rolled out by IBM and others at this time that is important to talk about.  That's microcode.

The logic unit/sequencer was by far the most complex and difficult part of a computer to design and build.  It had to take the bit pattern that represented an instruction, break it down into steps of "connect these components to the bus, now connect those components to the bus, now connect these other components to the bus", etc.  It turned out that there were a lot of considerations that went into selecting the correct sequence.  And that made this particular component extremely hard to design and build.  Then somebody (actually several somebodies) had an idea.

What had made the original ENIAC so hard to deal with?  The fact that it had to have customized hand wiring done to it each time a new problem was put to it.  Well, the spaghetti that the sequencer had to deal with seemed similarly complicated.  And if you got something wrong the computer would spit out the wrong answer.  In the ENIAC case you just stopped it, fixed the wiring, and ran the calculation over again.  But once the computer was built there was no even remotely easy way to fix problems with the sequencer.

So several people at about the same time said "why don't we create a computer to run the computer?"  It will run a single special program called "microcode".  If there is a problem and we can change the microcode then we can fix the problem.  And that meant that the sequencer hardware became much simpler.  A lot of the complexity could be exported to the design and coding of the "microcode" program.  And the microcode for a new computer could be emulated on an old computer.  So it could be extensively tested before anything was committed to hardware.

This too sounded like it would slow things down immensely.  But it did not.  The simpler sequencer hardware could be optimized to run much faster than the old more complicated design.  And other tricks were found to make the whole thing go fast in just the same way that replacing the patch cable wiring of ENIAC with a bus and memory resident programs eventually resulted in an increase in computer speed.  By the end of the '60s pretty much all computers used microcode.
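To make the microcode idea concrete, here is a toy sketch in Python (my own invention; it resembles no real machine, but it shows how one instruction gets expanded into a sequence of "connect this pair to the bus" steps):

    # Functional units, each holding one number.
    units = {"A": 7, "B": 5, "ALU_IN1": 0, "ALU_IN2": 0, "ALU_OUT": 0}

    def alu_add():
        units["ALU_OUT"] = units["ALU_IN1"] + units["ALU_IN2"]

    # The "microcode": each instruction becomes a list of micro-steps.
    # Only one pair of units is on the bus at any moment.
    MICROCODE = {
        "ADD": [("A", "ALU_IN1"), ("B", "ALU_IN2"), ("alu", None), ("ALU_OUT", "A")],
    }

    def run(instruction):
        for source, destination in MICROCODE[instruction]:
            if source == "alu":
                alu_add()                           # internal step, bus idle
            else:
                units[destination] = units[source]  # one bus transfer per step

    run("ADD")
    print(units["A"])   # 12: A <- A + B, built out of single-bus steps

Fixing a bug in a scheme like this means editing the MICROCODE table, not rewiring (or rebuilding) the sequencer hardware.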

Later, ways were found to house the microcode in hardware that allowed it to be updated on the fly.  This meant that microcode fixes could be rolled out well after the hardware was originally manufactured.  Some computer designs have evolved to the point where there are two levels of microcode.  There is a lower level, call it pico-code, that allows the hardware to run multiple versions of microcode that, in turn, implement what appears to be the computer.  But this three level architecture is the exception rather than the rule.

The next thing I want to talk about is Microsoft.  Bill Gates was pretty much the first person to figure out that the money was in the software, not the hardware.  When IBM rolled out its "System 360" family of computers in the mid '60s it literally gave away the software.  Their thinking was that the value was in the hardware.  And most computer companies followed IBM's lead.  Hardware was the high value profit-maker and software was a loss leader afterthought that you threw in because it was necessary.  Gates was the first person to focus on the word "necessary".

Microsoft was a software company from the get go.  Their first product was a BASIC interpreter for the first generation of personal computers.  At the time you were expected to buy a kit of parts and assemble it yourself.  But almost immediately it became obvious that people were willing to pay extra for a pre-assembled computer that they could just fire up and use.  Either way, however, they still needed Microsoft's BASIC.

Microsoft branched out to other software products, most famously MS-DOS and later Windows.  And they do sell a small line of hardware:  keyboards, mice, the odd tablet, etc.  But, unlike Apple, Microsoft has not seriously (some would say successfully) gotten into the hardware business.  Once PCs took off in a big way Microsoft took off in a big way.  And many other companies have successfully followed this model.  Even Apple has outsourced the manufacture of its extensive line of hardware.  They still do, however, do their own hardware design work.  And they never outsourced their software work.

And even "it's the hardware, stupid" types like IBM have been forced to follow suit.  They were initially forced by anti-trust litigation to start selling their "System 360" software.  From this modest start they have continued to evolve away from hardware to the point where they are now almost entirely a services company.  Over the years they have sold off or shut down most but not quite all of their once very extensive hardware business.  So they do still sell some hardware but it now represents a very small part of their total revenue.

I now want to turn to a product that has been pretty much forgotten.  A man named Philippe Kahn started a company called Borland at about the time the first IBM PC was released in 1981.  In 1984 he released a product called Turbo Pascal.  You could buy the basic version for $49.95 or the deluxe version for $99.95.  It was organized around a once popular computer language called Pascal that has pretty much fallen out of favor.  I am not going to get into the differences between Pascal and the well known "C" programming language.  One is better than the other in this or that area but, overall, they are actually pretty similar.  So what did you get for your $49.95 (the version most people bought)?

You got an "integrated development" package.  You could use it to write or modify a Pascal program.  You could then literally push a button and your program would be compiled (turned from a vaguely English-like thing that people could deal with into programming instructions that the computer could deal with).  And the Pascal compiler was lightning fast, even on the PC of this era.  (The process typically took only a few seconds.)  Then (assuming compiler had come across no obvious errors in your program) you could push another button and run your program.

If errors were found by the compiler you were automatically popped back into the "edit" environment.  You could make changes and then immediately recompile your program.  And the package offered similar options for fixing your program after it had compiled cleanly.  If your program seemed to be misbehaving you could run it in a special "debug" mode.  This allowed you to work your way through the execution of your program a line at a time.  You could even examine the current value of variables you had defined for your program to work with.

Once you had seen enough you could pop back to "edit" mode, make modifications, and go through the whole compile/execute/debug cycle over and over, as many times as needed to get your program working the way you wanted it to.  Then you could sell your program.  And you could sell just the executable version, which did not disclose the original Pascal "source code" of your program.

With Turbo Pascal and a PC you could go from edit to compile to debug and back to edit within minutes.  This had a profound impact on computer software.  ENIAC required smart, skilled, highly trained people to operate it.  Univac made things easier but it was still very hard.  The IBM 360 made things still easier but the cost and skill level was still very high.  And a single edit/compile/execute/debug cycle could often take all day on any of these machines.

Then there was the snobbery.  The bank I worked for in the late '60s required all of their computer programmers to have a 4 year college degree.  It was presumed that only very smart people (i.e. college educated) were up to the task.  But with Turbo Pascal a whole crop of housewives, clerks, blue collar workers, and kids were able to master the tool and create interesting, useful, and, most importantly, valuable computer programs.

It completely democratized the whole software development process.  It turns out that the only attributes a person needed to become successful in the computer business were a knack for computers, a little training (the documentation that came with the Turbo Pascal package consisted primarily of a single not very big book), and access to now quite inexpensive and ubiquitous home computer equipment.  Not everybody is cut out to be a computer expert but a surprising number of people can master the subject.

And that's about where I would like to leave it off.  Pretty much everything that has happened since is the extension of a trend or movement started during the time period I have covered.  Computers have now gotten faster, more powerful, lighter, more compact, and more portable.  But that's just more of the same.

The hardware has gone from vacuum tubes (essentially light bulbs with extra wires in them) to discrete transistors to integrated circuits (the ubiquitous "chip") but integrated circuits were in wide use before 1980.  Even the Internet is an extension of and an enhancement to the ARPANET, a project that was begun in the late '60s.  And it turns out that people had been connecting computers to networks since well before ARPANET.

I would like to leave you with one last item, well, more of a musing.  Since the early days computer components have been divided between hardware and software.  The idea is that the actual parts used to assemble a computer are hard or, more importantly, hard to change.  Computer programs, on the other hand, are soft.  They are malleable and easy to change.  But it turns out that actually the opposite is true.  Hardware is easy to change and software is hard to change.

IBM pioneered the idea of an "architecture" in the early '60s when they designed the System 360 family of computers.  Before this every time you upgraded to a new computer you had to redo all the software.  It was presumed that this would not be a difficult or expensive process.  But over time it turned out to become more and more difficult and more and more expensive.

With that in mind IBM designed a family of machines that would all be capable of running the same programs.  They specified an "architecture" that all the machines would adhere to.  The usual reason people replaced computers was because, in the words of an old TV show, they needed "more power".  With the System 360 you could just replace your smaller, less powerful (and less expensive) computer with a bigger one that had more power.  IBM guaranteed you didn't have to change a thing.  All the old software would run just fine on the new hardware.  It would just run faster.

IBM spent a tremendous amount of effort on making sure that the "360 architecture" was implemented uniformly and consistently across all machines in the family.  One of their best people, a guy named Iverson, applied the computer language he had just invented (APL, if you care) to creating models of key components of the architecture that were accurate down to the bit level.  And it worked.

A few years later IBM came out with an upgrade called the "System 370" that was carefully designed to be "backward compatible" with the 360 architecture.  The new line offered additional features but things were carefully arranged so that the old programs would work just fine on the new machines.  So companies were able to easily upgrade to the new machines that, of course, featured more power, without a hitch.

And this became the model for the industry.  The hardware descendants of the System 360 no longer exist.  But software written to the 360 architecture standard (and often quite a long time ago) is still running.  I know because as I am going about my daily business I see all kinds of companies running what I can positively identify as 360 architecture software.  This is made possible by microcode.

Microcode makes it possible for hardware to behave in completely unnatural ways.  The hardware that now runs these 360 architecture programs is the many-times descendant of something called a System 38.  The original System 38 bore no resemblance to the grandchildren of the System 360 machines that were in existence at the same time.  But that no longer matters.

In fact, hardware has come a long way since the '60s.  But thanks to microcode the newest hardware can be made to faithfully implement the 360 architecture rules so that the old programs still run and still behave just as their programmers intended them to.  And this is in spite of the fact that the hardware that is doing this descended from hardware that was completely incompatible with System 360 family hardware.

Intel developed the first versions of the X-86 architecture in about 1980.  The modern computer chips Intel now sells bear almost no physical resemblance to those chips of yesteryear.  Yet X-86 software still runs on them and runs correctly.  Apple started out using a particular computer chip made by Motorola.  They later changed to a newer, more powerful, and quite different Motorola chip.  Yet they managed to keep the old Apple software running.  Then they made an even more drastic change.  They changed from the Motorola chip to an Intel X-86 family chip.  But they still managed to keep that old software running and running correctly.

It turns out that any individual program, i.e. "piece of software", is fairly easy to change.  But families and suites of software quickly arose.  Many examples of this could already be found when IBM was sitting down to design the System 360 computers.  And these families and suites turned out to be very hard to change.  They behaved more like what people traditionally associated with hardware.  On the other hand, people got very good at making one kind of hardware "emulate", i.e. behave exactly the same as, another quite different kind of hardware.  So hardware started behaving more like what people traditionally associated with software.
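
If you have never seen emulation up close, a minimal sketch may help.  The Python below defines a made-up four instruction "guest" machine (nothing like the real 360 instruction set, or any other real one) and then makes the host, whatever machine happens to be running the Python, behave exactly like it.  Real emulators and microcode engines are enormously more elaborate, but the fetch-decode-execute loop at the core is the same idea.

    # A minimal sketch of emulation: a fetch-decode-execute loop that makes
    # the host behave like a made-up "guest" machine.  The guest here is
    # hypothetical -- four instructions, four registers -- not any real
    # architecture.

    LOAD, ADD, JNZ, HALT = range(4)

    def run(program):
        regs = [0, 0, 0, 0]
        pc = 0
        while True:
            op, a, b = program[pc]    # fetch and decode one instruction
            pc += 1
            if op == LOAD:            # LOAD r, n : regs[r] = n
                regs[a] = b
            elif op == ADD:           # ADD r, s : regs[r] += regs[s]
                regs[a] += regs[b]
            elif op == JNZ:           # JNZ r, addr : jump if regs[r] != 0
                if regs[a] != 0:
                    pc = b
            elif op == HALT:
                return regs

    # Sum 1..5 the hard way: r0 = counter, r1 = accumulator, r2 = step of -1.
    prog = [
        (LOAD, 0, 5), (LOAD, 1, 0), (LOAD, 2, -1),
        (ADD, 1, 0),                  # r1 += r0
        (ADD, 0, 2),                  # r0 -= 1
        (JNZ, 0, 3),                  # loop back while r0 != 0
        (HALT, 0, 0),
    ]
    print(run(prog)[1])               # 15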

This "hardware is soft and software is hard" thing has been going on now for at least a half a century.  But people got used to the old terminology so we still use it.  Is black now white?  Is up now down?  Apparently not.  But apparently Orange is the new Black.

Saturday, August 11, 2018

Global Warming

I haven't addressed this subject in any detail before because for a long time I believed I didn't have anything original to say.  But then I had an epiphany a few days ago.  Most of what is said about the subject boils down to one of two things.  The deniers say "it's all a hoax so you can just ignore the whole thing".  The people who believe it's real say "we have to put things back to exactly the way they were before while there's still time".  Both sides are wrong but the deniers are far more wrong than the believers.

What the believers get right is the fact that it's real.  What they get wrong, at least in what is widely covered in the popular press, is the "okay, so now what?" part.  I am first going to spend a very little time rebutting the deniers.  Then I am going to spend the rest of this post on the "now what" part.

There are dozens of well supported lines of evidence that demonstrate that the deniers are wrong.  I am going to confine myself to only one of them and it is a line of evidence that can be summarized in two words:  "Northwest Passage".

Interest in the Northwest Passage is economically driven.  The most efficient way to move goods, particularly heavy goods, is by boat.  It takes very little energy to move a large ship through water at moderate speed, even when the vessel is loaded with a large volume or weight of goods.  So where possible shippers transport goods in freighters, ships specially designed to carry goods from here to there.

Nevertheless, there are costs, both in time and money, associated with shipping goods, say, from China to the East Coast of the US.  If the trip can be shortened, either in time or distance, money, a lot of money, can be saved.  It would be really nice (as in "save a lot of time and money") if goods could be gotten from one side of the Americas to the other by a much shorter route than has historically been available.
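
Some back-of-the-envelope arithmetic shows why shippers care so much.  All of the numbers below are assumptions I picked for roundness, not real route or cost data, but they give a feel for the scale.

    # How much is a shortcut worth?  Illustrative assumptions only.
    distance_saved_nm = 2_000    # assumed shortcut, in nautical miles
    speed_knots = 15             # a typical-ish freighter cruising speed
    daily_cost_usd = 50_000      # assumed daily operating cost of the ship

    days_saved = distance_saved_nm / speed_knots / 24
    print(f"{days_saved:.1f} days saved per voyage")       # ~5.6 days
    print(f"${days_saved * daily_cost_usd:,.0f} saved")    # ~$277,778

Multiply that by hundreds of voyages per year and you can see why people have spent centuries looking for shortcuts.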

This was true two hundred years ago when sailing ships had to go south around Africa or South America to get from the Pacific to the Atlantic, or vice versa.  A little over a hundred years ago the Panama Canal was built to provide just such a short cut.  It was fantastically difficult and expensive to build and moderately expensive to operate but it was a bargain anyhow.

But the Panama Canal could only accommodate ships up to a certain size ("Panamax") and it was a hassle to transit the Canal even if your ship fit.  Nevertheless the Canal was a roaring success from the day it opened.  It was such a success that Panama has recently completed a large engineering project to upgrade the Canal so that larger, so called "Neopanamax" ships can now use it.

But what if there was a passage around the north end of the Americas?  People recognized that this would be a very valuable addition to the options for getting from one side of the Americas to the other.  So they started looking for a "Northwest Passage" around the north end of the Americas several hundred years ago.  Many expeditions set out to find such a route and many lives were lost in the search.  As recently as 1969 Humble Oil (an Exxon predecessor) sent the SS Manhattan, a specially modified oil tanker, to force such a passage.  It got through, but only with icebreaker help, and the route was judged commercially impractical.

And then something magic happened.  The Northwest Passage opened up one summer a few years ago.  After some fits and starts (it didn't open up every year for a while) it is now a standard feature of Arctic summers.  Enough of the pack ice that surrounds the North Pole melts every summer that an ice free passage opens up around the north end of North America.  It has now been several years since we have had a summer in which it didn't open up.  So what changed?

Global Warming changed.  If a Northwest Passage had opened up at any time in the last 250 years someone would have noticed.  No one noticed until a few years ago because it did not happen before then.  But now the far north is a lot warmer than it used to be.  It is enough warmer that the ice only a little south of the North Pole melts around the edges enough to produce open water every summer.  (FYI, the North Pole sits in the middle of something called the Arctic Ocean.  Underneath all that ice is ocean water.)

Anyhow, enough ice melts that there is an ice free path all the way around the north end of North America every summer.  It now shows up regularly enough that shipping companies can depend on its appearance and route freighters through the Northwest Passage for a while every summer.  The situation is now so predictable that some adventurous recreational boaters transit it for fun.

And only a large and sustained warming of the entire north could cause the Northwest Passage to open up at all.  And all the weather data supports the idea that the entire north is now much warmer than it used to be.  In theory the warming could be confined to the north.  But it isn't.  So the "Northern Warming" is just part of a much larger "Global Warming" phenomenon.

And with that I am going to leave that subject and move on to the subject of "so what".  Given a chance, scientists would primarily focus on this question.  They would ask the twin questions of "how big a deal is it?" and "what can be done about it?"  But the well organized "deniers" operation has been so successful at sowing confusion and distrust that most scientists are forced to spend most of their public facing time trying to break through the denial.  This leaves them with little time for these other questions.  But they have spent some time on both of them.

I am also going to give the "how big a deal is it" question short shrift.  It's a really big deal.  As I write this giant wildfires are burning in California and elsewhere.  And large populations are being subjected to record heat which is causing a lot of misery and some deaths.

Scientists have been making and tracking predictions of such things as average temperature rise and average ocean rise for a couple of decades.  It has turned out that they have generally underestimated the amount of change.  The actual numbers have come in mostly toward the "high" (bad) end of their forecasts.  This is because scientists have bent over backward to not exaggerate the amount of change they expect.  Even so they tend to get beaten up for forecasting any change at all.

The whole impetus behind the manufactured "controversy" about Global Warming is driven by the question I am now going to focus on:  now what?  If we accept the reality of Global Warming we are forced to also accept that bad things are going to happen.  And the obvious way to avoid bad things happening is to change things so that Global Warming doesn't happen.  And those changes don't affect everybody equally.  People don't like change.  Large businesses that depend on things being a certain way, and most large businesses find that this applies to them, don't want to change because it looks like the required change would be detrimental to their interests.

There are a lot of smart people working for the fossil fuel industry, especially the oil industry.  It took these people no time at all to figure out that Global Warming was bad for business as usual for these industries.  And there was no simple, reliable, obvious way to cash in on the changes they foresaw so it was all bad news.

So it should come as no surprise that funding for and support of the "denialist industrial complex" has come primarily from Exxon and other large oil companies.  They quickly figured out that if they could delay things, even for a while, a lot of money could be made.  This is a model pioneered by the tobacco industry.  They put off effective regulation of cigarettes for decades and made tons of money as a result.

But big oil is not alone.  Global Warming will eventually force change onto all of us.  And from our current vantage points it looks like the change will be more for the bad than for the good.  So why shouldn't people follow in the footsteps of big oil, and before them big tobacco?  A lot of people have consciously or unconsciously decided "what's good for big oil is good for me" and joined the denialist camp.

And for a long time it was hard to come up with a counter argument.  When Al Gore put out his movie "An Inconvenient Truth" in 2006 he only devoted a few minutes at the end to what could be done to improve things.  The rest of the movie was well constructed and very convincing.  But I found this end section not very convincing.  My reaction boiled down to "oh, shit - we're in for it now".

Fortunately, things now look a lot better.  Things that looked like Science Fiction then (cheap wind power, cheap solar power) are Business Reality now.  Solar and wind are now your cheapest alternatives if you need to bring new electrical generating capacity online.  So there is now some light at the end of the tunnel but it is pretty dim and looks to be a long ways away.

In addressing the impact of Global Warming let me step back for a minute.  I recently read a book about Chief Seattle, the man the city I live in is named after.  It went into some detail about how local Indians lived before the white man arrived.  They were (to somewhat oversimplify) hunter gatherers.  Over the course of a year they would move several times following the resources.  When the berries were ripe they lived here.  When the salmon were running they lived there.  That sort of thing.  Now imagine that the environment changed.  If the carrying capacity of the overall resource base stayed the same they would just have shifted their annual migratory pattern a little and life would have gone on pretty much the same.

But as the intensity with which mankind exploits resources has increased we have lost the kind of flexibility hunter gatherers had.  We now expect things to be a certain way with a high degree of precision.  Take the Colorado River as an example.  It flows through a large area that is naturally very dry.  But our modern way of living demands a lot of water.  And for a lot of communities the Colorado River was the place to go to satisfy that need.  It quickly became obvious that if everybody drew all the water they wanted the Colorado, a rather large river, would go dry.

So, early in the last century, federal engineers did a study and determined the average annual flow of the Colorado.  This number was the foundation of a large number of agreements as to who would get how much of the Colorado's flow.  I am going to skip over a lot of hanky-panky and focus on just one observation.  The study was flawed in a critical way.  It just happened to encompass a time period in which the flow was unusually high.  So the number that is the foundation of all those agreements, the average annual flow of the Colorado, is too high.

But literally millions of people and many businesses large and small now depend critically on getting the full amount of water the agreements promise.  They are now locked into a critical number that the river cannot reliably deliver.  In our modern world there is no simple and low impact change like redoing an annual migratory pattern.  If nothing else, most people don't routinely pick up and move several times per year.

We see this rigidity show up all over the place.  The North Slope, in the far north of Alaska, is home to a bunch of the oil equipment necessary to operate the Alaska oil pipeline.  The equipment depends on the permafrost in the area staying frozen all year round.  It did a couple of decades ago when all this was set up.  But it now thaws at the surface every summer because the north is a lot warmer than it used to be.  That turns out to be a giant PITA.

In my neck of the woods we depend on glaciers.  Glaciers store up a lot of water that comes down in the mountains in the form of winter snow.  Then they release it all summer long as the warm weather causes the snow to slowly melt.  But the glaciers are shrinking so we get more flooding in the spring (less snow stays in the mountains where we want it) and water shortages in the summer (less stream flow from smaller glaciers).

These are just two easily explained examples I can come up with off the top of my head.  But in ways known and unknown we as a society depend on things staying precisely as we expect them.  But Global Warming is changing things.  And it's not just Global Warming.  People forget that we as a society are changing lots of things all the time.  (And the deniers are happy to confuse things by mixing it all together.)

There is a sad story about a mother Killer Whale playing itself out as I write this.  Her newborn calf died, likely of malnutrition.  She has been carrying it around with her for a couple of weeks now and this behavior will likely result in her death.  What's going on?  Well, we have overfished for decades and we have thrown dams across rivers that salmon spawn in, also for decades.  The result is small salmon runs and that results in starving Killer Whales.

Neither the dams nor the overfishing have anything to do with Global Warming.  Nor does draining swamps.  But we have also been doing lots of that for a long time.  Back in the day if a stream or river rose much above its normal level it would overflow into swampland and the severity of the flooding was limited.  But we have drained the swamps and put up levees.  So now when the flow increases it has nowhere to go.  So we get floods of a size and frequency that never occurred before.  Again, we have done two things.  We have diminished the ability of nature to moderate extreme events.  And we have diminished our ability to ride through these kinds of events without damage.

So this means we're all going to die, right?  No!  An "existential" crisis is one that might wipe us all out.  Global Warming will not wipe us all out.  It is not a "meteor crashing into the Yucatan Peninsula and wiping out all of the dinosaurs" kind of thing.  A better description of its impact is "inconvenient".  Things are going to change whether we like it or not.  And we won't like it.  But it won't wipe us out or even come close.  That's the good news.  So what's the bad news?

Well, as one might expect there's lots of bad news.  And the first piece of bad news is it's inevitable.  We have long since passed the point where it is possible to avoid Global Warming entirely.  But wait, there's more.  It hasn't even gotten as bad as it is going to get.  Imagine you are stopped at a light in your car and the light turns green.  You mash on the gas pedal and the car takes off.  (I know you might not mash on the gas pedal but go along with me to the extent of pretending that you do.)  The car quickly accelerates and is soon going at a pretty good speed.

Now let's say you take your foot off the gas.  What happens?  The car keeps going.  It may start slowing down but if you don't put your foot on the brake it is going to continue on for a good distance.  This is inertia in action.  And it turns out the environment has a lot of inertia built into it.  So even if we stop doing the things that produce Global Warming, the inertia already in the system will keep working for quite a while.  And it will continue to make things worse.

But that's not what we have done and are now doing.  We have not taken our foot off the Global Warming gas pedal.  At best what we have done is backed off a little so the pedal is no longer all the way down to the floor.  It's just most of the way down to the floor.  As a result Global Warming is still building up speed.  It's just building up speed a little more slowly than it would if the pedal were still mashed to the floor, if we hadn't installed all that solar and wind generating capacity, for instance.  We have also done some other things like switching to cars that pollute less.  But all this together has just backed things off a little.  And it certainly is a long way from anything that looks like applying the brakes.

The second piece of bad news is that Global Warming does not affect everything equally.  When scientists started trying to predict the effects of Global Warming they had only simple tools.  So they did what they could do.  They developed a number of "global average" numbers.  The global average temperature will go up this much.  The global average sea level rise will be this much.  That sort of thing.  But it turns out no place is average.

The first big example of this is the Arctic, the location of the Northwest Passage and all that other stuff I was talking about.  Scientists noticed that it was warming faster than the rest of the world.  They have now spent a lot of time studying this.  I am not going to go into the details but it turns out that the Arctic is more sensitive than other areas.

Scientists now think they have a lot of this additional sensitivity figured out but the bottom line is that every place is special.  So it is going to be more sensitive to this and less sensitive to that.  Scientists have now turned to trying to generate local forecasts that are tailored to the specifics of local areas.

Let me give you just one example.  The ocean is warming.  That, together with some other things, produces sea level rise.  But the ocean is like a giant bathtub.  The water level rise will be the same everywhere, right?  Wrong.  It turns out that the ocean is a lot more complicated than a giant bathtub.  As a result (the details are too complicated to get into so I'm going to spare you them) some places will see more sea level rise and other places will see less.

Let's be honest.  We mostly care about our local area.  So these customized local forecasts, which are only now starting to be rolled out, are of great interest to the people who live in a specific area.  But they are also critically important to local governments and the like.  If they know they don't have to worry so much about this but they do need to seriously worry about that then they can make wiser decisions.

So what is going to happen?  The most obvious answer is that it is going to get warmer.  The temperature will go up more in some places and less in others but it is going to go up.  That's pretty obvious.  There has also been a lot of talk about sea level rise.  Again, this will be more of a problem in some places and less in others.  And, as Superstorm Sandy amply demonstrated, a little sea level rise can have an effect all out of proportion to the average number of inches the water rises.

So affected areas are going to have to change.  People have spent big bucks on beachfront property because they want very much to be there.  They are not going to be forced out easily.  But proofing their properties against changing conditions or repairing the damage that has and will be done will be fantastically expensive.

And this is typical.  All the trade offs will be bad.  The choice will boil down to which variety of horrible tasting medicine you are going to end up swallowing.  The good news is that there is a medicine that will work.  It just tastes horrible.  It tastes so bad that no one will voluntarily take it.  But at some point we will all be forced to involuntarily swallow one or more horrible medicines.

Scientists and others have been saying for a long time now that "this medicine might seem like it tastes horrible but if you don't take it now you will be forced to later take medicine that tastes even more horrible".  So far this argument, true though it is, has not gained traction.  So what do these even more horrible medicines look like?

I've already mentioned one of them.  As the sea level rises land that is now much sought after will become unusable.  Mostly we have used flood insurance to cushion the cost of rebuilding and Army Corps of Engineers projects to try to protect properties from future damage.  But properties keep getting wiped out necessitating a round of flood insurance payments (your tax dollars and mine at work), often for the same piece of property.  And Corps projects (more of your tax dollars and mine at work) fail to provide adequate protection.  At some point taxpayers are going to revolt and the money spigot will get turned off.  At that point lots of people will be forced to abandon their property and their losses will not be made up by government or insurance payments.

We have seen the slow squeeze going on with the Colorado River for some time now.  Phoenix is deep in the heart of libertarian "keep your government regulations away from me" country.  Yet Phoenix has draconian water use regulations.  The fact that Colorado River flows have been inadequate to support water deliveries in the volumes that people would prefer for a fairly long period of time has forced this behavior.  And Phoenix residents have decided that somehow these regulations are a feature not a bug in their overall libertarian beliefs.  This sort of thing (draconian regulations) is going to become much more widespread in the future.

Phoenix now routinely sees periods of triple digit temperatures every summer.  They have responded by going "all air conditioning all the time".  That's an accommodation but it means that there are large parts of the year when it is not practical to be out in the open for a good part of the day and perhaps much of the evening.  So far people are putting up with this.  But at some point if things continue on their current trajectory people will decide to relocate to more hospitable climes.

Global Warming and other man-made activities are killing off lots of plants and animals.  There has been a lot of recent coverage about what increased water temperatures are doing to corals (bad, very bad), for instance.  This trend will inevitably continue.  And it has generated numerous "save the [fill in the blank]" responses.  But the response has been mostly ineffective.  Some individual species that have attracted a lot of attention and effort have been brought back from the brink.  But we are losing boring species at a fast and increasing rate.  I see no practical way to reverse this.  And I ask a very fundamental question:  what's the value of species diversity?

There is no real answer to this question.  In general people are sure reducing species diversity is bad but have nothing specific they can point to as a justification.  There is a generic "we don't know what we may need in the future" argument.  But again there is nothing specific people can point to.  The practical justification for a small number of species (e.g. Bald Eagles) is that people like and want to preserve them "just because".  And in a limited number of cases that works.  But for the most part species just go extinct.

So what's the "get out of jail free" card for this problem?  It turns out there is one.  And there is an acronym for it.  The acronym is GMO, which stands for Genetically Modified Organism.  Humans have gotten very good at tweaking plants and animals.  And it's early days.  We are going to continue to get better and better at this.  So if we need some new critter or we need to tweak some characteristic of a current critter we have, or will soon have, the technology to do that.

There is, however, a well organized anti-GMO community.  I think the right gets way more things wrong than the left does but this is one case where the left is more wrong than the right.  There are anti-GMO factions embedded within both the right and the left.  But they are currently much more able to successfully advance their agenda with the left than with the right.  One of the things we are going to need to swallow, whether we want to or not, as Global Warming intrudes further into our lives, is GMO, a lot of GMO.  That is going to make many people very unhappy.

And I have mentioned ice melting and forest fires.  So Global Warming means more droughts, right?  Well, yes in some places (remember the part about effects being unevenly distributed) but mostly it will mean the opposite.  If you warm water and the air above it you get more evaporation.  And "evaporation" is just a fancy term for putting more water vapor into the air.  And that additional water vapor will eventually return to earth in the form of additional rain and snow.
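
That claim can be made quantitative.  A standard result (the Clausius-Clapeyron relation) says the amount of water vapor air can hold rises roughly 6-7% per degree Celsius of warming.  The sketch below uses the Magnus approximation for saturation vapor pressure; the constants are the commonly quoted ones, though different references vary slightly.

    import math

    # Saturation vapor pressure in hPa via the Magnus approximation.
    def saturation_vapor_pressure(t_celsius):
        return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

    for t in (15, 20, 25):
        print(f"{t} C: {saturation_vapor_pressure(t):.1f} hPa")

    # How much more moisture can the air hold per degree of warming?
    increase = saturation_vapor_pressure(16) / saturation_vapor_pressure(15) - 1
    print(f"about {increase:.0%} more per degree")   # ~7%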

The world will, on average, become a wetter place.  And, of course, the change will be distributed unevenly.  So some places will get a lot more rain and/or snow.  Some will get a little more or the same or a little less.  And some places will get a lot less.  The short version of all this is that scientists predict more extreme weather events.

But people have been coping with bad weather for a long time.  So it will take some adjustment but, from a "big picture" perspective, most of the time it will be more of an inconvenience than a catastrophe.  So what else?

There is the direct effect of things generally being warmer.  Again, the increase in warmth will be greater in some areas than in others.  Phoenix used to be a wonderful place to live, weather-wise.  It had moderate winters, very pleasant springs and autumns and, unfortunately, hot summers.  But the summers used to be merely annoyingly hot.  Now summers often feature several periods when the weather is unbearably hot.

But is it?  Unbearably hot, that is.  The answer is yes if you are acclimatized to the temperate zones.  But humans in parts of Africa have become adapted to living in the outdoors in temperatures that are just as hot as a Phoenix hot spell.   They tend to be tall and thin.  They have very dark, almost black, skin.  And they have short curly hair.  Finally, they wear almost nothing.  They are completely functional in conditions that would render me hors de combat.

What that means is that if we GMO people we can in one generation turn them into people who can thrive in an extremely hot climate (or, if we prefer, a climate that is merely somewhat warmer).  Do most people find this an adaptation they are comfortable with?  Hell, no!  But would it work?  Yes, actually it would.

But if we look at the parts of Africa where people who are adapted to a very hot climate live, they are places that are not very productive from an agricultural perspective.  And that's one of the biggest concerns Global Warming believers have, food adequacy.  If the world average agricultural productivity was reduced to what's found in these areas something like 6 billion people would die of starvation.  That's really bad even if you are one of the billion or so survivors.

If we wait long enough nature will adapt.  But we are already seeing problems all over the place in the short run.  The ranges of many of our high volume foodstuffs like wheat, corn, and rice, are shrinking.  And that means concerns about there being enough food to go around are rising.

Again that whole "fine tuning" thing kicks in.  Humans have developed strains of wheat, corn, and rice that produce bounteous harvests if grown under the conditions they are adapted to.  But these strains are inflexible.  If you change conditions a little the size of the harvest drops a lot.  This inflexibility was deliberate.  By eliminating the parts of the plant that would allow it to thrive under a broad range of conditions, breeders were able to substantially increase harvests for the narrow set of conditions the plant actually encounters.
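
A toy model makes the trade off concrete.  In the sketch below (all numbers invented for illustration) a "specialist" strain out-yields a "generalist" strain handily under the conditions it was bred for, then collapses when conditions drift a few degrees.

    import math

    # Yield falls off as growing conditions drift away from what a strain
    # was bred for.  Peak yields and tolerances are invented for illustration.
    def yield_tons(temp_c, peak, adapted_temp, tolerance):
        return peak * math.exp(-((temp_c - adapted_temp) ** 2) / (2 * tolerance ** 2))

    for temp in (20, 24):
        specialist = yield_tons(temp, peak=10, adapted_temp=20, tolerance=1.5)
        generalist = yield_tons(temp, peak=6, adapted_temp=20, tolerance=5.0)
        print(f"{temp} C: specialist {specialist:.1f}, generalist {generalist:.1f}")
    # 20 C: specialist 10.0, generalist 6.0
    # 24 C: specialist 0.3, generalist 4.4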

But now Global Warming is quickly changing conditions.  More and more often these fine tuned strains find themselves being grown in conditions they are poorly adapted to handle.  So productivity drops.  Given time, evolution will fix this by reintroducing characteristics that allow the plant to thrive in the new conditions and eliminating ones the plant no longer needs.  But this takes a long time when measured in human lifetimes.  If we wait for nature a lot of us will starve to death.  But we don't have to.

We can use GMO techniques to quickly change the characteristics of the plants.  People can force the evolution of plants to move fast enough to keep up with Global Warming.  Nature cannot.  Left alone nature will eventually get there.  But a lot of critters will starve to death while this is happening.  And if we don't intervene a lot of those starving critters will be people.

Another thing that people can do is to vastly expand irrigation.  And by this I mean moving water from where it is (or will be as Climate Change changes rainfall patterns) to where we need it to be.  A big reason those really hot parts of Africa have such low agricultural productivity is they currently have little or no water.  If you import lots of water and GMO plants so that they like it hot then productivity in those places could eventually match or exceed current "world average" results.

But wait, there's more.  More in the sense that more needs to be done.  Consider our transportation system.  It currently consumes about a third of the fossil fuels we produce.  On paper we know how to cut this by something like 80%, a real gain in the effort to reduce one of the principal current drivers of Global Warming.  We go to electric trains, cars, and trucks.  That's fine as far as it goes.  But we currently don't have enough electricity to do this.

Again there is a solution.  We build a lot of new solar and wind generating capacity.  This could actually be done.  Scientists have done the math and there is enough potential solar and wind power to do the job.  It would mean putting up a lot more wind farms.  And it would mean roughly paving over entire states with solar panels but it could be done.  Getting that done requires a lot of money and a willingness to do it but the technical capability already exists.
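
Here is a crude version of that math, just to sanity check the "entire states" claim.  Every number below is an assumption chosen for roundness (real siting studies are far more careful), but the answer lands in the right ballpark.

    # Roughly how much land to supply current US electricity demand with solar?
    us_demand_twh = 4_000      # rough annual US electricity use, in TWh
    insolation = 4.5           # assumed average solar energy, kWh/m^2/day
    panel_efficiency = 0.20    # assumed sunlight-to-electricity conversion
    packing_fraction = 0.4     # assumed share of a solar farm covered by panels

    kwh_per_m2_year = insolation * 365 * panel_efficiency * packing_fraction
    area_m2 = us_demand_twh * 1e9 / kwh_per_m2_year    # 1 TWh = 1e9 kWh
    print(f"{area_m2 / 2.59e6:,.0f} square miles")     # ~11,800, Maryland-ish

Call it a Maryland-sized solar farm, and considerably more than that once electrified transportation is added on top of today's demand.  Large, but not science fiction.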

But even if we did that, we don't have the capacity in our electrical grid to handle the loads.  So we would need to dump a ton of money into upgrades to our electrical grid.  And then there is the storage problem.  Wind and solar are intermittent sources.  Sometimes they make a lot of electricity, midday in the case of solar.  Sometimes they make little or none, midnight in the case of solar.  Wind has the same kinds of problems.  Sometimes it blows hard.  Sometimes it blows not at all.

Some of this can be fixed by a much more robust electrical grid.  It can be used to shuttle electric power from here where there is currently a surplus to there where there is currently a shortage.  That's a help but it is not enough.  There will be no solar generation anywhere in the US for many hours each night.  The wind blows harder in some seasons and softer in others.  The obvious solution is to store up the surpluses in times of excess then feed them back into the grid during times of shortage.

The more storage capacity you have, the longer the periods of too much or too little power that can be handled.  And statistics are on our side.  A small increase in storage capacity results in a considerable increase in the variations that can be handled.  But we really don't have a good technology for storing electric power.

Mostly the talk today is of batteries.  But batteries are extremely expensive and don't store all that much power.  It turns out that an old, low tech, solution called pumped storage can store a lot more power.  You can pump water up from a lower reservoir to a higher reservoir using surplus power.  Then later you can drain it through the appropriate machinery to turn it back into power when you need it.  These systems are more efficient than you would think because engineers have been working with them for a long time and know how to get the most out of them.
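
The physics of pumped storage is just gravity, which makes it easy to sanity check.  The reservoir size, height difference, and efficiency below are made-up but plausible numbers.

    # Energy recoverable from a pumped-storage plant: E = m * g * h.
    water_density = 1_000        # kg per cubic meter
    g = 9.81                     # gravitational acceleration, m/s^2
    reservoir_m3 = 10_000_000    # assumed upper reservoir, 10 million m^3
    head_m = 300                 # assumed height between the two reservoirs
    round_trip_efficiency = 0.8  # typical-ish for pumped storage

    joules = water_density * reservoir_m3 * g * head_m * round_trip_efficiency
    print(f"about {joules / 3.6e12:.1f} GWh recoverable")   # ~6.5 GWh

That is on the order of six hours of output from a one gigawatt power plant, from a single modest pair of reservoirs.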

For whatever reason, there are relatively few pumped storage facilities anywhere in the world.  All it takes is money to build more but at the moment no one is interested.  Another possibility is to link power grids around the world.  The sun is shining somewhere on earth at all times.  But this too is an idea no one is talking about.

It is unclear to me how this problem will be solved.  But it should be clear that there are solutions.  If someone comes up with a solution I haven't listed, that's fine.  But even if they don't, the point is that there are solutions that could be made to work.

And that's my point.  Global Warming is real.  It is already with us.  It is going to get worse.  Right now we are making it get worse faster.  We are a long way from the point where we stop making it worse faster, let alone actually making things better.  But even if we stopped making things worse today the built in inertia of the affected systems means things will continue to get worse for some time.  The sooner we accelerate the processes that move us in the direction of making things better, the better.

But no matter what happens humanity will survive.  It is just a matter of how drastic the adaptations we will eventually have to put in place are.  And it is important to realize that the world gross domestic product (the value of all the goods and services produced world wide) is more than seventy-five trillion dollars.  Annual GDP routinely goes up or down by more than a percent.  And since such swings are normal they cause little or no disruption.  So if we grab 1% of World GDP and spend it on Global Warming reduction or mitigation (unfortunately more likely the latter) it will allow us to spend roughly three-quarters of a trillion dollars per year.  And the world economy will sail on pretty much without noticing.

You can do a lot with three-quarters of a trillion dollars per year.  And spin offs like new technology or efficiency improvements might even return as much as was spent, or more, back to the economy as a whole.  The space race in its early years cost the US government about three billion dollars per year.  But the technology spinoffs probably added more than three billion dollars per year to US GDP.  So overall the US was better off.  That didn't mean that the costs and benefits were spread evenly.  They weren't.  And that always makes this kind of argument a hard sell.

But, as the old commercial has it, "you can pay me now, or pay me later".  Right now most people in the US are living with the false hope that when the bill eventually comes due they either won't be around or some miracle will have happened that will make the bill go away.  I will probably be dead when the worst of the Global Warming effects kick in.  But I still would rather not leave that kind of legacy to future generations.  It will not kill them but that's small comfort.