Friday, July 2, 2021

A Brief History of the Motion Picture

 This is something I like to do.  I am going to take you on a trip through the history of something.  But all I am going to do is talk about the evolution of the technology that underpins it.  Its positive or negative contributions to society; who does it well and who does it badly; what are good and bad examples of its use; all those questions I leave to someone else.

My subject, of course, is the moving picture.  And even if we include the entire history of still photography, the history we will be talking about only goes back about 200 years.  The technology that has enabled pictures to move has an even shorter history.  For most of this history, moving pictures have depended on the bleeding edge of the technology available at the time.  In order to establish some context I am going to start with a necessary precursor technology: photography.

The earliest paintings are tens of thousands of years old.  However, the ability to use technology instead of artistry to freeze and preserve an image only dates back to the early 1800s.  The key idea that started the ball rolling was one from chemistry.  Someone noticed that sunlight alone could change one chemical into another.  It soon became apparent that chemical compounds that contained silver were the best for pulling off this trick.  From that key insight, chemistry based photography emerged.

In the early days it quickly went through several iterations.  But by the middle 1800s one method had come to dominate still photography.  A thin layer of transparent goo was applied evenly to a piece of glass.  This was done in a "dark room".  The prepared glass plate was then inserted into a "magazine" that protected it from stray light.  The "film magazine" could then be stored, transported, and inserted into a "camera".

The meaning of the word "Computer" changed over time.  Originally, it meant a person who performed repetitive arithmetical and mathematical calculations.  In the mid-1900s its meaning changed to instead mean a machine that performed repetitive arithmetical and mathematical calculations.  The word "camera" underwent a similar transformation.

It started out referring to a simple device for focusing an image onto a surface.  By the mid-1800s it began being used exclusively to refer to a device used in photography.  A photographic camera consisted of an enclosed volume that was protected from stray light.  Its back was designed to accommodate the film magazine and its film.

At the front, and opposite the magazine area, was where a lens and a "shutter" were located.  The shutter normally remained closed but could be opened for short periods of time.  This would allow light to pass through the lens and land on the film at the back.

Cameras, film magazines, and the rest were in common use by the start of the Civil War in 1861.  The camera assembly was used to "expose" the film to an appropriate scene.  The film magazine was then used to transport the film back to the darkroom.  There it was "processed" so as to produce the final result.

Exposed film doesn't look obviously different from unexposed film.  Several processing steps are required to produce a picture of the original scene.  In the darkroom the goo side of the film is first exposed to a chemical bath that "develops" the film.  This causes the parts of it that had been hit with light in the camera to turn dark while the other areas remain transparent.  The developed goo is next exposed to a chemical bath containing a "fixer".  This step "fixes" the film so that subsequent exposure to light will not change it.

The result of these processing steps is film with an image of the original scene showing.  But it is a "negative" image.  The dark parts in the original scene are light and the light parts dark.  The image is also a "black and white" image.  It only contains shades of grey, no color.  And while this negative image is apparent and useful in some circumstances, it doesn't look like the original scene.

Fortunately, the fix is simple: put the film through additional processing steps.  Take a photograph of the negative, develop it, and fix it.  The result is a negative of a negative, or a "positive".  Black and white images can be very beautiful and emotionally evocative.  But it took more than fifty years for photographers to be able to pull off color photography.
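
To make the negative-of-a-negative idea concrete, here is a tiny sketch in Python.  The numbers are just illustrative 8-bit grey levels, not anything taken from an actual photographic process.

    # A negative is just inverted brightness.  Inverting twice restores
    # the original values -- a negative of a negative is a positive.
    def negative(image):
        return [255 - pixel for pixel in image]

    original = [0, 64, 128, 255]          # black, two greys, white
    print(negative(original))             # [255, 191, 127, 0]
    print(negative(negative(original)))   # [0, 64, 128, 255]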

But what we have at this point is "still" photography.  Nothing is moving.  But the first "movie" soon appeared.  The initial method was developed in order to settle a bet.  When a horse is galloping is there any point when all four feet are off the ground?  A group of rich people decided that they were willing to pay good money to find out.

The man they hired tried out a lot of different things.  He quickly concluded that a galloping horse does spend some of its time with all four feet off the ground.  But how could he convincingly prove that?  The obvious answer was photography.  But he found that, while still pictures settled the question, they did not do it in a convincing manner.  More was needed.

So he set up a rig where a galloping horse would trip a bunch of strings.  Each string would be attached to its own camera.  As the horse galloped along it hit each string in sequence causing a photograph to be taken at that point.  One of those photographs showed the horse with all its feet off the ground.  But, as previously noted, simply viewing that photograph was not sufficiently convincing.

He then came up with a way of displaying his still pictures that was convincing.  He set up a device that would flash each still photograph in sequence.  And each photograph would only be illuminated for a fraction of a second.  He set his device up to continuously cycle through the entire set of photographs over and over.

If he operated his device at the right speed the horse appeared to be moving.  More than that, it appeared to be moving at the speed of a normal galloping horse.  By cycling through his roughly dozen photographs over and over he could get the horse to gallop as long as he wanted.  Then he could slow things down and "freeze frame" on the one picture that showed the horse with all four feet off the ground.  That made for a convincing demonstration.

This is considered to be the world's first moving picture.  But, from a practical point of view, it's a gimmick.  Still, something very important was learned.  If you flash a series of pictures on a screen at the right rate, then the eye, working in concert with the brain, will stitch everything together.  The brain can't tell the difference between a continually moving scene and a series of similar still pictures flashed one after another.

From here it was just a matter of putting all the right pieces together.  The first piece was "celluloid" film.  Cellulose is a natural component of plants.  If you start with the right kind of cellulose and process it with the right chemicals you get a thin sheet of transparent material.  It was possible to manufacture very long ribbons of celluloid film.

The same goo that had been applied to glass plates can be applied to celluloid.  The result is a long ribbon of celluloid film onto which images can be placed.  It is necessary to "advance" the film between exposures so that each separate photograph of the scene ends up on a separate adjacent part of the long ribbon of film.

And celluloid is somewhat flexible.  It could be wound up on a "reel", a spool of film.  It could also be fed through gears and such so that it could be "run" through a "movie camera" or a "film projector".  And it was much cheaper than glass.  It soon became the preferred material to make photographic film out of.  One problem solved.

The next problem was to come up with a mechanism that would quickly and precisely advance the film.  Edison, among others, solved this problem.  The key idea was one that had been around for a while.

If you fasten a rod to the edge of a wheel it will move up and down as the wheel rotates.  More complexity must be added because you want the film to advance in only one direction.  And you want it to advance quickly then freeze, over and over again.  But those were details that Edison and others figured out how to master.

So, by the late 1800s Edison and others were using moving picture cameras loaded with thin ribbons of celluloid film to take the necessary series of consecutive still pictures.  A matching projector would then do the same thing the horse device did: throw enlarged images of each picture on the film onto a "screen" (any suitable flat surface), one after the other.  The projector needed to be capable of projecting consecutive pictures onto the screen at a lifelike rate.  In the silent era that rate turned out to be roughly 16 frames per second.

And with that the "silent movie" business came into existence.  ("Moving picture" got quickly shortened to "movie".)  At first, a movie of anything was novelty enough to draw crowds to "movie houses", later "movie theaters", and still later just "theaters".  But people's tastes evolved rapidly.

Movies capable of telling stories soon appeared and quickly displaced the older films as the novelty of seeing something, anything, moving on a screen wore off.  "Title cards" were scattered throughout the film.  They provided fragments of dialog or short explanations.  Accompanying music, anything from someone playing a piano to a full orchestra, was also soon added.

The result was quite satisfactory but fell far short of realism.  The easiest thing to fix was the lack of sound.  Edison, of course, is most famous for inventing the light bulb.  It consists of a hot "filament" of material in an enclosed glass shell.  All the air must be evacuated from the shell for the lightbulb to work.  That's because the filament must be heated to a high enough temperature to make it glow.  If there is any air near the hot filament it quickly melts or catches fire.

Edison's key enabler was a high efficiency vacuum pump.  With a better vacuum pump the filament could be heated to the temperature necessary to make it glow without melting or burning up.  His original filament material was a thin thread of burnt carbon.  Others quickly abandoned it in favor of tungsten, but no one would have succeeded without the high quality vacuum Edison's pump made possible.

Edison was an inveterate tinkerer.  Once he got the lightbulb working he continued tinkering with it.  Electricity was used to heat the filament.  It turns out that electrons were boiling off of the filament.  Edison added a "plate" off to the side of the filament and was able to use it to gather some of these electrons.  Moving electrons are what makes electricity electricity.  And this invention, a light bulb with a plate off to the side was the foundation of the electronics industry.

Others took Edison's experiment a step further.  They added more stuff into the light bulb.  If a metal mesh "grid" was inserted between the filament and the plate, then if the grid was sufficiently charged with an electrical voltage it could completely cut off the electron flow.  If it had no charge then the electrons would pass through it freely.  If it was charged with a suitable lower voltage, then the flow of electrons would be reduced but not completely cut off.

Edison's "light bulb + plate" device was called a diode because it had two ("di" = 2) components.  This new device was called a triode because it had three ("tri" = 3) components.  Charging the grid appropriately could stop and start an electric flow.  Intermediate amounts of charge could allow more or less flow to happen.  Not much electric power needed to be applied to the grid to do this.  This is a long way of indicating that a triode could be used to "amplify" (make louder) an electric signal.

More and more complex devices were built with diodes, triodes, and newer "tubes", light bulbs with more and more components stuffed into them.  Soon, "electronics" could be made to do truly amazing things.  For instance, the signal from a "microphone", invented by Bell, the telephone guy, could be sent through electronics to loudspeakers (invented by lots of people) to create a "public address" system.  Now an almost unlimited number of people could simultaneously hear a speech or a theatrical performance.

Another device Edison invented was the "phonograph".  His original version was purely mechanical.  The energy in the sound of a person speaking caused a wavy line to be etched in wax.  Later, a needle traveling along that same wavy wax line could be connected to a horn.  This arrangement would allow the original sounds to be reproduced at another time and place.

This was amazing but ultimately unsatisfactory for a number of practical reasons.   The first thing to be replaced was the wax.  Vinyl was sturdier.  Edison used a cylinder.  That got replaced by a platter.  Finally, the mechanical components got replaced by electronics.

Now a clearer and more complex sound like a full orchestra or a Broadway show could be played and replayed at a later time and in a later place.  Also, the "record" could be duplicated.  Different people could now listen to the same record at different times.   But people could also listen to different copies of the same recording.  A mass audience could now be reached.  By the late 1920s all this was in place so that it could be used to add sound to movies.

And, at first, that was what was done.  A phonograph record containing the sound part to the movie was distributed along with the film.  If the film and the record were carefully synchronized, and if a public address system was added to the mix, then the sound movie became possible.  The first successful example of pulling all this off was The Jazz Singer.

It was terrifically hard to pull off everything that was necessary to create the record: making the recordings, combining them appropriately, and producing the finished disc.  It also turned out to be hard to keep the film and the record in sync while the movie was playing.

As a result, The Jazz Singer is more accurately described as a silent movie with occasional sound interludes than it is as a true sound movie.  Much of the movie was just a traditional silent movie.  But every once in a while, the star would burst into song.  For those parts the audience heard not local musicians but Al Jolson, the star of the movie.  So, while it wasn't a very good movie, it was a terrific proof of concept.

This process used by The Jazz Singer and other early "talkies" was called "Vitaphone".  The "phone" part harkened directly back to the phonograph part of the process.  But something better was needed.  And it was needed quickly.  The success of The Jazz Singer had caused audiences to immediately start clamoring for more of the same.

Fortunately, the electronics industry soon came riding to the rescue.  Another electronic component that had been invented by this time was the "photocell".  A photocell would measure light intensity and produce a proportional electric signal.  Aiming a photocell at part of the film could turn how light or dark that part of the film was into a signal that could be amplified and fed to speakers.

That solved the "theater" end of the process.  What about the other end?  Here the key component had already been invented.  A microphone could turn sound into a proportional electrical signal.  It was easy to turn this electrical signal into an equivalent pattern of light and dark on a part of the film.  Of course, electronic amplifiers (already invented) had to be added into the process at the appropriate points.

In the transition from silent to sound, two changes were made to how film was put to use.  First, the film itself was sped up.  Instead of the silent era's roughly 16 frames per second, 24 frames per second became the standard for sound films.  Second, a small portion of the film got reserved for the "sound track".

By having the projector shine a bright light through a narrow slot in front of the sound track part of the film, and by then amplifying the result and feeding it to speakers in the movie theater, a talkie would get its "sound track" from the film itself.  A separate record was no longer necessary.

There was one little problem left.  The film must go through part of the projector in a herky-jerky fashion.  We move a picture into position, stop the film, open the shutter, leave it open for a while, close it, then quickly move on to doing the same thing for the next picture in line.  The sound track, however, requires that the film move past the pickup slot at a constant speed.  The solution turned out to be simple.

An extra "loop" of film is put in the gap between the part of the projector that unspools film off of the feed reel and the shutter/lens area.  Another extra "loop" of film is put between the shutter/lens area and the part of the projector that feeds the film to the take-up reel.  The sound pickup slot is located just after this second feed point.  At that point the film is moving at a smooth, even speed.

This "extra loops" design has the advantage that the piece of film that has to move fast then stop is short.  This makes it easier for that mechanism to operate at the necessary speed.  All that is necessary is to place the sound that goes with an image a few inches ahead of it on the film.

On the other end of the process, the sound is handled completely separately from the pictures.  The camera that shoots the picture does not record the sound.  That's why Hollywood has used something called a "slate" for years.  It has a flat area on it where the name of the film, the "scene" number and the "take" number are marked.  Waving the slate in front of the camera before the actual scene is filmed makes it easy for the "editor" to know where a piece of film is supposed to go in the finished picture.

But with the advent of sound an extra piece called the "clapper" was added.  The last thing the person waving the slate does before he pulls it out of frame is to "clap" the clapper.  The moving clapper piece is easy to see in the film.  The intentionally loud "clap" noise made by the clapper is easy to hear in the sound recording.  This makes it easy to "sync" sounds to the pictures they go with.

During the phonograph era of sound movies all too often there was a delay between when a person's lips moved and when the audience heard the words they were saying.  This was caused by the record getting out of sync with the film.  Moving the sound from the record to a sound track on the film combined with the clapper system eliminated this problem.  It's too bad this problem didn't stay fixed.  I will be revisiting the "sync problem" below.

By about 1930 almost all of the movies coming out of Hollywood included a sound track.  And it turns out that some "color" movies came out in the period before Hollywood made the transition to sound.  There were only a few of them because the technique used was fantastically difficult and expensive to pull off.

Film itself doesn't care what color the images it carries are.  You shine a bright light through the film and whatever isn't blocked out ends up on the screen.  If the light passes through film that has some color in it, then that color will appear on the screen.  If there is no color in the film then what appears on the screen will all be in shades of black and white.

To make these early color movies artists hand painted the color onto the film print.  That meant that every frame of the film had to be colored by hand.  And each print had to separately go through this difficult and time consuming process.  It was done, but not often.  More practical alternatives were eventually developed.

The first relatively practical color process was called "three-strip Technicolor".  In the camera a device split the picture into three identical copies.  Each copy went down a different path.  One path ended on film that had goo on it that was only sensitive to red.  Another path ended on film featuring green goo.  Still another path ended on film featuring blue goo.

The reverse was done on the projection end.  The process was complicated and hard to pull off.  It was eventually replaced by a process that needed only a single piece of film.  The film had multiple layers of goo on it.  There was a red layer, a green layer, and a blue layer.

The process of shooting the film, processing the film, and making prints of the film was difficult and expensive.  But nothing special was needed on the theater end.  They just ran the fancy film through their same old projector and a color picture appeared on the screen.

While all this was going on a separate effort was being made to replace all this film business with an all electronic system.  The decade of the '30s was consumed with making this all-electronic process work.  By the end of the decade limited success had been achieved.

Theoretically, the technology was already in place.  The photocell could act as a camera.  And a light bulb being fed a variable amount of voltage could stand in for the projector.  But neither were really practical.  You see, you'd need about 300,000 of each, one for each pixel.

The word "pixel" is now in common usage.  "Pixel" is shorthand for picture element.  If you divide a picture into rows and columns then, if you have enough of them, you can create a nice sharp picture by treating each separate point independently.  The first PC I owned had a monitor that had 480 rows, each consisting of 640 dots.  That means that the screen consisted of 307,200 pixels.

So with only 307,200 photocells and only 307,200 light bulbs a picture with a resolution similar to that of an early TV set could be duplicated.  And, of course, this would have to be done something like 24 to 30 times per second.  But that's not practical.  Something capable of standing in for those 307,200 photocells and those 307,200 lightbulbs would have to be found.  It turned out that the lightbulb problem was the easier of the two to solve.
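
To put some rough numbers on it, here is the arithmetic as a short Python sketch.  The 30-per-second refresh figure is just the U.S. TV rate, used here for illustration.

    # Back-of-the-envelope arithmetic for an early monochrome screen.
    columns = 640              # dots per line
    rows = 480                 # lines per screen
    refreshes_per_second = 30  # illustrative; U.S. TV refreshed about 30 times a second

    pixels_per_screen = columns * rows
    pixel_updates_per_second = pixels_per_screen * refreshes_per_second

    print(pixels_per_screen)          # 307200
    print(pixel_updates_per_second)   # 9216000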

Start with a large "vacuum tube" (generic term for a lightbulb with lots of special electronic stuff jammed inside of it) with a flat front.  Coat the inside of the flat front with a phosphor, something that fluoresces when struck by a beam of electrons.  Add the components necessary for producing and steering an "electron beam" into the other end of the same vacuum tube.

Creating an electron beam turns out to be pretty easy.  Remember that the filament in a light bulb boils off electrons.  A custom filament can boil off a lot of electrons.  Electrons are electrically charged so they can be steered with magnets.

Connect the electron beam generating and beam steering components inside the vacuum tube to suitable electronics outside the vacuum tube but inside the TV set.  When fed suitable signals, they will steer the electron beam so that it can be made to repeatedly sweep across the screen in a series of lines.  The lines are made to sweep down the screen.  The intensity of the electron beam will also need to be precisely controlled.  And the whole process will have to be repeated many times per second.

The intensity of the electron beam is changed continuously in just the right way to "paint" an image on the flat part of the vacuum tube thirty times per second (in the U.S.).  This specialized vacuum tube came to be called a TV Picture Tube.  Add in some more electronic components, components to select only one TV "channel", pull the "video" and "audio" sub-signals out of the composite signal, etc., and you have a TV set circa 1955.
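
As a rough sketch of the scanning idea, here is a toy model in Python.  The line count is the U.S. figure; the column count and the set_beam_intensity() routine are invented stand-ins, since a real analog set had no fixed column count and the beam was driven by circuitry, not code.

    # Toy model of raster scanning: the beam paints each line left to right,
    # the lines march down the screen, and the whole thing repeats many
    # times per second.
    LINES = 525
    COLUMNS = 640   # illustrative only; analog TV had no fixed column count

    def set_beam_intensity(brightness):
        pass  # stand-in for the real beam-control electronics

    def paint_one_refresh(image):
        for line in range(LINES):
            for brightness in image[line]:
                set_beam_intensity(brightness)  # beam sweeps across this line

    image = [[128] * COLUMNS for _ in range(LINES)]  # a flat grey test picture
    paint_one_refresh(image)   # in a real set this repeats about 30 times a second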

The other end is a variation on the same theme.  Again a vacuum tube with a flat front is used.  This time a special coating is used that is light sensitive.  As the electron beam sweeps across it, the coating is "read" to determine how much light has struck it recently.  More light results in more electrons residing at a specific spot.  These electrons are carefully bled off.  More light on a particular spot causes more electrons to bleed off when that spot is swept.

Making all this work was very hard.  But it was all working in time to be demonstrated at the 1939 New York World's fair.  The advent of World War II put a halt to rolling it all out for consumer use.  Efforts resumed immediately after the end of the War in 1945.

Initially, none of this worked very well.  But as time went by every component was improved.  The first TV standard to be set was the British one.  They based it on what was feasible in the late 1930s.  So British TV pictures consisted of only 405 lines.  Pretty grainy.  The U.S. came next.  The U.S. standard was set in 1941.  U.S. TV pictures consisted of 525 lines.  The French came later.  They were able to set an 819 line standard (most of the rest of Europe later settled on 625 lines).  So French TV pictures were much sharper than U.S. pictures.  And U.S. pictures were significantly sharper than British pictures.

But what about color?  The first attempt was based on the "three strip" idea that was originally used to make color movies.  It was developed by CBS.  They essentially threw the old black & white standard in the trash.  That allowed them to use the same idea of splitting the picture into three copies.  The red signal was extracted from the first copy, the green from the second, and the blue from the third.  On the other end the TV set would process each signal separately before finally combining them back together.

This system would have worked just fine if it had been adopted.  But it would have meant eventually replacing everything at both ends of the process.  And TV stations would have had to broadcast separate black and white and color signals on separate frequencies until the old "black and white" TV sets were a rarity.  Who knows?  Maybe we would have been better off if we had taken that route.  But we didn't.

But NBC was owned by RCA and RCA was the dominant player in the making and selling of TV sets, cameras, and the rest of the equipment needed to do TV.  If it could be done, they wanted to come up with a "compatible" way to do color.  They came up with a way to do it.

First, they found a way to sandwich additional information into the signal TV stations were broadcasting.  Critically, black and white TV sets would be blind to this additional information.  So, when a TV station started sending out this new signal, it looked just like the old signal to black and white TV sets.  They would keep working just as they always had.

But new Color TVs would be able to see and use this additional information.  The additional information consisted of two new sub-channels.  A complicated subtraction scheme was used that took the black and white signal as a starting point.  Color TVs were capable of performing the gymnastics necessary to put a color picture on the screen.

This probably made color TV sets more complicated than they would otherwise have needed to be had the CBS standard been used.  But by the mid '60s color TVs at a low enough price point for many consumers to manage became available.  And the "compatible" scheme allowed lots of people to stick with their old Black and White TVs well into the '70s.

At this time (mid '60s) RCA made NBC broadcast all of their prime time shows "in living color".  The other networks were forced to follow in short order.  The early sets delivered washed out color.  But it was COLOR so people put up with it.  By the mid '70s sets that delivered decent color were ubiquitous and cheap.  Unfortunately for RCA and the rest of the U.S. consumer electronics industry, many of these sets came from other countries.  Japan was in the forefront of this invasion.

Japan started out making "me too" products that duplicated the capabilities of products from U.S. manufacturers like RCA.  But they soon started moving ahead by innovating.  Japan, for instance, pioneered the consumer VCR market.  Betamax and VHS were incompatible VCR standards.  Both came out of Japan.  Betamax was generally regarded as superior but it was also more expensive.  VHS came to dominate the consumer market while Betamax came to dominate the professional market.

By this time the computer revolution was well underway and there was a push to go digital.  But the first successful digital product came out of left field.  Pinball machines had been popular tavern entertainment dating back at least to the '30s.  For a long time they were essentially electro-mechanical devices.  They were devoid of electronics.

But computers had made the transition from vacuum tube based technology to "solid state" (originally transistors, later integrated circuits) starting in about 1960.  By 1970 solid state electronics were cheap and widely available.  A company called Atari decided to do electronic pinball machines.

When making a big change it is smart to start with something simple, then work your way up from there.  So an engineer named Allan Alcorn was tasked to come up with a simple pinball-like device, but built using electronics.  He came up with Pong.  It consisted of a $75 black and white TV connected to a couple of thousand dollars worth of electronics.  Importantly, it had a coin slot, just like a pinball machine.

The Atari brass immediately recognized a hit.  They quickly rolled it out and revolutionized what we now call arcade games.  Arcade games started out in taverns.  You would put one or two quarters in and play.  The tavern arcade game business was small beer compared to what came after.  But grabbing a big chunk of that market was enough to make Atari into an overnight success.

And the technology quickly improved.  Higher resolution games were soon rolled out.  More complex games were soon rolled out.  Color and more elaborate sounds were soon added.  Soon the initial versions of games like Donkey Kong, Mario Brothers, Pac Man, and the like became available and quickly became hits.

The "quarters in slots in taverns" model soon expanded to include "quarters in slots in arcades", as arcades were open to minors.  But the big switch was still ahead.  The price of producing these game machines kept falling.  Eventually home game consoles costing less than $100 became available.  You hooked them up to your TV, bought some "game cartridges" and you were off to the races.  The per-machine profit was tiny compared to the per-machine profit of an arcade console.  But the massive volume more than made up the difference.

All this produced a great deal of interest in hooking electronics, especially digital electronics, up to analog TV sets.  This produced the "video card", a piece of specialized electronics that could bridge the differences between analog TV signals on the one side and digital computer/game electronics on the other.

In parallel with this was an interest in CGI, Computer Generated Images.  This interest was initially confined to Computer Science labs.  The amount of raw computer power necessary to do even a single quality CGI image was astounding.  And out of this interest by Computer Scientists came the founding in 1981 of a company called Silicon Graphics.  One of its founders was Jim Clark, a Stanford University Computer Science prof.

SGI started out narrowly focused on using custom hardware to do CGI.  But it ended up being successful enough to put out an entire line of computers.  They could be applied to any "computer" problem, but they tended to be particularly good at problems involving the rendering of images.  I mention SGI only to indicate how much interest computer types had in this sort of thing.

Meanwhile, things were happening that did not have any apparent connection to computers.  In 1982 Sony rolled out the Audio CD, also known as the Digital Audio Compact Disc, or the CD.  This was a digital format for music.  And it was intended for the consumer market.  Initially, it did not seem to have any applicability to computers or computing.  That would subsequently change.

The CD was not the first attempt to go digital in a consumer product.  It was preceded by the Laserdisc, which came out in 1978.  Both consisted of record-like platters.  Both used lasers to process dots scribed in a shiny surface and protected by a clear plastic coating.  The Laserdisc used a 12" platter, roughly the size of an "LP" record.  The CD used a 4 3/4" platter, similar to but somewhat smaller than a "45" record as a "45" is 7" in diameter.

In each case the laser read the dots, which were interpreted as bits of information.  The bits were turned into a stereo audio signal (CD) or a TV signal complete with sound (Laserdisc).  The CD was a smash success right from the start.  The Laserdisc, not so much.

I have speculated elsewhere as to why the Laserdisc never really caught on, but I am going to skip over that.  I'll just say that I owned a Laserdisc player and was very happy with it.  Both of these devices processed data in digital form, but eventually converted it into an analog signal.  When first released, no one envisioned retaining the digital characteristic of the information or connecting either to a computer.  The CD format eventually saw extensive use in the computer regime.  The Laserdisc never did.

So, what's important for our story is that digital was "in the air".  Hollywood was also interested.  Special effects were very expensive to pull off.  The classic Star Trek TV show made extensive use of the film based special effects techniques available when it was shot in the late '60s.  But the cost of the effects was so high that NBC cancelled the show.  It was a moderate ratings success.  But the ratings were not high enough to justify the size of the special effects budget.

When George Lucas released Star Wars in 1977 little had changed.  He had to make do with film based special effects.  There are glaring shortcomings caused by the limitations of these techniques that are visible at several points in the film.  But you tend to not notice them because the film is exciting and they tend to fly by quickly.

But if you watch the original version carefully, and you are on the lookout, they stick out like sore thumbs.  He went back and fixed all of them in later reissues.  So, if you can't find one of the original consumer releases of the film, you will have no idea what I am talking about.

He made enough money on Star Wars to start doing something about it.  He founded ILM - Industrial Light and Magic, with the intent of making major improvements in the cost, difficulty, and quality of special effects.  ILM made major advances on many fronts.  One of them was CGI.

Five years later, in 1982, a CGI heavy movie called Tron came out.  It was the state of the art in CGI when it was released.  Out of necessity, the movie adopted a "one step up from wire frame" look in most of its many CGI rendered scenes.  The movie explained away this look by making its very primitivity a part of the plot.

Tron represented a big improvement over what had been possible even a few years before.  Still, in spite of the very unrealistic rendering style, those effects took a $20 million supercomputer the better part of a year to "render".  At the time, realistic looking CGI effects were not practical for scenes that lasted longer than a few seconds.

CGI algorithms would need to improve.  The amount of computing power available would also have to increase by a lot.  But technology marches on and both things eventually happened.  One thing that made this possible was "pipeline processing".  The Tron special effects were done by a single computer.  Sure, it was a supercomputer that cost $20 million.  But it was still only one computer.

Computer Scientists, and eventually everybody involved, figured out how to "pipe" the output of one computer to become the input into another computer.  This allowed the complete CGI rendering of a frame to be broken down into multiple "passes".  Each pass did something different.  Multiple computers could be working on different passes for different frames at the same time.

If things could be broken down into enough steps, each one of which was fairly simple to do, then supercomputers could be abandoned in favor of regular computers.  All you had to do was hook a bunch of regular computers together, something people knew how to do.  The price of regular computers was plunging while their power was increasing.  You could buy a lot of regular computers for $20 million, for instance.  The effect was to speed the rate at which CGI improved tremendously.
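
Here is a toy Python sketch of that idea.  The pass names and the render_pass() stub are invented for illustration; they are not any studio's actual pipeline.

    # Each frame goes through a chain of passes (the output of one pass feeds
    # the next), and many frames are worked on in parallel across a pool of
    # ordinary computers (modeled here as worker processes).
    from multiprocessing import Pool

    PASSES = ["geometry", "lighting", "shading", "compositing"]  # invented names

    def render_pass(pass_name, data):
        return data  # stand-in for the real work of one pass

    def render_frame(frame_number):
        data = frame_number
        for pass_name in PASSES:
            data = render_pass(pass_name, data)
        return frame_number

    if __name__ == "__main__":
        with Pool(processes=8) as pool:   # stand-in for a room full of machines
            finished = pool.map(render_frame, range(1, 25))
        print(f"rendered {len(finished)} frames")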

A particularly good demonstration of how fast CGI improved was a TV show called Babylon 5.  It ran for five seasons that aired from 1993 to 1998.  The show used a lot of CGI.  And it had to be made on a TV budget, not a movie budget.  Nevertheless, the results were remarkable.

The season 1 CGI looks like arcade game quality.  That's about what you would expect from a TV sized CGI budget.  The images are just not very sharp.  But year by year the CGI got better and better.  By the time the last season was shot the CGI looked every bit as crisp and clear as the live action material.  The quality of CGI you could buy for a fixed amount of money had improved spectacularly in that short period.

So, that's what was happening on the movie/TV front.  But remember SGI and the whole Computer thing?   As noted above, the first home computer I owned used a "monitor" whose screen resolution was only a little better than a black and white TV.  Specifically, it had a black and white (actually a green and white, but still monochromatic) screen.  The resolution was 640x480x2.  That means 640 pixels per line, 480 lines, and 2 bits of intensity information.

PCs of a few years later had resolutions of 800x600x8.  That's 800 pixels per line, 600 lines, and 8 bits of color information.  A clever scheme was used to allow this "8 bit color" to support a considerable amount of color.  For reference, a typical modern PC has a resolution of 1920x1080x24.  That's 1920 pixels per line, 1080 lines, and 24 bits of color information.  Typically, 8 bits are used to set the red level to one of 256 values.  The same 8 bit scheme is also used for green and for blue.  That's comparable in picture quality to a good "HD" TV.  But back to our timeline.
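
A quick sketch of how those 24 bits break down.  The packing order shown is just one common convention, used here for illustration.

    # 24-bit color: 8 bits each for red, green, and blue.
    def pack_rgb(red, green, blue):
        # each channel holds one of 256 levels (0-255)
        return (red << 16) | (green << 8) | blue

    levels_per_channel = 2 ** 8             # 256
    total_colors = levels_per_channel ** 3  # 16,777,216 distinct colors
    print(total_colors)

    print(hex(pack_rgb(255, 128, 0)))       # 0xff8000, a shade of orange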

The video capabilities of PCs increased rapidly as the '80s advanced.  Their capabilities soon easily surpassed the picture quality of a standard TV.  And SGI and others were rapidly advancing the state of the art when it came to CGI.  The later installments of the Star Wars films started using more and more CGI.  Custom "Avid" systems also became available.  They were built from the ground up for digital editing of film and video.

Meanwhile, custom add in "graphics" cards started to appear in high end PCs.  By this time games had leapt from custom consoles to the mainstream PC market.  And gamers were willing to spend money to get an edge.  As one graphics card maker has it, "frames win games".  If your graphics card can churn out more sharp, clear frames per second, then you will gain an advantage in a "shoot 'em up" type game.

These graphics cards soon went the SGI route.  They used custom "graphics processor" chips that were optimized for doing CGI.  And, as is typical of solid state electronic devices, they started out expensive.  Top of the line graphics cards are still quite expensive.  But they deliver spectacular performance improvements.  On the other hand, a decent graphics card can now be had for $50.

And, in another call back to SGI, which is now out of business, some supercomputers are now being built using graphics processor chips alongside or instead of standard "general purpose" processor chips.  For the kinds of problems they suit they are damn fast, and they are significantly cheaper for the performance they deliver.

All these lines of development converged to produce the DVR.  TiVo brought out one of the first successful DVRs built for the consumer market in 1999.  It was capable of processing a TV signal as input.  It even had a "channel selector" like a regular TV.  It was also capable of outputting a standard TV signal.  What was in the middle?  A standard PC disk drive.  The TiVo translated everything to and from strings of bytes, which could be stored on disk.

The TiVo was a big improvement over a VCR.  A "guide" listing every showing of every episode of every show got updated daily.  This was possible because it had a standard PC style processor chip built into it.  All this made possible commands like "Record Jeopardy!".

It could also record one thing while you watched something else.  And you could watch shows you recorded in a different order than you had recorded them in.  And you could stop the show then restart it later without missing anything if the phone rang or someone came to the door.  And you could fast forward through the commercials.

Subsequent models permitted multiple shows to be recorded at once, even though they were being broadcast on separate channels.  Other features were added.  But the point is that, with the advent of the TiVo DVR, anything that could be done with analog TV equipment could now be done with hybrid analog/digital computer based equipment.

Leave that aside for the moment so that we can return to movies.  Recall that in 1977 an effects heavy movie like the original Star Wars was made without recourse to CGI.  But thanks to ILM and others, advances were starting to be made.  By 1982 a movie like Tron could be made.  What came later?  I am going to use the work of James Cameron as a roadmap.

Cameron was a brilliant artist who also understood technology thoroughly.  As a result, The Abyss, a movie released in 1989, only seven years after Tron, showcases a spectacular CGI feat.  It included a short scene featuring a large worm-shaped alien.  The alien appeared to be a tube made entirely of clear water.

You could see through it to a considerable extent.  And bright things that were near it could be seen partially reflected in its surface.  And did I mention that the alien moved in an entirely realistic manner?  The alien was completely believable at all times.  The sophistication necessary to achieve this was beyond anything ever seen before.

The requirement for both translucency and reflectivity required much more computation per frame.  That's why he had to keep the scene short.  If he hadn't, the time necessary to make all those computations would have been measured in years.  As it was, it took months and a blockbuster sized movie budget to pull it off.

Two years later he was able to up the ante considerably.  Terminator II (1991) made extensive use of what appeared to be a completely different CGI effect.  When damaged, which turned out to be a lot of the time, the bad guy had a highly reflective silver skin.  In his silver skin form he was expected to run, fight, and do other active things.  And he had to move like a normal human while doing them.

The necessary computer techniques, however, were actually quite similar to those used for his earlier water alien effect.  Fortunately, by the time Cameron made Terminator II, he was able to create a CGI character who could rack up a considerable amount of screen time.  And he could do it while staying within the normal budget for a blockbuster, and while hewing to a production schedule typical for a movie of that type.  

The CGI infrastructure had gotten that much better in the interim.  And it continued to get better.  He wanted to make a movie about the sinking of the Titanic.  Previous movies about the Titanic (or any other situation where a real ship couldn't be used) had always used a model ship in a pool.  Cameron decided to use a CGI version of the ship for all the "model ship in a pool" shots.  Nowhere in Titanic (1997) are there any shots of a model ship in a pool.

It turned out to be extremely hard to make the CGI version of the ship look realistic enough.  The production ran wildly over budget.  The production schedule slipped repeatedly.  It seemed for a while like the movie would never get finished.   But, in the end it didn't matter.  Titanic was eventually finished and released.  It was wildly popular, so popular that it pulled in unbelievable amounts of money at the box office.

That experience ended up giving Cameron essentially Carte Blanche.  He used that Carte Blanche to create Avatar in 2009.  Again, making the movie cost fantastic amounts of money, most of which went to creating the CGI effects.  It was released in 3D and Imax.  Realistic visuals that stood up under those conditions were seemingly impossible to pull off.  But he did it.  And the movie was even more successful than Titanic.   It too earned more than enough money to pay back all of its fantastically high production cost.

But Titanic and Avatar were in a class by themselves due to their cost.  What about a movie with a large but not unlimited budget?  What did CGI make it possible to do in that kind of movie?  Two movies that came out within a year of each other answered the question.

The movies were What Dreams May Come (1998) and The Matrix (1999).  Both had large but not Cameron-esque budgets.  Regardless, both made heavy use of CGI.  But the two movies used CGI in very different ways.  Creative and unorthodox in each case, but very different.  Both movies affected their audiences strongly, but also in very different ways.

I saw both of them when they first came out.  After seeing them the conclusion I drew was that, if someone could dream something up, and then find the money (enough to fund an expensive but not super-expensive movie), then CGI was now capable of putting that something into a movie, pretty much no matter what it was.

And CGI has continued to get better, especially when it comes to cost.  Now movies and TV shows that make extensive use of CGI are a dime a dozen.  In fact, it is now cheaper to shoot a movie or TV show digitally than it is to use film.  This is true even if it has little or no need for CGI.

It is shot using high resolution digital cameras.  Editing and other post processing steps are done using digital tools.  It is then distributed digitally and shown in theaters on digital projectors or at home on digital TV sets (or computers or tablets or phones).  By going digital end-to-end the project is cheaper than it would have been had it been done using film.

Does that mean that there is nowhere else for the digital revolution to go?  Almost.  I can think of one peculiar situation that has arisen as CGI and digital have continued to get cheaper and cheaper, and better and better.

It had to do with the making of the movie Interstellar in 2014.  You see, by that point Hollywood special effects houses had easy access to more computing power than did a well connected and well respected theoretical physicist, somebody like Kip Thorne.

Thorne was so well thought of in both scientific and political circles that he had almost singlehandedly talked Congress into funding the LIGO project, the project that discovered gravitational waves.  LIGO burned through over a billion dollars before it detected its first gravitational waves.  Congress went along with multiple funding requests spanning more than a decade based on their faith in Thorne.

Thorne's specialty was Black Holes.  But no one knew what a Black Hole really looked like.  The amount of computation necessary to realistically model one was a giant number.  The cost of that much computation was beyond the amount of grant money Thorne could get at one time.  And nobody else had any better luck getting approval to spend that much money, at least not to model a Black Hole.

But his work as a consultant on Interstellar granted him entrée to Hollywood special effects houses (and a blockbuster movie sized budget to spend with them).  The effects houses were able to run the necessary computations and to use CGI to turn the results into video.

Sure, the ostensible reason for running the calculations was for the movie.  And the videos that were created were used in the movie, so everything was on the up and up.  But the same calculations (and video clips) could and did serve the secondary purpose of providing answers to some heretofore unanswerable serious scientific questions.  The work was serious enough that Thorne had no trouble getting it published in a prestigious scientific journal.

So we have now seen how movie production and TV production went digital.  That only leaves broadcast television.  The change was kicked off by consumer interest in large format TV sets.  Practicalities limit the size of a picture tube to around 30".  Even that size is hard to produce and very heavy.  Keeping a vacuum of that size under control requires strong, thick walls.  That makes them heavy.  The solution was a change in technology.

Texas Instruments pioneered a technology that made "projection TV" possible.  It soon reached the consumer market.  Front projection units worked not unlike a movie projector.  They threw an image onto a screen.  Front projection TVs just substituted a large piece of electronics for the movie projector.

Rear projection units fit the projector and the screen into a single box by using a clever mirror arrangement.  Rear projection systems could feature a screen size of up to about 60".  Front projection systems could make use of a substantially larger screen.

The color LCD - Liquid Crystal Display screen came along at about the same time.  Color LCD TVs became available in the late '80s.  Initially, they were based on the LCD technology used in laptop computers, so the screens were small.  But, as time went by affordable screens grew and grew in size.

But the important thing for our story, however, is that both technologies made it hard to ignore the fact that a TV image wasn't very sharp and clear.  And the NTSC standard that controlled broadcast TV made it impossible to improve the situation.

It was time to move on to a new standard that improved upon NTSC.  The obvious direction to move in was toward the PC.  With no NTSC standard inhibiting them the image quality of PCs had been getting better and better right along.  And the PC business provided a technology base that could be built upon.  The first serious move was made by the Japanese.

In the late 1980s and early '90s they rolled out a high definition system (analog at first) that was designed as the successor to NTSC and the other TV standards in use around the world at that time.  This scared the shit out of American consumer electronics companies.

By this time their market share had shrunk and they were no longer seen as leading edge players.  They mounted a full court press in D.C.  As a result, the Japanese system was blocked for a time so that a U.S. alternative could be developed.  This new U.S. standard was supposed to give the U.S. consumer electronics companies a fresh chance to get back in the game.

U.S. electronics companies succeeded in developing such a standard.  It was the one that was eventually adopted the world over.  But they failed to improve their standing in the marketplace.  The Japanese (and other foreign players) had no trouble churning out TVs and other consumer electronics that conformed to the new standard.  The market share of U.S. consumer electronics companies never recovered.

That standard was, of course, SD/HD.  Actually, it wasn't a single standard.  It was a suite of standards.  SD - Standard Definition was a digital standard that produced roughly the same image quality as the old U.S. NTSC standard.  HD - High Definition produced a substantially improved image.  Instead of the roughly 640x480 picture of NTSC and SD, the HD standard called for 1920x1080.

And even this "two standards" view was an oversimplification.  HD was not a single standard.  It was a family of related sub-standards.  There was a low "720p" 1280x720 sub-standard, a medium "1080i" 1920x1080 (but not really - see below) sub-standard, and a high "1080p" 1920x1080 sub-standard.

The 1080i sub-standard used a trick that NTSC had pioneered.  (Not surprisingly, the TV people demanded that it be included.)  Even lines were sent during one refresh and odd lines were sent on the next refresh.  That means that only a 1920x540 per screen resolution was needed for each screen refresh.  NTSC had actually sent only about 263 lines per screen refresh.  It used the same even lines then odd lines trick to deliver 525 lines by combining successive screens.

The 1080p "progressive" sub-standard progressively delivered all of the lines with each screen refresh.  That's how computers had been doing things for a long time by this point.  And this "multiple sub-standards within the full standard" idea turned out to be important.  It allowed new sub-standards to be added later.  Since then a "4K" (3840x2160 - four times the pixels of 1080p, though it would have been more accurate to call it "2K") and an "8K" (7680x4320) sub-standard have been added.
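
A small Python illustration of the interlaced versus progressive difference.  The "lines" here are just labels standing in for real scan lines.

    # Progressive (1080p): every line of the frame is sent on every refresh.
    # Interlaced (1080i): even lines on one refresh, odd lines on the next.
    frame = [f"line {n}" for n in range(1080)]

    progressive_refresh = frame   # all 1080 lines, every time
    even_field = frame[0::2]      # 540 lines on this refresh
    odd_field = frame[1::2]       # the other 540 lines on the next refresh

    print(len(progressive_refresh), len(even_field), len(odd_field))  # 1080 540 540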

The original Japanese specification would have required the bandwidth dedicated to each TV channel to be doubled.  But the U.S. standard included digital compression. Compression allowed the new signal to fit into the same sized channel as the old NTSC standard had used.  

There is a lot of redundant information in a typical TV picture.  Blobs of the picture are all the same color.  Subsequent images are little changed from the previous one.  The compression algorithm takes advantage of this redundancy to throw most of the bits away without losing anything important.  The computing power necessary to decompress the signal and reproduce the original HD picture was cheap enough to be incorporated into a new TV without adding significantly to its price.
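
As a toy illustration of exploiting that redundancy, here is a frame-differencing sketch in Python.  Real broadcast compression (the MPEG family of codecs) is far more sophisticated than this; the sketch only shows the basic idea that unchanged picture content need not be re-sent.

    # Only transmit the pixels that changed since the previous frame.
    def frame_delta(previous, current):
        return [(i, c) for i, (p, c) in enumerate(zip(previous, current)) if p != c]

    previous_frame = [10] * 1000        # a flat patch of picture
    current_frame = list(previous_frame)
    current_frame[500] = 99             # one pixel changed between frames

    delta = frame_delta(previous_frame, current_frame)
    print(len(current_frame), len(delta))   # 1000 pixels, but only 1 change to send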

The first commercial broadcast in the U.S. that used the new 1080i HD specification took place in 1996.  Full power U.S. TV stations stopped broadcasting the old NTSC signal in 2009.  Converter boxes were available that turned the new digital signals into something an old NTSC set could display.  But few people bothered.  It was easier to just replace their old NTSC capable TV with a new cheap HD capable TV.

The widespread and rapid acceptance of HD resulted in an unexpected convergence.  A connector cable specification called HDMI came into wide use in the 2003-2005 time frame.  It was ideal for use with HD TV sets.  And the 1080p HD standard turned out to work well for computer monitors.

As a result, HDMI cables have become the cable of choice for both computer and TV applications.  HDMI cables rated to handle TV signals at "4K" resolution, or even "8K" resolution, are widely available.  They are well suited for use with even the most ultra-high resolution computer monitor.

It took a while, but we are all digital now.  Unfortunately this brought an old problem back.  In the digital world we now live in, the picture and the sound are back to taking different paths.  If everybody along the way is careful then everything is fine.  But all too often the sound and the picture get out of sync.

It most often happens on a live show where one or more people are talking from home.  Zoom, or whatever they use, lets the sound get out of sync with the picture.  If the segment is prerecorded this problem can be "fixed in post".  That can't be done if it is a live feed.  And, even if it can be fixed in post, all too often nobody bothers to do so.

I find it quite annoying.  But lots of people don't seem to even notice.  Sigh!

Monday, June 14, 2021

Global Warming has Won

I had my first hands-on experience with a computer in 1966.   That's when I wrote my first computer program.  About a decade later I made the then bold prediction that "computers will take over the world".  And about a decade after that I opined that "They won.   Computers have succeeded in taking over the world".  Did I mean that some kind of Skynet AI would be running everything and people would merely be appendages?   No!  My definition of "taking over" was much more modest.

What I meant was that computers would be so deeply embedded into society, in ways so pivotal and important, that society could no longer function without them.   At that time a case could perhaps be made that my pronouncement was a bit premature.

But people now organize all aspects of their lives using a smartphone, a powerful computer wrapped in an attractive package.  And much else in our lives now depends on cheap, powerful, and ubiquitous computing happening behind the scenes.  So, all doubt as to the truth of my pronouncement is long gone.

In a parallel vein, experts have now been worrying about Global Warming taking over for several decades.  As with computers, for a long time it was some sort of vague concern about the future.  Bad things were definitely going to start to happen at some point.  It is my contention that we have now reached that "some point".  The future is now.  And that means that Global Warming has, in fact, won.

There is a case to be made that I am premature in making this pronouncement.  In fact, the case is stronger for my Global Warming pronouncement being premature than it was with respect to my pronouncement about computers taking over.  Nevertheless, I think I got my computer call right back then.  And I think I am right in making my Global Warming pronouncement now rather than later.

Back then scientists and other experts were little concerned about whether or not computers were taking over.  As a result, they spent little time considering the question and even less time debating a time line.  The same can not be said about Global Warming.  Scientists have been very concerned about the issue from the start.

And they have spent a lot of time and effort trying to estimate when various milestones would be reached.  The consensus position from the start has been that critical milestones will be reached "soon but not yet".  Since that initial estimate, how far away "soon" is has continued to shrink.  "Soon" is now about a decade away.

Bill Gates recently published a book on the subject of Global Warming.  He does a nice job of laying out the consensus position on how big the problem is.  I reviewed the book favorably in a blog post.  (See:  Sigma 5: How to Avoid a Climate Disaster.)  In the book Gates lays out a timeline for what needs to be done and when.  If the timeline is followed, Gates opines, a climate disaster will be avoided.

If the Gates plan is followed then all will be well by 2050.  In my review I opined that we would be lucky to have everything he lays out implemented by 2100.  And Gates tells us that there are things that we need to be doing right now.  But we aren't doing them.  Worse yet, I don't see any reason to believe that we will start doing them any time soon.

My thinking has tended to track that of scientists and experts, people like Gates.  But I have now changed my mind.  I did so abruptly.  It was as a result of reading the June 8, 2021 edition of the Seattle Times.

It contained a number of disturbing stories, all pointing in the same direction.  The stories concerned events that took place far from my home town.  So the Seattle Times was not the originator but merely the purveyor of this information.  And the information was truly scary.

One story concerned prolonged periods of extreme weather in specific places.  The events described were scary on their own.  But what made them even scarier was that they fit into a pattern that has been becoming more and more pronounced.  Let's start with the events described in the story.

The story reported on recent extremely high daytime temperatures in parts of the Middle East and Central Asia.  These temperatures often exceeded 125 degrees Fahrenheit.  That's considered dangerously hot even for traditionally hot and dry places.  The Middle East is known for being hot.  But there's hot and there's HOT.

There have always been places where it gets extremely hot.  But the places known for extreme heat have traditionally been desolate places where nobody lives.  Death Valley is an example.  People don't live in places like Death Valley because people can't survive prolonged stays in extreme heat.

The temperatures in the story were recorded in populated areas, areas known for their heat but not for extreme heat.  These are not oases but cities, even metropolises, hosting medium to large populations.  The fact that such large populations have lived there for a long time indicates that these places didn't used to feature such extreme temperatures.

One thing I learned from the story is that there is something called the 50 degree club.  50 degrees Celsius equates to 122 degrees Fahrenheit.  Most countries never manage a high enough temperature to qualify for membership in the 50 degree club.  Five Middle East countries joined the club in the past few days.  And in several of them, it wasn't even close.

A few days ago a town in Abu Dhabi hit 51 Celsius (123.8 Fahrenheit).  A town in Kuwait hit 50.9 (123.6).  A town on an island in Oman, where surrounding water should have moderated things, hit 50.1 (122.2).  So did a small town in Pakistan, a country that isn't even in the Middle East.  It's in Central Asia, normally a hot but not super-hot region.  A day later a town in the UAE hit 51.7 (125.1).  But wait, there's more.

As the altitude goes up the temperature goes down.  That means cities situated at altitude are cooler, usually much cooler, than otherwise similar cities located near sea level.  In spite of this, a town in Iran managed to hit 45.5 (113.9).  Not super-hot until you learn that this town is situated at an altitude of over 3,000 feet.  A town at altitude in Turkmenistan hit 44.7 (112.5).  A town at altitude in Uzbekistan hit 44.4 (111.9).  A balloon launched in Abu Dhabi had to ascend to 5,000 feet before the temperature dropped below 90 degrees Fahrenheit.
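
For anyone who wants to check the arithmetic, the conversion is Fahrenheit = Celsius x 9/5 + 32.  Here is a quick sketch that converts the Celsius readings quoted above (the place labels are just shorthand for the readings, not precise locations):

    # Celsius-to-Fahrenheit conversion:  F = C * 9/5 + 32
    def c_to_f(celsius):
        return celsius * 9 / 5 + 32

    # The Celsius readings quoted above.
    readings = {
        "Abu Dhabi town":        51.0,
        "Kuwait town":           50.9,
        "Oman island town":      50.1,
        "UAE town":              51.7,
        "Iran town (3,000+ ft)": 45.5,
        "Turkmenistan town":     44.7,
        "Uzbekistan town":       44.4,
    }

    for place, c in readings.items():
        print(f"{place}: {c} C is {c_to_f(c):.1f} F")

    # The "50 degree club" threshold itself:
    print(f"50 C is {c_to_f(50):.1f} F")   # 122.0 F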

These parts of the world are known for being hot.  But this is extreme heat.  Temperatures are running about 15 degrees Fahrenheit above normal.  That's a lot.  Closer to home, over the last few days we have seen temperatures ten or more degrees above normal in Wisconsin, Minnesota, North Dakota, and South Dakota.  Very high temperatures in late July or early August?  That would be expected.  Extremely high temperatures in early June and in multiple different parts of the world?  Something is going on.

If this were all that was going on, I would not be that worried.  But there's also this.

The most talked about driver of Global Warming is Carbon Dioxide.  The "Greenhouse Effect" that makes Carbon Dioxide important was first described nearly two hundred years ago.  With that in mind, C. David Keeling of the Scripps Institution of Oceanography set out to understand what was normal when it came to the level of Carbon Dioxide in the atmosphere.

The proper way to do this is to take a whole lot of measurements.  You want to measure it in a lot of places.  And you want to keep measuring it over and over so that you can see how it changes over time.  But at the time (1958) measurements of that kind were hard to do.  If he was going to be able to do it at all he was going to have to figure out the minimum he could do that would be informative.

He decided that taking the measurement in just one place, if he picked the right place, would have to do.  He looked around and settled on an atmospheric observatory high on the Mauna Loa volcano in Hawaii.  There he could measure clean ocean air far from industry or big cities.

And while taking measurements in one place might do, he had to repeat the measurement on a strict schedule.  He could manage a twice per day measurement, so that's what he did.  Those twice per day measurements from the Mauna Loa observatory continue to this day.  Others have started similar programs at various locations scattered around the world.  And we now have satellites gathering additional data.

Why did he do it?  It was not because he knew what he would find.  It was because no one knew what he would find.  No one had ever tried to do it before.  And there were literally no well established theories capable of predicting what the measurements would show.  It only took him a couple of years to learn something interesting.

The level of Carbon Dioxide varies with the season.  It is always high at the same time of year (May) and always low at the same time of year (roughly six months later).  It didn't take long to figure out that the pattern tracked the growing season in the Northern hemisphere:  plants draw Carbon Dioxide down all summer, and much of it returns to the air as vegetation decays over the fall and winter.  The explanation was obvious in retrospect.  But, absent any data, other explanations were plausible.

It took longer for another pattern to emerge.  The highs kept getting higher.  The lows kept getting higher too.  That was surprising.  Scientists knew that industrial activity threw Carbon Dioxide into the air.  But they also knew that plants pulled it out of the air and turned it into plant matter.  It was entirely possible that the two processes would balance each other and the average amount of Carbon Dioxide would stay roughly the same.
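
Here is a minimal sketch of how you can pull those two signals, a seasonal cycle and a long-term trend, out of a series of monthly readings.  The numbers in it are invented for illustration (the real record is the monthly Mauna Loa series published by Scripps and NOAA):  averaging each year's twelve readings shows the trend, and averaging each calendar month across years shows the seasonal swing.

    # Separating a seasonal cycle from a long-term trend, Keeling-curve style.
    # The numbers are invented for illustration; the real data is the monthly
    # Mauna Loa CO2 record published by Scripps and NOAA.

    monthly = {}   # (year, month) -> CO2 in ppm
    for year in range(2000, 2005):
        for month in range(1, 13):
            trend = 370.0 + 2.0 * (year - 2000)            # ~2 ppm/year drift upward
            seasonal = {5: 3.0, 10: -3.0}.get(month, 0.0)  # May high, October low
            monthly[(year, month)] = trend + seasonal

    # Annual means:  each year comes out higher than the last (the trend).
    for year in range(2000, 2005):
        annual = sum(monthly[(year, m)] for m in range(1, 13)) / 12
        print(year, round(annual, 2))

    # Calendar-month means across years:  May is high, October is low (the cycle).
    for month in (5, 10):
        mean = sum(monthly[(y, month)] for y in range(2000, 2005)) / 5
        print("month", month, round(mean, 2))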

Initially this was an interesting but not concerning development.  Maybe there was a process that played out over a longer time period.  Sun spots, for instance, follow an 11 year cycle.  Maybe there was something like that going on.

But year after year the trend was up, always up.  Various possible mechanisms that would drive levels back down were shown to either not exist or to not be powerful enough to overcome whatever was pushing the levels up.

As the "upward, ever upward" pattern got more pronounced scientists started to worry.  It became more and more important to try to get a much more detailed understanding of what was going on.  So they delved into every possible mechanism for increasing or decreasing the amount of Carbon Dioxide in the atmosphere.

They learned a lot.  A big volcanic eruption throws a lot of Carbon Dioxide into the air.  Various geological processes pull Carbon Dioxide out of the air.  But eruptions cause a spike that only lasts a year or so.  And geologic activity takes thousands of years to make a difference.  As the details got filled in and understanding deepened, concern increased.  The result was the IPCC.

The Intergovernmental Panel on Climate Change was founded in 1988 by the World Meteorological Organization and the United Nations Environment Programme, both U.N. agencies.  Its job was to consolidate and organize all the scattered efforts to understand climate and climate change.  Thousands of scientists contribute.  The IPCC periodically issues a report summarizing the current state of the art.

Five "assessments" have been issued.  The next one is due next year.  The IPCC itself does no research.  It collects, evaluates, collates, and consolidates the research of others.  Every report includes questions that need more work to answer and areas that are poorly understood.  But with each report the level of understanding keeps improving.

Old questions keep getting replaced by new ones as the understanding deepens.  The new questions tend to be more focused.  A general understanding leads to questions relating to one or another specific area that is still not well understood.  An understanding of larger effects leads to questions about smaller effects.

The picture keeps getting clearer and more detailed.  But the improved clarity and detail leaves the overall situation unchanged.  The environment is warming up.  It is warming at an increasing rate.  This warming is causing greater and greater disruptions.  More contentious, but only outside scientific circles, is the conclusion that the largest driver of this warming trend is human activity.

That last conclusion gores many oxen.  And many of these oxen are wealthy and powerful.  They stand to lose a lot if large mitigation measures are undertaken.  They have been pushing back, hard, since before the first IPCC report was issued in 1990.  Since the science and the data are solid they tend to depend heavily on FUD:  Fear, Uncertainty, and Doubt.

With that background in place, let me turn to the other story that caught my eye in the newspaper that day.  It reported on record high levels of Carbon Dioxide being found in the latest measurements.  Given the upward trend that has now been on display for more than 60 years, that would normally be seen as disappointing but not surprising.  But context is everything.

We are just now emerging from the economic disruption caused by COVID-19.  For more than a year people have stayed home.  They also bought less, drove less, flew less.  As a result, business dried up.  Manufacturers cut production back drastically.  Shipping goods became much more difficult, especially if a border crossing was involved.  The usual cast of characters responsible for Carbon Dioxide emissions all took big hits.

As a result, Carbon Dioxide emissions fell by 5.8% in 2020.  That should have resulted in good news on the Global Warming front.  But Carbon Dioxide hit 419 PPM in May of 2021.  That's a new record.  It's only 2 PPM higher than it was a year ago.  But, given how much COVID decreased economic activity, one would think it would have gone down significantly.  The news gets worse.

The currently agreed upon target of Global Warming mitigation plans is a 7.6% reduction in emissions every year from 2020 to 2030.  If achieved, this is supposed to hold the increase in global temperature to 2.7 degrees Fahrenheit (1.5 degrees Celsius) or less.  Swinging a COVID wrecking ball at the economy only caused emissions to decrease by 5.8%.  And the decrease was so transient that Carbon Dioxide actually increased from May of 2020 to May of 2021.
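
The 7.6% figure is worth running through a calculator.  Here is the back-of-the-envelope compounding arithmetic (nothing more than that, and certainly not a climate model):  cutting emissions 7.6% every year for ten years leaves only about 45% of the starting emissions, while the COVID year managed a one-time cut of 5.8%, less than a single year's worth of the target.

    # Back-of-the-envelope check on the 7.6%-per-year reduction target.
    # Simple compounding arithmetic, not a climate model.

    annual_cut = 0.076    # 7.6% reduction each year
    years = 10            # roughly 2020 through 2030

    remaining = (1 - annual_cut) ** years
    print(f"Emissions left after {years} years of {annual_cut:.1%} cuts: {remaining:.1%}")
    # -> about 45% of the starting level, i.e. a cut of more than half

    covid_drop = 0.058    # the one-time drop in 2020
    print(f"COVID-year drop: {covid_drop:.1%} (less than one year's worth of the target)")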

This is the clearest evidence possible of just how hard it will be to stop, or even substantially slow, the rise of Carbon Dioxide in the atmosphere.  Couple this with the fact that there is still little will, political or otherwise, to implement the draconian measures that would be necessary.  It's just not going to happen.

That means that the "how bad is it right now" question is critically important.  The stories I described above came from a single newspaper published about a week ago.  I ended up putting the writing of this post on hold for several days.  During that period story after story after story of extreme bad weather hit the news.  Bad weather events are getting almost as common as mass shooting events.  Each type of event has become so common that only the worst ones even make the news.

It has gotten to the point that only those who go out of their way to deny or avoid the increasingly obvious signs can remain Climate Deniers.  Mother Nature is literally rubbing our faces in Global Warming on a daily basis now.  If it's not droughts, it's wildfires and dust storms.  Or it's the flip side:  torrential rains, floods, and the like.  The weather has changed to the point where places that were hospitable to occupation are no longer livable.

Right now, only a few places have seen conditions change to that extent.  But you would have been hard pressed to find any such places even as little as ten years ago.  Things are still manageable.  But only if things don't continue to get worse.  But getting worse is already baked into the cake.

One of the things we have learned since the first IPCC report is just how critical the oceans are.  When scientists first started studying the life cycle of Carbon Dioxide they quickly discovered how little they knew.  They could roughly estimate how much Carbon Dioxide human activity was throwing into the air.  And they had a reasonable idea of where about half of it ended up.  But where the other half went was then a complete mystery.

They now know that the missing half goes into the ocean.  And it is making the ocean more acidic.  The ocean is ginormous.  But there is enough Carbon Dioxide involved that the effect is easy to measure.  And the effect is bad.  Lots of sea creatures depend on shells or bones or other hard parts.  More acidic water damages those hard parts and makes them harder to grow in the first place.

The negative impact of this increased acidity is already noticeable.  It's bad now, but not yet really bad.  Something that is already much worse is heat.  The surface of the earth is already noticeably warmer.  And that has translated into the surface of the ocean getting noticeably warmer.

The most spectacular result of this is more strong (category 4 and category 5) hurricanes.  The magic number is about 26 degrees Celsius (roughly 79 degrees Fahrenheit).  If the water a hurricane is traveling over is that warm or warmer, the hurricane quickly increases in intensity.  We now see hurricanes going from category 2 to category 4 in less than 24 hours.  No one thought that was even possible a decade ago.  Now meteorologists have seen it happen several times.

The ocean is full of currents.  They cause surface water to "downwell" into deeper parts of the ocean and deep water to "upwell" to the surface.  It turns out that corals are very sensitive to temperature.  Warmer surface water has heated the water that surrounds coral reefs.  That has caused "bleaching" events around the world.  A coral's colors come from tiny creatures that live in the coral's tissue.  Warm water causes the coral to expel them, leaving the white skeleton material behind.

It turns out that there is an effect that is even more perverse than coral bleaching.  Ocean water circulates.  It downwells here.  It upwells there.  Over time the temperature of the surface water gets propagated to deeper and deeper parts of the ocean.  Until recently all the upwelling water was old, cold water.  That tended to keep the earth's surface cooler.  But not anymore.

Global Warming has been going on long enough that the upwelling water is now warmer than it used to be.  And that means that the oceans no longer cool the surface as much as they used to.  There is a general term for what is going on here.  It's called inertia.

Lots of additional warming is baked in by this inertia.  It will happen even if we wave a magic wand and cause the level of Carbon Dioxide in the air to instantaneously drop all the way back to pre-industrial levels and stay there.

In terms of the weather, we are living our best life right now.  All those storms, all those droughts and floods, all those high temperature records.  They constitute our best life.  Going forward, it's going to keep getting worse no matter what we do.

No one has done the computation or modeling to estimate how much worse it will get even if we fix things immediately.  But there is no doubt that things will get considerably worse.  And they will keep getting worse for 40-200 years.  That's how long it would take for the bad things that are already built in but haven't happened yet to work themselves out.

But there is no magic wand.  We are not going to immediately return Carbon Dioxide levels to pre-industrial levels.  There are powerful and well established groups that know how to fight most of the activities Gates outlined in his book.  The political will to oppose these powerful groups is building.  But so far that will is only strong enough to win a skirmish here and a skirmish there.

A court ordered Royal Dutch Shell to drastically reduce greenhouse emissions.  Exxon-Mobil, the giant oil company, now has three "green" people on its board.  But the Shell court case might get overturned or cut back drastically by an appellate court.  And the Exxon-Mobil board members are far from a majority.

There are rays of hope to be found here and there.  But these are candles of hope when what we need are searchlights.  Substantial progress is years away.  And that means we need to shift our focus.  Most current effort is focused on avoidance.  Gates' book tells us how to avoid a climate catastrophe.  It tells us little about how to live with one.

It is time to shift our attention to mitigation.  If it is possible, we should do both.  But so far, with most of our effort devoted to avoidance, the Global Warming catastrophe has only been slowed, not stopped.  Fortunately, Mother Nature is weighing in more and more loudly and clearly each day.

Ten years ago it was relatively easy to avoid noticing any of the effects of Global Warming.  That strategy is now on the ropes.  Mother Nature is already making the case that Global Warming is real, and that it is bad.  As time goes by that case will only get stronger and harder to ignore.  The need for people to articulate the case will diminish.  But Mother Nature lacks the ability to provide direction on the subject of mitigation.

I, however, have posted on the subject of mitigation before.  I did it all the way back in 2018.  Here's the link:  Sigma 5: Global Warming.  Mitigation is possible.  That's good because we have no choice.  We are going to be forced to find a way to learn to live with the effects of Global Warming.  After all, Global Warming has won.

Sunday, May 16, 2021

Israel: Chapter 2021

 The Israelis and the Palestinians are at it again.  It is only a slight exaggeration to say that more than 2,000 chapters could be written about the various conflicts that have erupted in the area.  Israel is in the area that serves as the land connection between Europe, Asia, and Africa.  A lot of people have gotten into a lot of fights trying to move to or through the area.

As a result, the region has a storied history.  How storied?  As is commonly reckoned, Seattle, the city I live in, has a history that dates back all the way to about 1850 AD.  (This ignores the history of the various peoples that were there when the White man arrived, but cut me some slack here.  I'm drawing a contrast.)  History in the area in and around Israel dates back to something like 1,000 BC.

And, if we went all the way back to this time, we would find here pretty much what we would find in most of the rest of the world.  Tribes wandered around.  One tribe would fight with another.  Sometimes the invading tribe would completely displace the people occupying the land.  Or the occupying tribe would succeed in fighting the invaders off.  Or something in-between would happen and they would end up intermingling.  In those days possession was not nine tenths of the law.  It was the entirety of the law.  Nobody back then worried about legal niceties.

At some point the Israelite tribes, tribes that adhered to the Jewish religion, moved into what was then called Palestine.  But then they got kicked out.  But then they came back.  This is all discussed in the Bible.  During the New Testament era, an era that started around 33 AD, the area was governed by Rome but the bulk of the population consisted of Jews who were not Roman citizens.

But that didn't last.  In 70 AD the Romans got mad at the Jews, destroyed their main temple, and kicked them out of Jerusalem.  The eventual result was the "diaspora", the exodus of Jews from Palestine to small communities scattered all over the world.  It took a long time but eventually there were few Jews left anywhere in Palestine.

Then along came the Crusades.  They had nothing to do with the Jews.  The idea was to wrest control of Palestine, or at least the "Holy Places", from the Arab people who lived there, and give that control to Christians.  In other words, the plan was for the official religion of Palestine to shift from Islam to Christianity without coming anywhere near Judaism in the process.  The Crusades eventually failed.  And Jews failed to return to Palestine.  The exodus continued unabated.

That brings us to the twentieth century.  At the beginning of the century the population of Palestine was overwhelmingly Arab, and the Jewish population was still minuscule.  But Pogroms (see the musical "Fiddler on the Roof" for some insight into how Jews were treated in Russia at the time) and other forms of oppression got Jews to thinking that they should reclaim their ancestral homeland so they would have some place to retreat to.

"Next year in Jerusalem" was a popular Jewish lament at the time.  In spite of this, little was done.  Then World War I happened.  And the Ottoman Empire, the political entity that controlled Palestine and much else, was on the losing side.  After the War ended, the winners happily carved up all of the land the Empire had controlled and parceled it out to themselves as spoils of war.

Great Britain was on the winning side.  Among other things, it got "The Palestine Mandate".  That means that they got to set the rules in Palestine.  They saw no reason to make it easy for Jews to immigrate.  So, while they made it possible for Jews to move to Palestine, they didn't make it easy.  Among other things, they didn't want to stir the locals up too much.  And the locals were very hostile to Jewish immigration.

So, by the start of World War II there was a significant Jewish population in Palestine, though still a relatively small minority.  Even so, for the first time in roughly 1,500 years, a substantial Jewish community was living in Palestine.  And, of course, before and during the War the Nazis set out to exterminate all European Jews.  They came shockingly close.

After the War Jews around the world got serious about turning Palestine into Israel, a Jewish State.  (See the book and movie "Exodus" for one of many tactics they used during this period.)  The Palestinians were very unhappy with the Brits about all this.  But the Brits and other European countries were embarrassed by the whole Nazi "Final Solution" business.  Ultimately the British decided the area was more trouble than it was worth.  They turned control over to the Jews and, for the most part, washed their hands of the whole thing.

The Arab countries surrounding Israel were very unhappy about this.  They started a War with the objective of wiping out the Israeli forces before they even got a chance to get themselves organized.  They would then change the name from Israel back to Palestine and return control to the Arab population.  In a surprising turn of events the small Jewish population with its small military was able to defeat the larger and better equipped Arab armies.  Now for an aside.

As I noted above, people have been invading places that already have people in them forever.  Once a certain scale is reached with respect to one of these invasions it gets called a war.  If you go back far enough it turns out that almost every nook and cranny of Earth has been fought over.  (The only exception I can think of is Antarctica.)

In some cases a particular piece of land has only been fought over a few times.  In most cases it has been fought over many times.  As a result of this, a case can pretty much always be made that the current occupiers of a piece of land are not the rightful owners.  Some other group, perhaps several other groups, have a reasonable case for why the land rightfully belongs to them.

Mankind has built up a lot of experience with this sort of thing.  Often the same group of people will have been on the winning side of one conflict and the losing side of another.  In situations like this, informal conventions grow up.  One such informal convention is called "The Laws of War".  It sets out expectations for what the winners can expect and what the losers can expect.

The Laws of War cover many contingencies.  But I am going to consider only one.  What happens if a foreign army invades but is successfully and decisively repulsed by the current occupants?  The Laws of War say that in this situation all prior potential claims to the land in dispute by the invaders and their supporters are washed away.  War settles all ownership questions in favor of the current occupants.

That's what happened in the Israeli War of Independence, which was fought in the late '40s.  An army made up of contingents from the several Arab countries invaded Israel.  The intention was to drive the Jews out and return control of what the Jews had named Israel to the local "Palestinian" (Arab) population.  But the invaders lost.  And that loss meant that the outcome of the war conveyed clear title of Israel to the government the Jews had set up.

Prior to this the various Arab countries involved had adhered to the Laws of War just like everybody else.  For instance, there were no "Sharia Law" provisions that contradicted the precepts laid out there.  Arab had been going to war against Arab and non-Arab for millennia by this point.  In all that time they never found a need to create an alternative to the Laws of War that were in use elsewhere.

But for various political reasons the Arab countries surrounding Israel pretended there was some kind of "Israel" exception.  Part of it, I am sure, is that they lost a War that everybody expected them to win easily.  A look at the size, level of training, equipment, etc. indicated that the Arab side was strongly favored.  So, the loss was an embarrassing one.  That left the several Arab governments involved looking for an excuse.

One quickly came to mind.  The loss wasn't a loss.  It was just a temporary setback.  The Arab armies would soon be back and they would be triumphant this time.  Once this idea took hold, the various countries involved told the Arab population to flee from Israel to "temporary" refugee camps.  The camps were only temporary because the Jews would be gone any day now.

And, since their situation was temporary, these refugees were kept separate from the regular population.  After all, they would soon be returning to "Palestine", Israel, but under Arab control.   But these refugees have never been able to return.

The "temporary" camps became permanent.  But since they were always on the verge of no longer being necessary, at least according to the various Arab governments, no effort was ever made to upgrade them.  This created a large group of permanent refugees with a grudge.

By now, generations of "Palestinian Refugees" have been born and raised in these camps.  But the fiction that Israel is a temporary country has persisted in Arab politics.  Everybody now knows that it is a fiction.  But it persists because it serves the needs of the various Arab countries for it to persist.  The fact that it is a fiction doesn't inhibit it from gumming up the works when it comes to making actual progress on many fronts.

Surprisingly, this fiction on the part of Arabs that Israel is a temporary country that will soon be gone also serves the needs of Israeli politicians.  The supposedly imminent demise of Israel means that Israel faces an existential threat.  And if you are facing an existential threat, the Laws of War permit taking extreme measures to overcome the threat.

The political history of Israel neatly divides itself into two periods.  There is the "Labor" era and the "Likud" era.  For the first several decades of the country's existence the Labor party dominated its politics.  Labor believed in trying to come to an accommodation with the Palestinians.  "Palestinian" has come to mean the Arabs living in Israel or the "Occupied Territories".  (It also applies to the refugees living in the camps, but I am going to ignore that alternate usage from here on.)

The problem is that they never found a willing partner.  Fault can be found with this, that, or the other proposal the Labor Government of Israel made.  But the Palestinians never seriously engaged.  They used the "Israel is a temporary state" idea as justification for this behavior.  As a result, Labor found that, for the most part, they were negotiating with themselves.

History never stands completely still.  During this period two additional Wars that are of interest to this discussion were fought.  Their names are the "Six Day" War and the "Yom Kippur" War.  If Israel had lost either, the country would have gone out of existence.  In the interests of brevity I am only going to make a few observations about these Wars.  The first happened in the '60s and the second in the '70s.

In the Yom Kippur War the Arabs were clearly the aggressors; Egypt and Syria attacked by surprise.  So, according to the Laws of War, this War again confirmed Israel's right to exist and washed away any competing claims.  The situation with respect to the Six Day War is more ambiguous.  The Arab armies were poised to strike but Israel fired the first shot.  So Israel couldn't use the "they shot first" justification.  But the Laws of War also address this situation.  The fact that Israel shot first is not enough by itself to justify an ownership change.

One other thing:  Israel gained control of a lot of land as a result of these Wars.  Israel then tried to do a "land for peace" deal.  They "voluntarily" relinquished some but not all of the gains.  The Palestinians/Arabs did not respond with "voluntary" countermoves, or pretty much any moves at all.

As a result of this long and spectacular lack of success, political leadership in Israel eventually shifted from the more moderate and accommodating Labor party to the hard-line Likud party.  It has not shifted back since.  In the Likud era interest in coming to peace with Palestinians and Arabs has waned considerably.

For several decades now sentiment on all sides has been locked in place.  Israel is militant about defending itself from various slights, both real and imagined.  It has slowly clamped down on Palestinians living in Israel.  In the early days the Jews saw Palestinians as a source of cheap unskilled labor.  The Jewish population was not large enough to do all that needed doing.  So Palestinians were a critical resource.

This has slowly changed.  Palestinians have chafed under Jewish restraints.  There are the three "P"s:  pay, power, and prestige.  Jews get the high "P" jobs.  Palestinians get the low "P" ones.  As resentments have built, Israel has learned to depend less and less on Palestinian labor.  That has allowed them to crack down even harder.  Officially, the crackdowns are all in response to various Palestinian "provocations".

But what's going on is a vicious cycle.  The Palestinians agitate.  The Israelis treat the Palestinians badly.  Some Palestinians use this as an excuse to do something bad.  The Israelis retaliate.  Then they crack down further.  Wash.  Rinse.  Repeat.  There was an interest by Israelis during the Labor era in trying to break the cycle.  But there was little or no effort to reciprocate on the Palestinian side.  Eventually Israelis in large numbers gave up and put Likud in power.

In the Likud era Israel has engaged in several military offensives outside of its borders.  The anti-Israel factions arrayed against it have consistently gotten lots of help from the various countries that surround Israel.  These "incursions" have attempted to punish those outside groups, or at least diminish their capabilities, military or otherwise.  The degree of success has varied considerably.  Needless to say, this has made various groups of Arabs very unhappy with Israel.

And that finally brings us to the current situation.  Both sides have arguments as to why "the other side started it".  But tit for tat has been going on for a long time.  There is always some handy event that can be pointed to when justification is needed.

The fundamental problem is that the conflict remains unresolved.  Also, significant motion toward a real solution has been absent for a long time now.  Someone on one side does something provocative.  That stirs up the other side and things escalate.  Then things either simmer down or they heat up.  As I write this things are still heating up.

Rather than going into the "he said - she said" that surrounds the current escalation, let me just note that escalation currently serves the needs of both sides.  There is a long simmering conflict between various Palestinian factions.  It suits one faction's needs that a lot of rockets are flying into Israel.  The same thing is true on the Israeli side.  It serves the needs of one Israeli political faction to be seen "fighting a Palestinian uprising".

On paper, it is a one sided conflict, assuming other countries stay out of it.  (This is a likely outcome, at least if the conflict does not significantly exceed its current level of intensity.)  The Israelis have the military capability to do serious damage to the Palestinians.  The Palestinians lack a similar capability to do serious damage to Israel.  But the Palestinians continue to lose battle after battle while making steady progress toward eventually being able to win the war.

This is because Israel is in big trouble over the long term.  In the first few decades after the establishment of the country its Jewish population increased quickly.  A lot of Jews from the rest of the world moved to Israel and settled there permanently.  But this population influx slowed to a trickle many decades ago.  Meanwhile, the Palestinian population has continued to increase steadily.

This "Population Bomb" problem for the long term stability of Israel has been well known for a long time now.  And for a long time it was thought to be Israel's only serious long term problem.  But Israel now has second problem to go alongside the first one.  For a long time Israel's government was honest and competent.  It has lost considerable ground on both fronts in the past couple of decades.

Israel is a multi-party parliamentary democracy.  For a long time two parties (Labor and Likud) controlled most of the seats in the Knesset, the Israeli Parliament.  One of the two parties would prevail in the election and the other would form the backbone of the opposition.  Neither party might command a clear majority, a necessity for forming the government.  But they only fell short by a little.  They could usually rope in a small party without too much trouble and govern pretty much as if they actually did hold the majority.

This in turn allowed a party to create and implement a long term political agenda.  They would have to make concessions to the small party.  But the concessions were manageable.  The small party got to punch above its weight but mostly things went the way the large party wanted them to.  And these coalitions would hold together for long periods of time.  In short, the system worked.

But the percentage of seats going to the two largest parties has been in long term decline.  That means that more votes need to be corralled in order to form a governing coalition.  The small parties know that this gives them a stronger bargaining position.  They have taken advantage of this to demand larger and larger concessions.  That in turn has resulted in the dominant party retaining less control. It has also meant that coalitions fall apart more and more quickly.

A successful Prime Minister must be more of a wheeler-dealer than a leader.  And keeping policies stable for any length of time becomes harder and harder.  Everything becomes about the short term.  And it becomes about doing what it takes to keep the coalition together for a little longer.  This further incentivizes members of the small parties to misbehave, so they do.  All this gets in the way of good government.

Benjamin Netanyahu has been the Prime Minister for a long time.  He is the leader of Likud.  But the length of time he has managed to remain in office is misleading.  For the past few years frequent elections and a near constant reshuffling of party coalitions have become the norm.  They are now so common that it takes a score card to keep track of who's in, who's out, who is in charge of which ministry, and what the price of their support was.

Not surprisingly, he has been investigated for corruption several times now.  He has shown consummate skill at putting together coalition after coalition.  But that has left him little time for anything else.  It is natural to wonder whether or not he has stepped over the line a time or two.

He has been on the brink of going on trial for corruption for some time now.  The latest try at getting him into court and in front of a judge is supposed to happen any day now.  This, and other circumstances, have weakened him to such an extent that he has not been able to keep a coalition together for any length of time.  The result is a Parliamentary Crisis leading to an election every few months.

There has been no recent dramatic shift in voting patterns, so Netanyahu keeps holding on by a thread.  But even he has proved unable to form a government after the last election.  The leader of the biggest opposition party was recently given the task after the clock ran out on Netanyahu's chance.  That's where things stood when the latest clash suddenly erupted "out of nowhere".  Now, hey - there's a war going on.  Netanyahu likely thinks that's good news for him.

Even absent a War (or whatever you want to call the current clash) Israel has become less and less governable.  And the constant state of not-quite-peace makes it an unpleasant place to live.  So, it is having trouble attracting and keeping top talent.

And the Palestinians keep getting more and more militant.  It really doesn't matter how strong their case is.  It's strong enough to convince them.  And, of course, the case justifying Israeli behavior is strong enough to convince Israelis.

The Palestinians pay a steep short term price for the resulting escalation.  But the Israelis pay a long term price for the same escalation.  This is not a sustainable situation.  At some point something is going to give.  Lots of people, including the U.S. and Egyptian governments, are trying to get things settled down again.  In the short run that's the humanitarian thing to do.  But I'm not sure it is the right thing in the long term.

The current pattern of behavior by both sides has led to the long term gridlock that characterizes recent history.  We have short periods of a lot of misery separated by longer periods of not quite so much misery.  And nothing gets settled, ever.  So why don't we try something else?  Let both sides go at it.

If I had an alternative that I thought had a chance of yielding a solution both sides would be willing to live with, I would not entertain this idea.  But I don't.  And I don't see any ideas from anyone else or from anywhere else that look good either.

The smart money says that letting both sides go at it for an extended period of time would mean a lot of bloodshed by Palestinians.  It would also be bloody for the Israelis, but to a far lesser extent.  But the vaunted military might of the Israelis has not always delivered victory.  They have even suffered at least one humiliating defeat.  The smart money says that they can hurt the Palestinians badly at little cost.  But the smart money has been wrong before.

Maybe the Palestinians will be forced to change their position and decide it is finally time to cut a deal.  Maybe the Israelis will find military action expensive and unproductive and decide to cut the Palestinians more slack.  Either outcome would change the status quo.  And it might result in less bloodshed over the long term.

We know what will happen if things settle down after a few days with nothing really decided.  The status quo will continue.  The status quo is not working in the long term for anybody.  Unfortunately, it is working in the short term for powerful factions on both sides.  That's why it is the status quo.