Sunday, January 10, 2021

The Rehabilitation of President Carter

"To everything there is a season, and a time for every purpose under heaven."                         - Ecclesiastes 1:3

It's from the Bible.  It was subsequently turned into a folk song that was very popular in my youth.  It is a poetic way of saying that things go in and out of fashion, then back in.  Jimmy Carter (he insisted everyone call him "Jimmy" rather than James or some other, more formal option) was our thirty-ninth President.  (He was President a long time ago, but he is still with us.)  His Presidency has generally been considered a failure since shortly after he left office in 1981.

Since then, he has become beloved, not for anything he did as President, but for what he has been doing in the forty years since he left office.  Most people only know him for this later period.  They know little about him as POTUS.  But I am going to focus on Carter as President.

Until extremely recently, that was a subject no one wanted to talk about.  But a few months ago Jonathan Alter put out a book called His Very Best:  Jimmy Carter, A Life.  Okay, so some political wonk puts out a "politics" book.  But it's not just Alter.

CNN recently aired a two-hour movie called Jimmy Carter:  Rock & Roll President.  That's surprising because Carter is known for his piety.  He's still married to the same woman.  He still lives quietly in the small town he was born in, Plains, Georgia.  That is not a standard "Rock 'n Roller" profile.

I have seen the CNN show.  It makes a compelling case that the title of the show is accurate.  I have purchased but not read the Alter book.  My head just has not been in the right space to tackle it.  But I promise you I will read it at some point.  And I may even review it in this blog.

On the other hand, I lived through the Carter Presidency.  So this post will be my perception of Carter's term rather than a deeply researched look at the subject.  And we can't start at the beginning.  We need to start before the beginning.  And that's with Richard Nixon.

Nixon was very familiar with failure and the prospect of failure.  He was an up-and-coming pol in '52 when the extremely popular Eisenhower tapped him to be his running mate.  Then it emerged that he had a slush fund, a fund put together by some of his wealthy supporters to cover "miscellaneous expenses".

It was your standard "unvouchered funds" account that shows up so frequently in spy novels.  Nixon could spend the money however he wanted and did not have to get anybody's approval to do so.  Additionally, no records were kept concerning what the funds were spent on.

 "It was a different time", as they say.  And this was considered such a big deal that it could have gotten Nixon kicked off the ticket.  Nixon redeemed himself by giving his famous "Checkers" speech.  It is a brilliant piece of work.  I'll skip over the details but I recommend you find it on the internet.  it is well worth the roughly twenty minutes it takes to watch.

The goal of the speech was to justify the fact that people had given Nixon gifts.  He ended the speech by talking about a dog that someone had given his daughters.  He vowed that, no matter what happened, he was not going to give the dog back.  The name of the dog was Checkers, and that's where the name of the speech came from.  The speech worked.   Eisenhower kept Nixon on the ticket.  And the pair served two terms.

Then Nixon lost to Kennedy in '60 when he ran for the top spot.  He again lost in '62 when he ran for Governor of California.  But he found his way back from the political wilderness to win the Presidency in '68.  You have to admire his tenacity and the political skill that this almost impossible move demonstrated.

What he was ill equipped to handle was success.  He somehow managed to become very popular as President.  I was there and he did NOT rise in my esteem.  But I was in the minority.  And that brings us to the '72 Presidential campaign.

Nixon had everything going for him and he used his advantages wisely.  First, he took nothing for granted.  He assembled a top tier campaign organization.  He ran a smart and very aggressive campaign even though he was running against a weak opponent.  He raised what was then considered a shitload of money.  (Now, it would be considered pocket change.)  And it turned out that it was the money that did him in.

His campaign staff, the "Committee to Reelect the President", abbreviated CRP by the staff and CREEP by everybody else, laid out a spending plan.  Everything was generously funded.  No stone was left unturned.  The problem was that they still had money left over.  It seemed a sin to waste it.  (Again, it was a different time.  Nixon never considered just pocketing it.)

So, everybody tried to think up schemes on which to spend the extra money.  There was no need to eliminate even the most harebrained ones.  As a result, the "plumbers" ("we find and fix leaks") were organized and funded.  Their job was to spy on the opposition.

Famously, as we learned a couple of years later, these plumbers broke into the offices of the Democratic National Committee looking for secret plans the Nixon people didn't already know about.  The DNC offices were located in an office building that was part of a group of buildings known collectively as "The Watergate Complex".  If Nixon had raised less money, or if his Democratic opponent had been stronger, "Watergate" would probably never have happened.

The Watergate scandal played out on TV news over several years.  So, it made a strong impression.  Nixon resigned and was replaced by Ford, a good and decent man.  But Ford ended up with a stain on his reputation.  He pardoned Nixon.  Most say that was the problem but I think it's more subtle.  I think the problem stems from the way he pardoned Nixon.

Ford, for reasons I am not going to get into, had to be confirmed by the Senate.  There he promised he wouldn't pardon Nixon.  Then he assumed office and continued to say in no uncertain terms that he would not pardon Nixon.  Then one day, out of the blue, he pardoned Nixon.  I understand the thinking behind his decision to pardon Nixon.  I disagree with it but I understand it.  And it was his decision to make.

But I think it was completely wrong to pardon Nixon without laying the groundwork by first publicly entertaining the possibility of pardoning him.  That would have set off a political firestorm.  But that's okay.  If you make a decision that you know will be unpopular you should expect to take some heat over it.

I'm sure Ford did not telegraph his decision because, for various reasons, he wanted to avoid the heat.  Or at least to put it off until after the deed was done and couldn't be taken back.  If an objective was to reduce the amount of heat he was ultimately subjected to, he failed.  And the combination of the pardon and the way he handled it was enough to move Ford out of the "nice guy - straight arrow politician" category and into the "standard sleazy politician" one.

And that's the runup to the '76 Presidential campaign.  Carter succeeded in selling himself as a nice guy, straight arrow politician.  It was only a few years after Watergate, and it was an even shorter time since the Nixon pardon, so the public was yearning for what Carter was selling.  As a result, he won.  And that brings us to the heart of the issue, his term as President.

Going in, he had a hard reputation to live up to.  But he worked to live up to it.  One thing that I didn't understand at the time was how unprepared he was for the job.  He had been Governor of Georgia when he ran.  The experience of being Governor stood two very different people in good stead, namely Ronald Reagan and Bill Clinton.  So, why did they succeed while Carter failed?

Reagan had been the Governor of California.  California is an activist state with a large, sprawling bureaucracy.  It takes a team to run something that big.  Reagan had successfully assembled a team that was large enough for him to succeed as an administrator.  It takes an even bigger team to run the U.S. government.  But he had the basic idea down pat when he entered office.

Georgia is a southern state.  They believe in small government and a light regulatory hand.  That meant that the Georgia bureaucracy was easier for one person with a small staff of assistants to manage.  And that staff, nicknamed "the Georgia mafia", was pretty much all Carter brought to Washington D.C.

Unfortunately, he had few connections and little experience in putting together the large team necessary to manage the Federal Government.  He was never able to get the hang of it.  He was known as a hands-on micromanager.  The Federal Government is too big and too complicated for that to work.  That left him isolated, with few of the connections he needed to succeed.

Usually, when an Administration leaves office a lot of people stay behind and stay on in some role or another.  The Carter Administration left very little behind because they never integrated themselves into the network of connections and relationships that is the lifeblood of D.C.

Clinton was from a southern state that is even smaller than Georgia.  So, on paper he was in an even poorer position to become successful.  But Clinton was a champion networker.  He found a way to meet everybody and to know, or know something about, everybody.

Like Carter, Clinton brought only a small team from Arkansas to D.C.  But he was quickly able to expand it by making use of all of the connections he had developed.  He was able to put together a team that was still very much his team.  But its members were also able to plug themselves into the D.C. network.  As a result he left behind a lot of Clintonites who continued to be powerful and successful long after he left office.

Back to Carter and his time in office.  Most people have forgotten just how consequential his term in office was.  We have perhaps had a surfeit of consequential Presidents since.  And a lot of people have the excuse that they hadn't even been born yet or, if they had been born, they were a small child at the time.  Others have, for one reason or another, blanked the period out.  Let me give you some examples of just how consequential his term was.

The energy debate started in earnest under Carter.  It was initially of primarily academic interest, but that changed quickly.  At the time the U.S. economy was heavily dependent on cheap oil.  Europeans, for instance, drove smaller, more fuel efficient cars than Americans.  Why?  Because due to high taxes and government policy, gas was twice as expensive there.  American drivers could afford to drive "gas guzzler" cars, because gas was cheap.

Then OPEC flexed its muscles.  For a long time Middle East oil was controlled by large American, British, and French companies.  Not surprisingly, they favored low prices.  But gradually the Arab countries gained control of these companies.  Then they formed OPEC.  But it was pretty toothless in the early years.  Then these same countries got mad at the U.S. due to the '73 Yom Kippur War, in which Israel yet again dispatched a coalition of Arab armies with substantial American help.

The Arab members arranged an oil boycott against the U.S.  This put the U.S. economy into a tight squeeze.  It was mostly for show.  Oil is a "fungible commodity".  That means that oil is oil is oil.  Although the U.S. was cut off from Middle East oil, there was lots of oil available elsewhere, including in the U.S.  All the U.S. had to do was to shift suppliers, which it did with little difficulty.

But various people took advantage of the optics.  And the press fell down on the job.  So various crises, some real, some phony, occurred.  A real effect of all this chaos was that Arab countries were able to jack up the price they could charge for their oil.  In reality, various short term disruptions were quickly dealt with.  But the "crisis" atmosphere that prevailed allowed American oil companies to substantially increase their profits.

And Carter got caught up in one of these crisis cycles.  OPEC raised prices substantially early in his term.  All parts of the U.S. oil industry took advantage of this to increase their leverage and their profits.  All of a sudden, we had gas lines everywhere.  And that panicked people into lining up even if their tank was two-thirds full.  Market manipulation and price gouging were rampant.  But Carter never developed an effective strategy for dealing with this.  And lots of people were happy to send the blame his way.

Carter successfully diagnosed the situation as pointing to a need for the U.S. to reduce its dependence on fossil fuel.  Global warming was not a thing at the time so it didn't play a part.  But his diagnosis of the long term problem was completely accurate.

The biggest problem was that he didn't have any alternatives that would be effective and that could be brought online quickly.  For instance, he put solar panels on the roof of the White House.  But solar panels, while being a nice symbol, were not practical at the time.  (They are now.)

His other initiatives were more practical but also very unpopular.  One thing that would help in the short term would be for everybody to turn down the thermostat in homes and buildings.  It would only require people to dress warmer, wear a sweater, for instance.  A more long term solution would be to increase the efficiency of cars, appliances, buildings, etc.

These, and other ideas, made a lot of sense from a cost effectiveness perspective.  But no one wanted to hear of it.  And there were many entrenched interests that favored low efficiency.  They cranked up their respective PR machines and unloaded.

Carter's response to the largely manufactured "oil crisis" was monumentally unsuccessful, but he did lay a foundation that others were later able to build on.  And Reagan started the GOP trend of blindly opposing this sort of thing.  He publicly had the solar panels removed from the White House.  He rolled back other energy efficiency measures that Carter had begun.  This would not be the only issue where Carter was ahead of his time.

Most people have forgotten "Love Canal" by now.  But it was the first time the public became alarmed by the medical dangers of pollution.  Carter put the "Superfund" law into place.  It was the first mechanism specifically designed to clean up polluted land.  It was wholly inadequate.  Cleaning up pollution is far more difficult and expensive than doing the polluting in the first place is.  But it was a start.

The public was alarmed enough at the time to pressure Congress into passing the Superfund bill.  But, as the costs and difficulties have become apparent, a backlash has since developed.  We now have two entrenched sides on this issue, essentially the "pro" and "anti" pollution factions.  But taking pollution seriously, even if it's to oppose mitigation measures and regulations, started with Carter.

Carter was a fiscal conservative. In that, he joins many other Democratic Presidents.  He inherited a fiscal mess from his predecessor.  By modern standards it was a tiny mess.  But it was a big deal at the time.  Carter got Federal spending under control, and with it, the deficit.

But, in a pattern that should now be apparent to everyone, his Republican successor (Reagan) reversed this situation and implemented spendthrift policies.  All the GOP Presidents that followed Reagan have also been spendthrifts.  No doubt, the pattern will continue as Biden takes over from Trump.

Carter was a pioneer in another area, deregulation.  Starting with the Great Depression and FDR, various administrations had used extensive regulation to try to manage the economy to stability.  Federal agencies set prices and controlled entry to markets in many industries.  There was a broad consensus that this trend had gone too far by the time Carter entered office.  Carter decided to do something about this, and succeeded.

He started with airlines.  Prior to his Administration, they had to file for changes in rates, changes in their route structures, and much else.  Carter changed this by "deregulating" them.  Airlines became free to raise and lower rates, to add and remove routes, and much else.  It now became possible for anyone with enough money to start an airline.  The result was, in part, discount airlines like Southwest.

The result was also bankruptcy, a substantial decline in the quality of service, some markets being overserved and others underserved, or even having no service at all.  But overall, we went from a period where flying was only for the rich to a period when flying is by far the cheapest way for people of modest means to travel long distances. 

Most people think that airline deregulation has made us better off.  But the change has definitely had its plusses and minuses.  There have been "bailouts" by the Federal Government of too many airlines to count.  The current COVID related crisis has resulted in more bailouts of the airline industry by the Federal government.

This bailout behavior should be vigorously opposed by Republicans on philosophical grounds.  But it is not.

This deregulation business started under Carter and has continued since.  Even Carter didn't confine himself to airlines.  He also deregulated trucking.  His successors have deregulated many other industries or industry segments.  The deregulation of the financial sector is widely blamed for the crash of '08.  There is now talk of reregulating.  But nothing has actually come of it yet.

Carter began the U.S. involvement with Afghanistan.  At the start of his term no one would have been able to find Afghanistan on a map.  By the end of his term that had changed completely.

Sticking with the highlights, the Russians invaded Afghanistan.  Carter saw an opportunity to "Vietnam" the Russians and he took it.  There is a great book (and movie) called Charlie Wilson's War, if you are interested in the details.  The bottom line is that under Carter the Russians did get Vietnam-ed at a tiny cost to the U.S.

The post-Carter approach to Afghanistan has been bungled badly by several Administrations.  As a result, we find ourselves investing American troops and pouring vast quantities of money into the country more than forty years later.  But again, the U.S. involvement with Afghanistan started with Carter.

Like many Democrats before him and many after, Carter took a stab at healthcare.  Like all but Johnson and Obama, he failed to make significant progress.  Where Carter did have more success was in the area of education.  The Department of Education was created at his insistence.

He also saved the Chrysler Corporation, then one of the "big three" U.S. car companies.  Like Obama's later auto industry effort, it was a success.  And Chrysler was bailed out at zero eventual cost to the U.S. taxpayer.  Chrysler paid all the money taxpayers loaned it back in full, and with interest.  (Obama saved the entire U.S. auto industry.  It also paid everything it was loaned back in full, and with interest.)

Before moving on to the rest of the world, specifically the parts outside of Israel and Afghanistan, let me spend some more time on the economy.  Carter was dealt a difficult hand.  The economy was in bad shape when he entered office.  Then relatively early in his term, oil prices shot up.  This second price increase happened only a few years after they had first shot up. The result was something many are no longer familiar with, inflation.

There is (or at least used to be) something called the wage-price spiral.  Workers get their wages increased.  (Remember, it was a different time.)  Businesses raise prices in order to be able to pay for the wage increases.  This, in turn, results in demands for still higher wages.  Rinse.  Repeat.  The result is that both wages and prices spiral upward.  And, at the large scale economic level, what we see is prices on everything increasing.  And that's called inflation.
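
That feedback loop is easy to see with a toy calculation.  (The 8% figure here is invented purely for illustration; it is not a number from the Carter era.)

```python
# Toy wage-price spiral: each year wages chase prices and prices
# chase wages, so both indices compound upward together.
wage = price = 100.0  # index values, base year = 100
for year in range(1, 6):
    wage *= 1.08   # workers win raises to keep up with prices
    price *= 1.08  # firms raise prices to cover the higher wages
    print(f"year {year}: wage index {wage:.1f}, price index {price:.1f}")
```

After five rounds both indices are up nearly 47%, even though nobody is any better off in real terms.  That's the trap the spiral sets.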

The economy got caught up in a wage-price spiral during Carter's term.  There is a way to deal with this, but Carter didn't go there.  Famously, there was a truckers' strike.  A number of economists at the time were saying "let them strike.  It will take some steam out of the economy and break the wage-price spiral".  Carter instead jawboned the trucking industry into settling the strike by raising wages.

And the race was on.  Wages and prices spiraled upward even faster.  Interest rates for super-safe government securities moved to above 15%.  For contrast, those same types of securities now fetch interest rates below 1%.  Then, the portion of the Federal budget devoted to paying interest on government bonds was high.  Now it is tiny.

I think President Carter made a mistake.  In the short term it was good for truckers.  Their income went up.   But in the long term everybody lost.  This is not the only time Carter came down on the side of a short term gain at the expense of a long term loss.  I will get around to another example below.

It is also not a mistake his successor made.  Reagan went to war with the unions.  Strikes ensued.  But in both the short run and the long run, the unions lost.  The back of the wage-price spiral was broken.  This was hard on the economy for a year or so.  But the economy benefitted over the longer term.

Finally, it was a different time.  Back then, the U.S. economy was largely self-contained.  It has since been internationalized.  That gives companies little ability to raise prices.  They can also outsource jobs.  That means the bargaining power of workers is much diminished.  And that makes it hard to imagine a wage-price spiral either starting or continuing.

And the FED and the FED's counterparts in other countries, have all found that they can drive interest rates down as low as they want them to go.  And they keep finding reasons to want interest rates to be very low.  That means that it is hard to imagine the interest rate on government bonds rising very far. 

On to the foreign sphere.  I am going to leave the big one for last.  I'll start with another consequential move by President Carter.  He returned the Panama Canal to control by the Panamanians.  This was an extremely controversial move at the time.

There was lots of fearmongering on the "anti" side of the argument.  The Panamanians won't be able to run the canal.  (They have now been doing a fine job for decades.)  The U.S. military will suffer some severe loss.  (Nope!)  The U.S. will look weak and lose prestige.  (We looked strong and gained prestige.)  And on and on.

At the start of Carter's term the U.S. actually had a poor reputation in the third world.  Starting with Iran in '53, and moving on to countries too numerous to easily keep track of in the '50s, '60s, and early '70s, the U.S. had been fomenting revolution and pulling strings behind the scenes all over the third world.

And often we replaced a democratic institution with an authoritarian regime.  After all, we replaced a democratically elected government with a dictator in that very first Coup we instigated.  The "sins" of the democracies we ousted typically consisted of some combination of being too hard on American corporations and/or being insufficiently anti-communist.

As a result, in most of the third world the U.S. was seen by the bulk of the population as being imperialists and colonialists.  By returning the Panama Canal and by other actions, Carter reversed that perception in a short four years.  The U.S. reputation went from "bad guy" to "good guy" in what seemed like overnight.   Our reputation has never been as high since.

Central and South America, for instance, started seeing us not as "Yankee Imperialists" but as being on the side of freedom and democracy.  The number of countries that transitioned from authoritarian rule to democratic rule in the Carter era is astonishing.  No one else has come close.  In fact, in most Administrations the number of democracies goes down and the number of authoritarian states goes up.

Carter was equally successful, perhaps more successful, in Africa.  And right next to Africa is the Middle East.  He worked hard to bring peace and tranquility to the region.  His great success was the "Camp David Accords".  They brought about peace between Israel and its most powerful neighbor Egypt.  And it was a personal triumph for Carter for which he later received the Nobel Peace Prize.  He spent thirteen days personally doing shuttle diplomacy between the Israelis and the Egyptians.

President Trump has succeeded in breaking the Middle East out of the gridlock that has gripped it in the decades since Camp David.  It is too soon to tell if these actions will have a long term benefit.  But prospects look good at the moment.

And that brings me to Carter's big failure, the Iran Hostage Crisis.  I think he pretty much bungled it from start to finish.  Remember the first country the CIA was able to pull a Coup off successfully in?  And don't forget that, at the time, that Coup was seen as being so successful and so easy to pull off, that it became the model for dozens of later attempts by the CIA to replicate its results.  Yes!  That Coup.  The one that put the Shah of Iran into power.

Those particular chickens came back to roost late in Carter's term.  The Shah had gotten old and lost a step or two when it came to manipulating the levers of power.  And he had put no obvious successor in place.  That produced an opening, and religious radicals moved into that opening.  If the Shah had been on his game he would most likely have been able to handle them.  But he wasn't.

Carter's first mistake was to allow the Shah into the U.S. for much needed medical treatment.  He did this primarily at the instigation of Republicans, but they have managed to successfully dodge blame.  In any case, a sick Shah in the U.S. was opening enough for the religious radicals to successfully pull off a Coup.  It was what happened next that was critical.

The radicals stormed the U.S. Embassy and took everyone still there prisoner.  (A bunch of people had managed to get out and sneak into the Canadian Embassy.  See the book and movie Argo, if you are interested in learning more.)  Here's where Carter made his second mistake.  And he made this one all on his own.  And it was another example of doing something for short term gain that ends up having long term costs.

If I had been in charge, I would have immediately issued an ultimatum:  Release them all within 48 hours or expect fire to rain down from the sky.  I believe the Iranians would have caved and that would have been that.

But I could be wrong.  It's possible that instead a bunch of U.S. diplomats would have ended up dead.  Even if that happened it would still have been the right thing to do.  Since then, for four decades and counting, diplomats have been in danger all over the world.  And it's all because Carter failed to take a hard line.

At the time Carter used a number of excuses, which I am not going to bother listing, to dither rather than reacting forcefully.  The result is the famous standoff.  It made Ted Koppel famous.  At the time ABC had some dead time after 11:30 in the evening.  The "Kimmel" show now occupies that time slot, but at the time ABC had nothing.

ABC News executives picked a then second stringer named Ted Koppel and told him to fill a half hour every night.  His one and only topic was the hostage situation. The show was titled "America Held Hostage" for a while.  That gives you a feel for the flavor of the content.

Koppel turned out to be brilliant.  He managed to find a way to fill the time and look good doing it.  He soon became a first stringer.  He stayed a first stringer until he retired many years later.

The whole business went on for 444 days.  Carter managed to secure the release of the hostages in the end.  But it was his successor that got all the credit because the Iranians decided that was to their advantage.  But wait, there's more.

Eventually a rescue attempt was staged.  It was another first.  The whole thing was monitored in real time by satellite from the White House Situation Room.  That's now standard fare, both in the movies and in the real world.  But this was the first time it was actually done.

The mission was a fiasco.  The primary group responsible was the CIA.  They refused to share weather data with the Pentagon.  A dust storm, not an uncommon phenomenon, came up and wreaked havoc with the helicopters used.  The only reason it came as close to success as it did was due to Carter's interventions.

But the CIA and the rest of the military were let off the hook by Carter.  If I had been in charge, I would have fired the Chairman of the Joint Chiefs of Staff.  I would then have put the second in line in charge of a witch hunt.  "Find me some people to blame then fire them" would have been his marching orders.  But Carter chose to do nothing while others maneuvered for political advantage.

Then there's the fact that he spent the last hundred days of the campaign holed up in the White House obsessing over the hostage crisis.  But, as I have indicated above, not taking effective action with respect to it.  The polls said the race was close up to about two weeks before the election.  Then polling stopped and a big shift in sentiment was missed.  (Sound familiar?)  Reagan won in a landslide, but I think Carter could have beaten him if he had campaigned vigorously and smartly.  But he didn't, and that's how he came to be a one term President.

As we all now know, Carter has been a superior ex-President.  While being President was a poor fit for his abilities, the role of ex-President has turned out to be a perfect match.  He went back to Georgia, where he set up the Carter Library and the Carter Center.  Both are the kinds of small, focused operations he is well suited to lead.

And, while he didn't network with the D.C. establishment, and he especially didn't network with Congress, it turns out that he did network with heads of state and other world leaders while he was in office.  He has since leveraged those connections into playing a powerful and positive role on the world stage.

And he continues to not shy away from hard problems.  He has attacked Guinea Worm, a devastating disease in the third world.  He has set out to eradicate it completely.  He has yet to succeed, but he has made tremendous progress.

The Carter Center has also become the "go to" organization when it comes to monitoring elections.  It is seen as competent, fair, and impartial.  If the Carter Center says that an election has been run fairly and well, it gets believed.

An area where he has seen less success is the field of diplomacy.  He has offered to be an informal spokesman and troubleshooter for the U.S. in sticky situations.  But, for one reason or another, Administrations don't trust him to stay on the reservation.  So they have made use of him in far fewer situations than he would prefer.

It is important to recognize that all the actions he has taken since he returned to private life have reflected well on the U.S.  The same cannot be said for many of the moves made by many of the Administrations that have followed him.  That's not a bad reputation to have to carry around.

Friday, December 25, 2020

Gravity Waves - An Update

This is an update to my October 2017 post on the subject.  Here's the link:  Sigma 5: Gravity Waves.  It's been more than three years.  Surely, something has changed.  Indeed it has.  But before proceeding both backward and forward, let me review the results I reported in my earlier post.

These results were produced by a "gravity wave observatory" called LIGO.  For more than a decade LIGO had nothing to report.  The reason was simple.  Its detector was just not sensitive enough.  But it went through several generations of upgrades.  The last one (Advanced LIGO) did the trick.  The data collection run, tagged "O1" for "Observation run 1", ran from September of 2015 to January of 2016.  It produced three events.  Each event was caused by two large black holes spiraling together till they merged.

The observatory was then shut down for minor upgrades.  At completion, the O2 run took place.  It ran from December 2016 to August 2017.  O1 and O2 together eventually resulted in 11 events being detected.  When I wrote my post five of them had been reported on.  Since then, another round of upgrades has been installed.  Upon completion, the O3 run was started.  It had to be shut down in the middle so it was informally broken into the O3a run (April 2019 to September 2019) and the O3b run (November 2019 to March 2020).

Altogether, 56 detection events have been identified.  And a third observatory (LIGO is actually two observatories, one located in Washington State, and the other in Louisiana) has been brought online.  VIRGO is slightly smaller than the two observatories that combine to make up LIGO, but is sensitive enough to detect many of the same events that LIGO can.  With three observatories measuring the same event, its location can be narrowed down to a much smaller slice of the sky.  And, in general, more information about the event can be collected.

LIGO is currently shut down so that still more upgrades can be installed.  The O4 run is currently slated to start in June of 2022.  And VIRGO is also getting upgraded.  And other facilities will be coming online soon.  They are scattered all over the globe.  There are even plans for "LIGO in space", a LIGO-like instrument that would be bigger than is practical for any earth-based observatory.  Once those first observations proved that it was possible to detect gravity waves, funding ramped up dramatically.

But that's enough of the present and the future for the moment.  Let's go to the past.  And let's do it by asking a simple question:  what's the speed of light?

It has been possible to make observations spanning distances like ten or twenty miles for millennia.  And it has long been obvious that sound travels at a finite speed.  You could observe an action and then note a delay measured in seconds before the sound associated with that action reached you.  That made it obvious that, if light was not instantaneous, then it at least traveled much faster than sound.

Reasonably accurate measurements of the speed of sound were successfully made hundreds of years ago.  We have had a very accurate estimate of the speed of sound for perhaps two hundred years.  And scientists were able to establish that sound worked by oscillating something.  Normally, this was air.  But it could be water.  And the speed of sound through water was higher than that of air.  And sound couldn't travel through a vacuum at all.  So, scientists have long had a good idea of how sound worked.

And the obvious thing to do was to apply what they knew about sound waves to light waves.  If the analogy held then light should have a propagation speed.  But what was it?  "Fast" just doesn't tell us much.  Efforts to determine its speed date back to at least 1629.  Experiments with cannons and the like determined that it was too fast to measure using standard methods.

That led to an effort based on astronomy.  This effort produced the first measurement that was at least in the ballpark.  Romer in 1676 made detailed observations of the orbits of the moons of Jupiter.  He calculated that in order for the observations to make sense it must take about 22 minutes for light to cross from one side of the Earth's orbit around the Sun to the other.  That would have been great if astronomers of the day knew exactly how big that orbit was.  They didn't.  The best guess put the speed of light at about 140,000 miles per second.

What was important about this is that it told scientists just how small the time intervals were that they would need to be able to measure.  Say they wanted to measure the propagation time of light over a distance of 10 miles.  At 140,000 miles per second, light would take 0.00007 seconds to traverse that distance.  A stop watch just wasn't going to cut it.
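For anyone who wants to see the arithmetic, here is a quick sketch in Python.  Note that the orbital diameter used is the modern value, which astronomers of 1676 did not have; the 22 minutes is Romer's figure:

```python
# A sketch of Romer-style arithmetic.  The orbital diameter below is the
# modern value, which astronomers of 1676 did not have; the 22 minutes is
# Romer's estimate of light's crossing time.

orbit_diameter_miles = 186_000_000  # modern diameter of Earth's orbit
crossing_time_s = 22 * 60           # Romer's light-crossing estimate

speed_estimate = orbit_diameter_miles / crossing_time_s
print(f"{speed_estimate:,.0f} miles per second")  # 140,909 miles per second

# The timing problem: light covering 10 miles at roughly 140,000 miles/second.
traversal_s = 10 / 140_000
print(f"{traversal_s:.5f} seconds")  # 0.00007 seconds
```

Plugging the modern orbit into Romer's timing lands remarkably close to the true value; his actual number was off mainly because the size of the orbit was poorly known.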

An early scheme depended on a rapidly rotating wheel with teeth on it.  Taking light as particles for the moment to make the explanation simple, arrange for particles of light to be shot past the wheel and along to a distant mirror.  There they are reflected back, again past the wheel and on to a detector.  If things are arranged such that the light has to travel through the part of the wheel where the teeth are, then it will be blocked when a tooth is in the way but can pass freely when it hits a gap.

Now, spin the wheel at high speed.  If the wheel is spinning at just the right speed then a particle of light can pass through one gap between teeth on the wheel, bounce off the distant mirror, and then return just in time to pass through the adjacent gap.  This setup allows time to be sliced into very small intervals very accurately.  Simple calculations suffice.  And a different sized disk or a different rotation speed can be used to fine tune the interval to whatever is necessary.

This admittedly inaccurate explanation gives you the idea.  The point is that with the proper equipment built along these lines, things can be arranged so that the light passes through one gap going and a different gap coming back.  A wide range of time delays can be accommodated.  It is simply a matter of dialing the setup in.  Once the right combination of rotation speed and disk/tooth size is found, it is a small step to translate the settings into the speed of light they represent.
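The "small step" of translating settings into a speed looks like this.  The numbers below are hypothetical, chosen so the arithmetic lands near the true speed of light; they are not the parameters of Fizeau's (or anyone's) actual apparatus:

```python
# Sketch of the toothed-wheel calculation, using the geometry described
# above: the light goes out through one gap and returns through the
# adjacent gap.  All three input values are hypothetical.

distance_m = 10_000   # one-way distance from wheel to mirror (hypothetical)
teeth = 600           # number of teeth (and gaps) on the wheel (hypothetical)
rev_per_s = 25        # rotation speed at which the returning light passes
                      # cleanly through the adjacent gap (hypothetical)

# During the round trip the wheel advances by exactly one tooth-plus-gap,
# which is 1/teeth of a revolution.
round_trip_time_s = 1 / (teeth * rev_per_s)
speed_of_light = 2 * distance_m / round_trip_time_s
print(f"{speed_of_light:,.0f} m/s")  # 300,000,000 m/s
```

Notice how the wheel turns an easy-to-set quantity (rotation speed) into an impossibly small time interval (here, about a fifteen-thousandth of a second).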

And, as a result, Foucault was able to come up with a speed of 298,000 km/s in 1862.  This is very close to the modern value of just under 300,000 km/s.  Others improved his setup and came up with similar values.  By 1887 Michelson and Morley were confident that they could measure the speed of light very accurately.

Measuring the speed of light very accurately was of secondary importance to them.  Their primary interest was in learning how fast, and in what direction, the Earth was moving.  To do that they needed to very accurately measure the speed of light.

In normal circumstances sound travels through air.  Air is the "medium of transmission", the thing that sound vibrates in order to move.  But what was the medium of transmission of light?  It had to be something, didn't it?

And there were all those things that weren't the medium of transmission.  After all, unlike sound, light can easily travel through a vacuum.  And since a vacuum is, by definition nothing, all the usual suspects get immediately eliminated.  So, scientists posited the existence of something called the "luminiferous aether".

Assuming something exists just because its existence is convenient is not good enough for scientists.  They need actual proof that it does exist.  And a good place to start is by trying to measure its properties.  And the fundamental property that aether had was its ability to transmit light.

And it was assumed that, everything else being equal, the propagation speed of light in aether was constant.  That was true for sound and air.

If you kept air moving at a constant speed and kept its temperature constant, and so on, then the speed of sound through it was constant.  Conversely, you could determine some of the attributes of air by measuring the speed of sound through it.  For instance, fast moving air would result in a different measured speed of sound than slow moving air.

So, assuming the parallel held, the speed and direction the aether was moving could be inferred from a careful measurement of the speed of light in various directions and at various times.  And it was assumed that the aether didn't move.  The Earth moved through the aether.  So, differences in the speed of light led to different speeds for the aether, which in turn led to a measurement of the speed of the Earth through the aether.

All this was speculation piled upon speculation and scientists knew it.  But the measurements were expected to turn up something, even if it wasn't exactly what "aether theory" predicted.  And repeated measurements should lead to some ideas about aether theory being discarded and other ideas being confirmed.  That was all par for the course.  But what everybody agreed on was that careful measurements would turn up differences in the measured speed of light.

After all, it was known that the Earth traveled around the Sun at relatively high speed.  And the direction of travel changed with the season.  That amount of change alone should have been enough to change the speed of light by a measurable amount.  The apparatus had been designed to easily and unambiguously detect changes of this magnitude.  If other changes turned up as the measurement process progressed, that would just be a bonus.
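The rough size of the effect they were hunting for is easy to estimate.  This is a back-of-the-envelope sketch, not the actual Michelson-Morley analysis: the Earth orbits at about 30 km/s, and their two-way interferometer comparison was sensitive to the second-order effect, roughly (v/c) squared:

```python
# Back-of-the-envelope size of the expected aether effect.
v = 30_000          # Earth's orbital speed, m/s
c = 300_000_000     # speed of light, m/s (rounded)

# Naively, measuring with vs. against the aether "wind" shifts the apparent
# speed by about v/c.  The two-way interferometer comparison is sensitive
# to the much smaller second-order effect, about (v/c) squared.
print(f"{v / c:.0e}")         # 1e-04 -- one part in ten thousand
print(f"{(v / c) ** 2:.0e}")  # 1e-08 -- one part in a hundred million
```

Tiny, but the apparatus was designed to resolve effects of exactly that size, which is why the null result was so shocking.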

The problem is that the Michelson-Morley experiment turned up a result that no one expected.  And this "unexpected result" phenomenon pops up in Science all the time.  It is a normal part of science.  Scientists expect it to happen regularly.  They just don't know when it will happen and when it won't.  And what this means is that all those "scientists reject my belief, not because it is wrong, but because it doesn't fit in with what they already believe" arguments are nonsense.

If someone provides hard evidence that current scientific thinking is wrong then scientists change their thinking.  That's what scientists were forced to do because Michelson and Morley got the result they did.  No scientist liked the result they got.  But other scientists were able to reproduce the result in well conducted experiments.  So, scientists had to find a way to live with the result, which they eventually did.  Scientists reject "unscientific" beliefs, not because they are unscientific, but because they are not backed by solid evidence.

Scientists have been forced by the results of experiment to reject all kinds of sensible ideas.  They have been forced to accept ideas that were far more weird and unnatural and unbelievable than anything an outsider has thrown at them.  Why?  Because some well done experiment or observation has forced them to.  And the Michelson-Morley result was one of many instances of this.

The Michelson-Morley result eventually led Einstein to publish his Special Relativity theory in 1905.  Their result had dealt a near-fatal blow to the idea that the luminiferous aether existed.  But it wasn't until Special Relativity that scientists bailed completely on it.  The theory worked.  It also did away with the need for aether to exist at all.  The real kick in the pants, however, didn't come until ten years later.  Einstein introduced General Relativity in 1915.  That's when things got really weird.

In 1905 Einstein had built Special Relativity around the idea that the speed of light is constant.  That's pretty weird.  In order to make things work the theory demanded that all these other not-light things must stretch and shrink.  There were still lots of things that remained unchanging.  But still, some things that we had thought were unchanging, changed in these predictable ways in circumstances that Einstein laid out.
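The amount of stretching and shrinking Special Relativity demands is captured by a single number, the Lorentz factor.  A minimal sketch, with illustrative speeds:

```python
import math

c = 299_792_458  # speed of light in a vacuum, m/s

def lorentz_factor(v):
    """How much moving clocks slow and lengths contract at speed v."""
    return 1 / math.sqrt(1 - (v / c) ** 2)

# Everyday speeds barely budge the factor; near-light speeds change it a lot.
print(f"{lorentz_factor(30_000):.9f}")   # Earth's orbital speed: 1.000000005
print(f"{lorentz_factor(0.9 * c):.2f}")  # 90% of light speed: 2.29
```

This is why nobody noticed the stretching and shrinking before: at any speed humans had ever experienced, the factor differs from one only far past the decimal point.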

Okay.  That's a lot to buy, but the Michelson-Morley result demanded some kind of weirdness.  And Special Relativity weirdness was pretty much the minimum amount of weirdness that would get the job done.  The problem is that General Relativity took weirdness to a whole new level.  We're now talking bat-shit-crazy weird.

You see, General Relativity requires space itself to stretch and shrink.  Space, to put it another way, is the luminiferous aether.  And it behaves in many ways like the air that sound travels through.  It's what vibrates to transmit gravity.

Newton said "objects in motion tend to continue in that motion".  Gravity works by literally warping space.  So an object thinks it is continuing to travel in a straight line.  But gravity causes space to warp, and that causes the "straight line" course of the object to bend, not because the object has changed direction, but because "straight" is no longer straight.  Like I said, bat-shit-crazy.

And this "space is wiggly" business means that there are such things as "gravity waves", instances of space wiggling because, you know, gravity.  And I think you can now understand why I, for one, was not having any of it.  I was not convinced that gravity waves even existed, even though lots of smart people whom I deeply respected believed that they did.  But the LIGO results did me in.  They were right and I was wrong.

And I have to admit that I am actually happy that I turned out to be wrong.  Because, as I observed three years ago, "[e]very time something previously invisible has become visible, tremendous discoveries have been made".  And it is important to understand that the first tremendous discovery has already been made.  We now know with absolute certainty that gravity waves exist.  That's a tremendous discovery if there ever was one.

Beyond that, we know that General Relativity computations about the characteristics of gravity waves work pretty well.  For instance, they get their strength about right.  Do they get it 100% right?  Maybe.  But we know so little about the events behind the observations at this point that we can't say with certainty.  All we know about many events comes from running LIGO data through General Relativity.  That results in, for instance, an estimated mass.  Is the estimate correct?  At this point we have no way of knowing for sure.

But even if the calculations are off by some amount they still tell us things.  Remember that first estimate for the speed of light.  It was close enough to tell us where the decimal point went.  And that was valuable information.

And we now have 56 events to go on.  The first event was scary.  Was it some kind of screw up?  Was it some kind of unusual event or was it pretty typical?  With one event it's hard to judge.  With 56 events patterns emerge.  A lot of the events are two black holes spiraling together to merge into one.  We now have some idea of how common that event is.  One of the early events was two neutron stars merging.  Scientists got very excited about that one.

There have been some events that fall outside the accepted theories for how these kinds of events are supposed to progress.  The details are complex and I don't really understand them.  But the scientists are very excited by what they are seeing.  It would be nice if everyone else was too.  But they are not.

Something the general public doesn't understand is that scientists are actually happier when the data doesn't conform to current theory.  It's just more fun and interesting to be on the hunt for a new theory to replace an old broken one.  That's as good as it gets.  It's what made Einstein famous.

Next best is to come up with a modification to an old theory.  Sometimes you don't have to throw the whole thing away.  Maybe you change parts of a theory but leave the rest alone.  If the result is that the revised version now fits all the experimental data then that is a very good result.

Changing a theory so that the new version fits the experimental data better, but still not perfectly, is progress, but not the best outcome.  It's an improvement, but it is also evidence that more work is needed.  Unlike many, scientists expect their theories to agree with all the data, not just most of it.

The scientists that feed off of the LIGO data have gone from not excited to very excited.  Before 2016 they had no data to work with.  That's not very exciting.  Now that they have data, and lots of it, to work with they are very excited.

Unfortunately, things have gone in the other direction in terms of general interest.  There was a flurry of press coverage back in 2016.  Although the first event LIGO observed happened in 2015 it wasn't announced until then.  For reasons I go into in the previous post the first event was suspicious.  No one wanted to make an announcement they would later have to take back.  So there was a long delay while things were checked and rechecked.

Fortunately, the second event came along pretty quickly.  That's when I and many others relaxed.  It was real.  And a third event followed shortly thereafter.  And VIRGO came on line.  This was enough to maintain interest until about the summer of 2017.  I wrote my post in October of 2017.  It turns out that interest by the press and by the public was already waning by then.  Press coverage since has been almost nonexistent.

But the data keeps pouring out.  The upgrade from the O1 setup to the O2 setup was modest.  But it was enough to increase the rate of event detection.  The modest upgrade that was sandwiched between the O2 and the O3 runs has also increased the rate of event detection.  LIGO will be down for a long time between the O3 and the O4 runs.  The currently scheduled starting date for the O4 run envisions a 27 month gap.  The gap is so long because the upgrade will be much more extensive.

Each upgrade increases the sensitivity.  That means that events similar in size to currently detectable events can be detected further out.  Since a "cube" law is involved, a 10% increase in sensitivity translates into a 33% increase in the volume covered.  Also, smaller events that happen within the old volume can now be detected.  The difference is not as dramatic, but it should result in still more events being detected.
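The cube-law arithmetic behind that 33% figure is easy to check:

```python
# Why a 10% sensitivity gain yields about a 33% gain in surveyed volume:
# detectable range scales with sensitivity, and volume with range cubed.

def volume_gain(sensitivity_gain):
    """Fractional increase in surveyed volume for a fractional gain in range."""
    return (1 + sensitivity_gain) ** 3 - 1

print(f"{volume_gain(0.10):.1%}")  # 33.1%
```

This is why even modest upgrades between observing runs have paid off so handsomely in the event-detection rate.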

So LIGO started out as what appeared to be a boondoggle.  For a long time it ate lots of money while producing no science.  But the project did a one-eighty in 2016 when that spectacular discovery of the first event was announced.  The discovery of the second event didn't strike the public as nearly as spectacular.  But in many ways it was more important.  It proved that the first event wasn't a one-off.

Unfortunately, the public saw little difference between the first event and the second, so they started tuning out.  And the public was treating each new event as routine long before LIGO got to the 56th one.  And routine is not newsworthy.  So, the press has been checked out since 2018.  It is possible but unlikely that O4 will produce a result that is spectacular enough to put LIGO on the front page again.

That is sad.  The quality of the science is increasing by leaps and bounds.  A big reason for this is the large pool of events, the very thing that makes the whole enterprise boring to the public.  And by producing data much more quickly than any previous run, the O4 run should make generating buzz even harder.

But more data is good for science.  Many more events means that comparisons can be made and patterns can be confirmed or disproven.  There is lots more data to use to test theories against.  Most theories will be found wanting but that's okay.  It's how science works.

The practical effect of something as exotic as gravitational waves can not be predicted.  No one knew at the time of its development that an obscure and insanely difficult physics theory called Quantum Mechanics would prove to be the foundation upon which all the integrated circuits that power all of our modern electronic devices are built.

Some theoretical work, and at this point LIGO is all about the theoretical, never seems to lead to anything practical.  But time after time, something wildly theoretical and of no apparent practical use, ends up allowing us to go from "why are people we don't care about and who live in an obscure corner in China getting sick?" to "we are now making life saving vaccines out of something that the public has never heard of called 'mRNA'."  And these mRNA vaccines are so powerful that they can stop a deadly world wide pandemic in its tracks.  And only a year separates these two events.

Saturday, December 12, 2020

60 Years of Science - Part 23

This post is the last in a series that dates back several years.  In fact, it's been going on for long enough that several posts ago I decided to upgrade from "50 Years of Science" to "60 Years of Science".  And, if we group them together, this is the twenty-third main entry in the series.  You can go to Sigma 5: 50 Years of Science - Links for a post that contains links for all the entries in the series.  I will update that post to include a link to this entry as soon as I have posted it.

I take Isaac Asimov's book The Intelligent Man's Guide to the Physical Sciences as my baseline for the state of science when he wrote the book (1959 - 60).  In this post I will review the section titled  "Fusion Power".  This is the last section of the last chapter in the book.  But there is an appendix.  So I will finish up by taking a look at what's in it. Since there is no more to the book, there is no reason to continue the series.  To work.

"Fusion Power" addresses a potential that has remained unfulfilled to this day.  Nuclear fission potentially provides access to such large amounts of power as to be almost unimaginable.  This potential has been turned into reality in the form of fission powered electric power plants that supply a substantial portion of the electricity we consume.

For all their problems, and in spite of the fact that they have not lived up to the potential Asimov and many others saw back in 1960, facilities of this type actually exist.  And they actually produce large quantities of electric power on a routine basis.

The same can not be said for fusion based electric power production.  But before we go into how this sorry state of affairs has come to be, let's review what Asimov had to say on the subject.  He starts out by noting that at the time of the book, physicists had been dreaming of harnessing nuclear fusion for twenty years.  Why the interest?    Because fusion is the process that powers our Sun.

The Sun is at roughly the midpoint of the time it will spend as a type of star called a Yellow Dwarf.  At some time in the future it will go through a series of metamorphoses that will turn it into a type of star called a White Dwarf.  That doesn't sound so bad, but for us it is.  A White Dwarf is tiny.  And it only puts out a tiny fraction of the heat and light that a Yellow Dwarf star like our Sun produces.

Even so, thanks to fusion, the Sun has been able to continuously produce massive quantities of energy for billions of years.  And it will continue to be able to do so for several billion years more.  Is there any better argument for the potential represented by fusion power?

Asimov correctly concludes that "[i]f somehow we could reproduce such reactions on the earth under control, all our energy problems would be solved."  The "under control" part is important.  At that time we already knew how to build a large "H" bomb.  It used fusion to create an amount of energy that was measured in megatons.  That's far too much of a good thing.

It's not the inefficiency of fossil fuel burning that is the problem. It is the side effects, the greenhouse gasses, etc.  Other non-nuclear options have problems that I have listed elsewhere.  Fission, the other "nuclear" option, has turned out to have problems that I have also addressed elsewhere.  But, assuming it could be controlled, and assuming little or no radioactivity would be generated, a reasonable assumption, then fusion based power generation would be a wonderful thing.

Asimov opines that fusion power would produce no radioactive waste.  This is actually an open question.  Some designs produce no radioactive waste.  Others do.  But even the designs that do produce radioactive waste look like they would produce far less radioactive waste than a fission based power plant.  He also notes that pound-for-pound fusion produces 5-10 times more power than fission.  So what's the hold-up?

He postulates the development of a fusion reactor based on Deuterium.  It is far rarer than regular Hydrogen.  But, as he notes, traces can be found in regular ocean water.  If efficient extraction processes can be found or developed then the fuel supply becomes effectively unlimited.

Deuterium has long been a subject of interest to nuclear physicists.  It is much easier to induce it to fuse.  The easy way to think of the problem is in terms of temperature.  Deuterium requires super-high temperatures to induce it to fuse.  But regular Hydrogen requires ultra-high temperatures, temperatures far higher than Deuterium requires.  Both regular Hydrogen and Deuterium are non-radioactive.  Putting it all together, Deuterium seems like the smart way to go.

Asimov then goes on to practical considerations.  With fission, physicists already had a starting point when it came to figuring out how to control it.  The "nuclear pile" they had built while figuring out how to build an "A" (atomic - fission) bomb provided a working example of a small, controlled, fission environment.  The problem is that there is no pile-equivalent that was developed along the way to the creation of a successful "H" (Hydrogen - fusion) bomb.

All "H" bomb designs use an "A" bomb as the mechanism necessary to initiate the fusion reaction.  No one ever figured out a half-measure way to get the job done.  So the developers of a fusion based power plant had to start from scratch.

Asimov whined about a lack of effort when it came to fusion reactor design.  This might have been true at the time.  But the problem has since received a large and persistent amount of attention.  Asimov lays out the two big problems.

The first problem is achieving super-high temperatures.  He estimated that reaching 350 million degrees would be necessary.  That's no problem in the vicinity of an exploding "A" bomb.  But we want to be able to do it in a relatively normal building that is situated relatively close to homes and businesses.  His estimate turned out to be way too high.  But millions of degrees are certainly necessary.

The second problem is holding everything together long enough to extract the power and turn it into electricity.  An "H" bomb literally blows itself apart in a small fraction of a second.  A practical fusion power plant must be able to produce power steadily for minutes, hours, days, weeks, even years.  Asimov tackles the first problem first.

His suggestion is a magnetic bottle.  A donut shaped cavity is evacuated.  Deuterium is inserted and an extremely strong magnetic field is applied.  For reasons I am not going to get into, this mechanism can heat the Deuterium to extremely high temperatures.  This causes it to turn into a plasma.  I am also going to skip most of the differences between gases and plasmas.

I am also going to mostly skip over the fact that we took a vacuum and then added a gas.  Doesn't that ruin the vacuum?  It doesn't if only a small amount of gas is inserted.  And it turns out that a plasma acts effectively as if it is a series of wires.  So we can use magnetic fields to "pump" lots of energy into it from a short distance (a few feet) away.  That heats it up.

The fact that we are doing this in a vacuum means that, if we are clever enough, we can keep the super-hot plasma from ever touching the cold walls of the donut.  This means that the plasma can stay hot, as it is not transferring any energy from itself to the cold walls.  Conversely, the walls can be kept at something like normal temperatures because they don't make contact with the super-hot plasma.

And it turns out that all of this works.  But only to a certain extent.  No one has been able to get a device to heat a plasma to a sufficiently high temperature and then keep it there for any length of time.  One unexpected problem turns out to be plasma instability.  The plasma starts forming waves.  And those waves keep getting bigger and bigger.  They quickly get big enough to mess everything up.  The temperature crashes, or something else goes wrong, and the whole thing quickly "quenches".

Asimov then moves on to the second problem.  The trick here is that plasmas conduct electricity.  That means that you can steer them with magnetic fields.  This technique is called "magnetic confinement".  Scientists were having some success with magnetic confinement at the time the book was written.  They have since had much more success.  But "plasma instability", the "wave" business I discussed above, has limited the degree of success.  If the plasma instability problem could be fixed then magnetic confinement would work just fine.

Since the time of the book a Russian idea called a Tokamak has become the leading candidate for the best design.  To the untrained eye it looks pretty much like the donut I have discussed above.  But the subtle differences apparently help a lot.  Many design ideas have been tried since the '60s and failed.  The current leading candidate is called ITER.  It is an international project, based in France, that is built around a Tokamak design.

Many billions of dollars have been sunk into ITER.  It is years away from completion so we are many years away from learning how well it works.  And it is a "proof of concept" project.  If it works then a "new and improved" design will be needed.

It will be based on lessons learned from the current ITER.  This follow-on device is supposed to be the first device that can actually produce electricity.  And, if that design works but each device constructed according to that design costs ten or twenty billion dollars to build, then fusion based power production may never pencil out.

There are lots of alternatives to the ITER that some laboratory or another is tinkering with.  Funders have decided to go pretty much all in with ITER.  So all these other ideas are starved for cash and operating on a relative shoestring.  So they tend to poke along.  But, if one of them happens to produce spectacular results then it may displace the ITER/Tokamak design as the front runner.  Don't hold your breath.

In Asimov's time, people were pretty optimistic that fusion power could be pulled off.  But that was sixty years ago.  Since then a lot of designs have come and gone.  And many billions of dollars have gone.  And we are still a very long way from a practical and cost effective device.  Or even one that works at all.

That's where the main part of Asimov's book ends.  So, let's finish up by looking at the appendix.  It is titled "The Mathematics of Science".  It is divided into two sections, "Gravitation", and "Relativity".  Asimov confined himself to a little simple arithmetic for the main part of the book.  Here, he relaxes that restriction somewhat.

You can understand Galileo's take on gravitation by moving on from basic arithmetic to High School algebra.  But before Asimov dives into that he steps back to make a few general observations.

He credits Galileo for the transition from a "qualitative" approach, just describing what's going on in sufficient detail for someone else to be able to recognize it, to a "quantitative" one.  In this latter approach it is important to also be able to measure things with as much precision as can be managed.

That is much easier to do now than it was then, Asimov notes.  Take time.  There were no clocks capable of more accuracy than a sun dial available to Galileo.  He started out timing things by counting his pulse.  But keeping a pulse steady is almost impossible.  Galileo knew that, so he tried to compensate by devising various water clocks.  I am going to skip over the design details but note that none of his designs was completely successful.

And he had no way to accurately measure very short time periods.  Here, he came up with a trick that worked very well.  Instead of dropping something he rolled it down a ramp.  The shallower the ramp the longer it took the object to roll down the length of the ramp.  This, in effect, slowed things down enough that he could measure things accurately with the tools at hand.

And what he found was that a ball rolling along a flat track maintained a roughly constant speed.  He attributed the minor amount of slowing to friction and decided that, in the absence of friction, the speed would be constant.  This can be represented by the simple algebraic equation "s=k".  "s" is the speed of the object and "k" is some constant that depends on circumstances.  We have now dipped our toes into algebra.

This observation later became the foundation of "Newton's first law of motion":  absent outside forces, velocity stays constant.  Newton then generalized what Galileo had done to cover steadily changing motion, resulting in "v=kt".  Here "v" is a more complicated concept than "s".  "v" (velocity) incorporates the concept of speed but it also incorporates the concept of direction.  So any change in speed, or direction, or both, means that "v", the velocity, has changed.  "k" is our old friend a constant, and "t" is time, that thing Galileo couldn't measure very accurately.

Newton postulated that, when it came to gravity, velocity would change at a constant rate as time passed.  He further postulated that there was something called a "gravitational constant".  So, when applied to velocity in a gravitational field the equation became "v=gt", where "v" and "t" are as before, but "g" is a gravitational constant.  This is still pretty simple algebra, but it is more complex than where we started.

It turns out that the value of "g" depends on some things.  But in a lot of circumstances "g" is a specific value that doesn't change.  And, it turns out that there is a "G", that really is constant.  You mathematically combine "G" with some other things and you can calculate the value of "g" for a specific circumstance.  Asimov goes into this in some detail, but I am going to skip over it.
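To make this a little more concrete, here is a quick sketch of my own (not Asimov's) showing how "G" combines with some other things to produce "g".  The formula is the standard g = GM/r-squared, and the constants are published values I am plugging in purely for illustration:

```python
# Hypothetical illustration: computing little-g from big-G.
# The constants are standard published values, assumed here.
G = 6.674e-11        # gravitational constant, m^3 / (kg s^2)
M_earth = 5.972e24   # mass of the Earth, kg
r_earth = 6.371e6    # mean radius of the Earth, m

# "Combine G with some other things" -- the mass and radius of the Earth.
g = G * M_earth / r_earth**2
print(round(g, 2))   # close to the familiar 9.81 m/s^2
```

Change the mass or the radius (say, to the Moon's values) and you get a different "g" for that circumstance, which is exactly the point.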

I will note that he ends up discussing "sine" (shortened to "sin" in many contexts) a "trigonometry function".  Trigonometry ups the ante when it comes to mathematics by quite a bit.  Trigonometry is normally studied in High School.  But only "math track" students are exposed to it.  Any serious study of the physical sciences involves a knowledge of trigonometry and the ability to use its associated functions.

Digging deeper into Galileo brings us to an equation I don't know how to accurately reproduce in a blog post.  An unusual formulation that is accurate is "d=10tt".  Now "d" is distance and "tt" just indicates "t" (our usual time) multiplied by itself.  This is usually indicated with a single "t" to which a small superscript "2" is attached, meaning that two "t"s should be multiplied together and that value used.  But I don't know how to get the blog software to do the superscript thing.  Anyhow, raising numbers to powers (multiplying them by themselves multiple times) is another increase in the mathematical degree of difficulty.
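In code, by the way, the superscript problem goes away.  Here is a little sketch of mine (not Asimov's) of the same law written out in Python, using the standard metric form d = (1/2) g t-squared rather than the formulation above:

```python
# A sketch of Galileo's distance law in metric units: d = (1/2) * g * t**2.
# In code, t**2 means "t squared" -- no superscript needed.
g = 9.81  # m/s^2, standard value assumed for illustration

def fall_distance(t):
    """Distance in meters fallen from rest after t seconds, ignoring air resistance."""
    return 0.5 * g * t**2

for t in (1, 2, 3):
    print(t, fall_distance(t))
```

Note that doubling the time quadruples the distance, which is what the "t times t" is telling you.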

Asimov now completely abandons Galileo to focus on Newton.  He starts in territory that requires only High School mathematics.  "A=4 pi r r" (spaces added to improve readability) is such an equation.  Here "A" is the surface area of a sphere, "pi" stands in for the common symbol for the ratio between the circumference of a circle and its diameter (another symbol I don't know how to get my blog software to spit out), and "r", which must be squared, is the radius of the sphere.

A more interesting equation associated with Newton is "f=ma".  "f" is force, "m" is mass, and "a" is acceleration.  But why "m" and not "w" for weight?  Because weight is the result of a gravitational field.  As the strength of the field changes, the weight changes.

Newton wanted something that was gravity independent.  If you know the mass and the details of the gravitational field you can calculate weight.  If you know weight and the details of the gravitational field you can calculate mass.  Interestingly, if you know weight and mass, you can calculate the strength of the gravitational field.
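Here is a toy sketch of my own (not from the book) showing all three of those calculations falling out of "f=ma", with the assumed "g" values standing in for the details of the gravitational field:

```python
# A toy illustration of f = m*a applied to weight.  The same mass has
# different weights under different gravitational fields (values assumed).
mass = 10.0       # kg -- gravity independent
g_earth = 9.81    # m/s^2
g_moon = 1.62     # m/s^2, published approximate value

# Mass plus field strength gives weight (f = m*a with a = g).
weight_earth = mass * g_earth
weight_moon = mass * g_moon

# Weight plus mass gives back the field strength.
g_recovered = weight_moon / mass

print(weight_earth, weight_moon, g_recovered)
```

Any two of the three quantities determine the third, which is the whole trick.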

Finally, there are some situations where gravity is not involved but it is useful to know mass.  This led to an interesting question.  We measure weight when we put something on a scale.  What the scale actually measures is force.  Using the formulas discussed above we can translate that into mass.  Specifically, we can calculate the "gravitational mass" of an object in this way.

But the "f" in "f=ma" doesn't need to be a force associated with gravity.  If we know the "f" and the "a" we can calculate the "m".  In many situations not involving gravity, what we are calculating is called "inertial mass".  Einstein asked the question, "is the inertial mass of an object always the same as its gravitational mass"? 

It turns out that there is no effective difference if "im / gm = k".  In other words, if the inertial mass ("im") of an object divided by the gravitational mass ("gm") of the same object always yields the same constant then the two are indistinguishable, so we might as well assume that they are the same thing.

Scientists, including Einstein, have looked for instances where the "im / gm" ratio varies.  So far they haven't found any.  So, until proven otherwise, scientists assume that "im" equals "gm".  If you can find an instance when "im" does not equal "gm", it's a safe bet that there will be a Nobel Prize in your future.

Asimov doesn't move on to the next obvious topic.  High School math is adequate to cover what is called "statics", situations where everything is static, i.e. unchanging.  But what about "dynamics", situations where things are changing?  For that you need calculus.  Newton wanted to study dynamic situations, celestial bodies orbiting other celestial bodies, objects falling in a gravitational field of varying intensity, things like that.

He literally had to invent calculus in order to perform the computations and analysis he was interested in.  The calculus he invented was limited.  As soon as he had developed as much of it as he needed to be able to answer the questions he was interested in, he stopped working on calculus and moved on to other things.

Fortunately for us, a German named Leibniz developed calculus at the same time.  His version did not suffer from the limitations that Newton's did.  He was a mathematician, so he kept adding improvements and extensions for as long as he could.  In the end his version covered a lot more mathematical territory.

Engineers often use the Newtonian version because it is simpler and well suited to many of the problems they routinely encounter.  Everybody else uses the Leibniz version.  And it has long since been demonstrated that, in the areas where they overlap, the two are completely equivalent.

On to "Relativity".  Relativity consists of "Special Relativity", the version Einstein developed in 1905, and "General Relativity", the more complicated but more all encompassing version he developed in 1915.

But before going there Asimov spends a lot of time on the Michelson-Morley experiment.  This was an experiment done in 1887 that attempted to measure the direction and speed (i.e. velocity) of the Earth as it travelled through space.  The experiment depended critically on the speed of light being a constant.  At the time no one could imagine things being otherwise.

The calculations that would turn the resulting measurements into the velocity of the Earth involve some fairly complex algebra.  But they were well within the capability of a High School student who had completed the "math" track.  I am going to skip over them.  For one thing, they are complicated.  For another, we don't need them.  The experiment failed.  The result said that the Earth was not moving through space at all.

We know now, and they knew then, that "it moves", to quote Galileo on the subject.  If nothing else, the Earth circles the Sun once per year at a distance averaging 93 million miles.  The necessary "orbital velocity" is easily calculated.  That speed was far larger than the smallest motion the experiment was sensitive enough to detect.
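Just for fun, here is that back-of-the-envelope calculation in Python (the round numbers are my own assumptions, not figures from the book):

```python
import math

# Rough orbital velocity of the Earth: circumference of the orbit
# divided by the length of a year.  Round numbers assumed throughout.
r = 93_000_000           # miles, average Earth-Sun distance
T = 365.25 * 24          # hours in a year

circumference = 2 * math.pi * r
v = circumference / T    # miles per hour
print(round(v))          # in the neighborhood of 66,000-67,000 MPH
```

That works out to roughly 18 or 19 miles per second, which is an enormous speed by everyday standards.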

The "it's not moving" result was shocking.  So lots of people tried unsuccessfully to find a flaw in the experiment's design.  And others reproduced the experiment and got the same result.  Einstein was the first one who was willing to say "what's going on here is that the speed of light really is constant for every observer, and it is space and time that are variable".

In fact, Special Relativity follows directly from the idea that "no matter how you measure the speed of light, and no matter what circumstances you measure it in, as long as there's no acceleration involved, you will always get exactly the same answer".

Fitzgerald had already done some of the work.  He came up with a formula that calculated exactly how much things needed to "contract" to keep the measured speed of light constant.  Fitzgerald's equation included "c" the speed of light.  So the degree of contraction could be related to the number you got when you divided the speed of the object by "c", the speed of light.  Now, as a speed, "c" is very large.  It's conventionally quoted as 186,000 miles per second.

A fast car might go 100 or 200 or even 300 MPH.  That's a tiny fraction of "c".  A commercial airplane flies at just over 500 MPH. A high performance plane might go 2,000 MPH.  Both are only going a tiny fraction of "c".  The speed of pretty much anything we encounter in our day to day experience always amounts to a tiny fraction of "c".  Even a rocket going 20,000 MPH, what we would normally think of as being super-fast, is still crawling along when measured against "c".

Fitzgerald's equation said that for anything going a small percentage of the speed of light, the amount of contraction taking place would be minuscule.  Even if you were going 100,000 miles per second, roughly half the speed of light, the effect would be relatively modest.  You had to be going at 90% or 95% or 99% of "c" to see really large effects.  And if you could get to 99.9% or 99.99% then some really strange things would happen.
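Here is a little Python sketch of my own (not Asimov's) of the contraction factor, which works out to the square root of one minus the speed-ratio squared:

```python
import math

# The Fitzgerald (Lorentz) contraction factor: sqrt(1 - v^2/c^2).
# Lengths shrink by this factor; at everyday speeds it is essentially 1.
c = 186_000  # speed of light, miles per second (conventional round value)

def contraction(v):
    """Contraction factor for an object moving at v miles per second."""
    return math.sqrt(1 - (v / c) ** 2)

print(contraction(0.1))        # a very fast car: indistinguishable from 1
print(contraction(100_000))    # roughly half of c: about 0.84 -- modest
print(contraction(0.999 * c))  # 99.9% of c: about 0.045 -- strange things
```

Notice that plugging in v equal to "c" exactly drives the factor to zero, which previews the next point.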

The fact that the effect was infinitesimal at "normal" speeds was why nobody noticed it, Einstein argued.  Also, note what happens if something is traveling at exactly "c".  Everything either goes to zero or infinity.  This is the basis of the statement that you can't reach the speed of light no matter how hard you try.

There is mathematics that says what might happen if you could find a way to go faster than "c".  But, if you translate the results into the real world, you get complete nonsense.  And, before you ask, if you succeeded, the things that would happen would instantly render you dead.  (They would also destroy any instrumentation or machinery, so whatever else you might want to try wouldn't work either.)

Special Relativity works the same as Newtonian Mechanics, in terms of the math required.  You can solve static problems using High School algebra.  You need calculus to solve dynamic problems.  But calculus is all you need.

Asimov does not discuss General Relativity, the version that can handle things when accelerations are involved.  There is a good reason for this from a mathematical perspective.  It took Einstein a decade to go from Special to General Relativity.  He spent several years trying to get anywhere at all.

He finally came up with a couple of key ideas.  But he quickly realized that, if he was going to handle these ideas quantitatively,  he would need to learn a type of mathematics called Tensor Calculus.  He had to devote the best part of two years to doing this.  Fortunately for Einstein, Tensor Calculus had already been invented.  He didn't have to invent it.  He just had to learn how to do it.

All I'm going to say about Tensor Calculus is that it is way harder than regular calculus.  Imagine that you are barely scraping by in High School algebra.  If that's an accurate measure of your mathematical ability then imagine how hard it would be to learn regular calculus.  That comparison gives you a feel for the difference in difficulty that lies between regular calculus and Tensor Calculus.

But the good news is that once Einstein had mastered Tensor Calculus, he succeeded in formulating his ideas in terms of Tensor Calculus, and then using it to compute results.  And, he showed that his results matched reality.  He was able to show that General Relativity provided the solution to several puzzles that had been bedeviling Astronomers.

He even famously made a prediction involving a star appearing to move when the light from that star came close to grazing the surface of the Sun.  The star didn't move, but the "gravitational lensing" caused by the gravitational field surrounding the Sun caused the path that light from the star took to bend on its way to Earth.  And that made it appear that the star had moved.

And with that, we're done.



Saturday, November 21, 2020

60 Years of Science - Part 22

This post is the next in a series that dates back several years.  In fact, it's been going on for long enough that several posts ago I decided to upgrade from "50 Years of Science" to "60 Years of Science".  And, if we group them together, this is the twenty-second main entry in the series.  You can go to https://sigma5.blogspot.com/2017/04/50-years-of-science-links.html for a post that contains links to all the entries in the series.  I will update that post to include a link to this entry as soon as I have posted it.

I take Isaac Asimov's book The Intelligent Man's Guide to the Physical Sciences as my baseline for the state of science when he wrote the book (1959 - 60).  In this post I will review two sections, "Nuclear Power" and "Radioactivity".  Both are from the chapter "The Reactor".  This is the last chapter in the book.  So, the end of this series is nigh.

The book was written in the middle of the Cold War.  Then, MAD (Mutual Assured Destruction), the ability of either the U.S. or the Soviet Union to start a nuclear conflagration that would literally bomb both countries "back to the stone age", was in the forefront of people's minds.  But nothing happened.  The Cold War ended peacefully with the breakup of the Soviet Empire.

And various crises have since come and gone.  And wars have come and gone or, in some cases, lingered for what seems like forever.  And countries as stable as the United Kingdom and as fringe as North Korea have gotten "the bomb".  In all this time no one has exploded a nuclear weapon in anger.  So, most people now spend little time thinking about them.

Things were different back then.  Nuclear weapons, and the possibility of nuclear war, were a pressing concern.  This scared the shit out of people, and legitimately so.  As a result there was a real yearning for an alternative, an "atoms for peace" program of one sort or another.

But Asimov starts a little earlier.  He notes that there was a legitimate race to be the first to develop an Atom Bomb.  The Nazis did have a legitimate program that Hitler hoped would produce a bomb he could use.  And no one doubted that he would use it if he had it.

It turns out that we now know that they never even got close.  But that became clear only after the War was over.  In the meantime, this legitimate concern was part of the justification for moving forward rapidly with a U.S. program, a program that eventually succeeded.  (Many other countries helped.  Principal among them was the U.K.  But the U.S. provided all of the money and most of the resources.)

Against the background that, then and now, the U.S. is the only country to ever explode a nuclear weapon in anger, there was a yearning to balance the bad with the good.  And the most obvious good was to harness the "power of the atom", in this case nuclear fission, to produce power.  This power was first used to propel ships.  But it could also be used to produce electric power.

The push to develop these technologies was part of the broader "atoms for peace" effort.  A related program, "Project Plowshare", named for the biblical quotation about "beating swords into plowshares", explored peaceful uses for the nuclear explosives themselves.  One idea was to use atomic bombs for earthmoving.  The obvious candidate was a canal from the Atlantic to the Pacific that would be dug by exploding a series of Atomic Bombs underground.

A plan to blast out an artificial harbor in Alaska was seriously studied, and a cratering bomb was actually exploded underground in Nevada as a proof of concept.  Another possibility was to use nuclear explosives to stimulate oil and gas wells.  The reason that you haven't heard of these and other ideas is that they turned out to be far more trouble than they were worth.  They were all abandoned.  Some persisted after Asimov's book was published, but not for long.   In the end, the peaceful application that turned out to have legs was not the bomb but the nuclear reactor.

Demonstrator nuclear reactors of various kinds started popping up within a few years after the end of World War II.  But, as I have noted elsewhere, it costs a lot of money to come up with a design.  It costs even more money to turn the design into a working device.  That made commercial interests reluctant.  The U.S. Navy, on the other hand, was not reluctant.  As a result, the first nuclear reactor put to practical use was put to use powering a Navy submarine.

The thinking was that submarines are vulnerable on the surface but safer underwater.  And a power plant that required an ample supply of oxygen, as any kind of petroleum based engine does, demands considerable surface time.  Nuclear power requires no oxygen.   And, once a nuclear power plant was developed, a recent conventional submarine design was quickly reworked to make use of it.  The result was the "Nautilus", named for the submarine in Verne's 20,000 Leagues under the Sea.

It was so successful that almost all U.S. Navy subs that have been built since have been nuclear powered.  They can easily stay underwater for six months straight.  The biggest ships in the U.S. Navy's inventory were also soon adapted to nuclear power.  Since the '60s, the Navy's large Aircraft Carriers have been nuclear powered.  These ships have a large fuel budget.  But it is for the planes they carry, not the ship itself.

Efforts to use nuclear power in other ship types have failed.  A nuclear powered cargo ship was built.  It was a technical success but a practical failure.  Everything worked just as it was supposed to.  But it was barred from most seaports for political reasons.  Those same political reasons are why no other ship type has been attempted.

Most of this happened after Asimov's book was finished.  He spends some time on the Nautilus and mentions several other nuclear powered vessels.  For instance, the keel for the "Enterprise", the first nuclear powered Aircraft Carrier, had been laid down in time for that information to make it into the book.  But she had not yet entered service.

As CVN-65, she entered active service in 1961.  After over fifty years of active service, she was decommissioned in 2017.  Construction of a replacement of the same name, CVN-80, is scheduled to begin in 2022.  CVN-80 is scheduled to enter service in 2027 or 2028.

The first civilian nuclear power plant was built by the Russians in 1954.  The U.K. followed in 1956.  The U.S. joined the club in 1958.  At the time coal fired power plants were cheaper to build and cheaper to operate.  It was hoped that as nuclear power plant construction and operation moved down the learning curve, they would eventually become the cheapest option.

We now know that was never going to happen.  Outside of the Soviet sphere of influence, most designs differed little from each other.  They also differed little from the design used to power the Nautilus.  At the time Asimov wrote his book it was believed that Uranium was hard to find.  It turned out that there was a learning curve when it came to finding Uranium.

Uranium is now known to be plentiful.  It is also known to follow the same rule that applies to pretty much any commodity that is mined.  The higher the price, the more ore deposits there are that can be mined economically.  We are not going to run out of Uranium to mine any time soon.

The construction of many plants of similar design should have driven construction costs down.  But it didn't.  The anti-nuclear people got more and more effective.  They forced regulators to pile on more and more requirements that were supposed to improve safety.  They didn't.  What they did do was to keep pushing construction costs higher and higher.

Three Mile Island, followed by Chernobyl, followed by Fukushima, caused the pressure to only increase.  I have plowed this territory extensively elsewhere so I am not going to go over it again.  Suffice it to say that nuclear power is not now, and will not in the near future be, a substantial contributor to new electric power generation.

Asimov includes a schematic diagram of a "gas cooled" nuclear power plant.  It describes a design that is more sophisticated than the one used in most nuclear power plants operating today.  Instead of being "gas cooled", they are "water cooled".  But, other than the details of the cooling method, so little has changed since that it accurately portrays how most nuclear power plants work to this day.

A Uranium shortage was then a serious concern.  Asimov responded to this concern by noting that "breeder reactors", reactors that can convert the common U-238 isotope of Uranium into Plutonium, effectively multiply the amount of nuclear fuel available by many times.  Only minor design changes need to be made to turn a Uranium fueled design into a Plutonium fueled design.  He also discusses Thorium as a third alternative.

None of this went anywhere after the book was published.  The primary reason was the discovery that there actually was a lot of Uranium around.  Safety and proliferation issues doomed Plutonium.  It turns out to be relatively easy to harvest reactor grade Plutonium and turn it into a bomb.  The risk associated with Plutonium, and other concerns I am going to skip over, means that breeder reactors are only used in military programs designed to create fuel for bombs.

I am not familiar with the reasons Thorium never took off.  I suspect that it too was doomed by cheap and widely available Uranium.  But I don't actually know for sure.  On to "Radioactivity".

Asimov characterizes radioactivity as a new threat.  He justifies this on the basis that naturally occurring radiation is usually of a pretty low intensity.  High intensity radioactivity he associates with new man made activities like Atom Bombs.  He is correct in the sense that "the bomb" made people acutely aware of radioactivity.

Scientists had known about it for about fifty years by then.  But outside of certain scientific circles it was pretty much unknown.  To his credit he does discuss early radiation-induced deaths and illnesses.  Two early victims were Marie Curie and her daughter.  For a while X-Rays were considered completely benign.  But that slowly changed.  Now, of course, safety protocols are routinely followed in places like dentist offices.

Dots of a mixture of Radium and phosphors that would light up were applied to watch dials to make watches easier to read in the dark.  The work was done by women using small brushes.  They would often lick the brushes as they worked.  This resulted in horrible cancers of the face and mouth, and sometimes death.  This practice was outlawed but I don't know whether this happened before or after Asimov's book came out.

Asimov speculated on whether enough radiation would be unleashed to cause widespread harm.  We now know that the answer is no.  But even very small amounts of radiation can be easily measured.  This has allowed scientists to perform some very unusual "tracking" experiments.

Oceanographers have been able to accurately measure the amount of radiocarbon in ocean water.  It spiked during the short period when extensive above ground bomb testing was occurring.  The sharp edge between radiocarbon enhanced water and water containing normal amounts allows them to calculate just how "old" the water was.  That is, how long it's been since the water was at the ocean's surface.

Another interesting development was the discovery of natural nuclear reactors.  Chain reactions depend on the concentration of Uranium being unusually high.  But there are natural events that concentrate Uranium.  And in some cases, these have resulted in chain reactions taking place.  We know this because this situation leaves distinctive isotope profiles behind.

Concentrations never reached the levels necessary to cause a nuclear explosion.  But it never occurred to anyone to think that even a low level chain reaction was possible.  That is, until someone accidentally stumbled across the first one.  Since then, many more have been found.

Asimov quickly moves on to a discussion of the mechanics of radioactive decay.  These are subjects I have already covered elsewhere.  He just hits the highlights.  But a lot was known at the time and far more is now known.  But it is detail.  The main picture is clear and hasn't changed in the sixty years since the book was written.

He discusses the concept of a "decay chain".  This isotope decays into that isotope, which then decays into some other isotope.  He also notes that an isotope may decay in several ways.  But in all cases the probabilities are fixed.

He moves on to "half life", a subject which I have already discussed extensively.  From there, he goes on to note that some kinds of radiation are deadlier than other kinds.  The converse of this, which he doesn't discuss, is that it is easier to create an effective shield against some kinds of radiation than it is to create one against other kinds.
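As an aside, the half life arithmetic is simple enough to sketch in a few lines of Python.  This is my example, not Asimov's; the Strontium-90 half life of roughly 29 years is a published value I am assuming for illustration:

```python
# A minimal half-life calculator using the standard formula:
# remaining fraction = (1/2) ** (elapsed_time / half_life)
def remaining_fraction(elapsed, half_life):
    """Fraction of a radioactive sample left after 'elapsed' time units."""
    return 0.5 ** (elapsed / half_life)

SR90_HALF_LIFE = 29  # years, approximate published value (assumed)

print(remaining_fraction(29, SR90_HALF_LIFE))   # one half life -> 0.5
print(remaining_fraction(58, SR90_HALF_LIFE))   # two half lives -> 0.25
print(round(remaining_fraction(100, SR90_HALF_LIFE), 3))
```

After a century, a little under a tenth of the original Strontium-90 would remain, which is why such isotopes stay dangerous for generations rather than forever.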

He then segues from the fact that everything is radioactive to the subject of "background radiation".  This is another topic I have already treated.  He notes but doesn't go into detail on the idea that background radiation can contribute to evolution.

DNA had just been discovered.  We now know that radiation can damage DNA.  This can result in mutations.  A mutation can be either beneficial or detrimental.  Over time, the beneficial mutations cause species to evolve.  But there are cellular mechanisms for repairing DNA damage, regardless of the cause.  And there are many other ways to cause damage.

Other big causes of mutations are transcription errors, reading errors, and the like.  DNA gets duplicated.  The duplication process is not 100% accurate.  Various processes "read" DNA.  As an example, the cell manufactures thousands of different proteins.

The blueprint describing the specifics of each of the many different proteins a cell manufactures is found in the DNA.  A process that is different from, but related to, the duplication process is used to read the DNA.  But the information found that way is used much differently.

Instead of being used to duplicate the DNA itself, a translation process is used to drive a protein assembly process.  DNA provides the details that determine the order and type of the subunits that snap together to make each specific protein.  An error in this process causes the wrong protein to be made.

It is not a wonder that things go wrong with these cellular processes.  What is a wonder is just how infrequently they do go wrong.  It is thought that cancer is caused by key cellular mechanisms going consistently wrong.  Scientists are attacking cancer by figuring out how to get these broken processes back on track.

Various efforts are now under way to reclassify cancers.  The current methods of classifying cancers depend on the symptoms or what organ is affected.  The new method depends on classifying what cellular mechanism goes wrong and how it goes wrong.  This may lead to a single cure that is successful against many cancers, not just one or a few.

This deeper understanding of DNA, the way radiation damages DNA, and all that follows has taken place since Asimov wrote his book.  So, let's get back to it.

He moves on to the "nuclear waste disposal" problem.  This is also something I have discussed extensively elsewhere.  Before moving on I will note that he assumes that the nuclear power industry will grow rapidly.  He also assumes it will eventually become quite large.  That would have resulted in a large amount of nuclear waste.  But the industry did not ever grow very large.  So, the waste disposal problem is actually quite modest.

And, since he overestimates the size of the problem, he ends up taking off on what now look like tangents.  One of them involves building devices that produce small amounts of power for long periods of time.  They work just fine.  But they have not gone into general use due to the public's fear of radiation.  They have only found one use.

We routinely send space missions to the outer solar system.  These missions need power.  The standard solution is solar panels.  Various Mars rovers, the International Space Station, and all manner of other space gadgets, use solar panels for power very successfully.

But the farther from the sun, the less bright sunlight is.  And that means you need giant arrays of solar panels to produce the necessary power.  Cue the RTG, the Radioisotope Thermoelectric Generator.  It is based on the SNAP device Asimov discusses.

Modern RTGs use Plutonium for fuel.  They are radioactive enough to be dangerous.  So they are often put on the end of a boom that distances them from the bulk of the spacecraft.  RTGs power both Voyager spacecraft, now the two most distant man made objects.  One powers the spacecraft that did the flyby of Pluto.  (BTW, that spacecraft is still working fine.)  Their other successes are too numerous to list.  But this application is the only one where "Isotope Power" is used routinely.

Asimov discusses various other attempts at the peaceful use of radioactive materials.  There has been some successful use of radioactive materials in medicine.  That success continues but it is modest.  The other things he discusses never ended up going anywhere.  The public fear of radioactivity eventually blocked any chance of success they might have otherwise had.

He then returns to how to dispose of radioactive material.  It would be nice if the topic had advanced productively since Asimov's day.  But it hasn't.  The same old options are still on the table.  The same arguments are still advanced against each option.  The fact that radioactivity poses no unusual danger, and the fact that we are talking about a very small volume of material, are both still being ignored.

He then moves on to radioactive fallout and the fact that very tiny amounts could be detected, even back then.  He concludes from this that "it is virtually impossible for any nation to explode a nuclear bomb in the atmosphere without detection".  That truth eventually became self evident.  It led to the "Nuclear Test Ban" treaty, which outlawed above ground testing.

At the time that left a loophole.  Countries could explode bombs in caverns below the ground.  But seismology has grown in sophistication by leaps and bounds since Asimov's time.  It was then possible to detect the underground detonation of a medium or large sized nuclear weapon.  But what about a small one?

In Asimov's time it was thought that such a detonation stood a good chance of going undetected.  But, as I said, seismology has since gotten much better.  It eventually became apparent that even the detonation of a small nuclear weapon would be detected.  There was some nonsense thrown up postulating that there were circumstances under which a detonation could still go undetected.

But the arguments were nonsense and this eventually became apparent.  There is now a treaty banning underground nuclear explosions.  But the U.S. and a number of other countries have signed without ratifying it, and some haven't signed at all.  Most conspicuous among the holdouts is North Korea.  But that hasn't stopped all of their underground nuclear tests from being detected.

No one has succeeded in concealing an underground nuclear test and no one will.  But that doesn't mean that a country won't develop a nuclear weapon and test it.  North Korea did just that.  It just means that, if they do so but try to keep it a secret, everybody will still find out what they did.

Asimov then launches into a long discussion of the isotope Strontium-90.  It is highly radioactive.  It is particularly dangerous because it is readily absorbed into the bones of growing children.  In this situation, it doesn't take a lot to constitute a dangerous amount.

Another highly radioactive isotope is Iodine-131.  It is particularly dangerous because it is taken up and concentrated by the thyroid gland in the neck.  Again, as a result it doesn't take a lot to constitute a dangerous amount.  Asimov does not discuss Iodine-131.

You will typically see a lot of press coverage of Strontium-90 and Iodine-131 whenever there is an event that releases a lot of radioactive material.  These two materials were discussed extensively in conjunction with the Fukushima nuclear disaster, for instance.  Now you know why they rightly attract so much press attention.

And on that cheery note, . . .

Thursday, November 12, 2020

Cars - The State of Play in 2020

 I like to periodically return to subjects to see how things have evolved since I last wrote about them.  The most obvious example is my long running series, "60 Years of Science".  But it is far from the only example.  This post brings together updates that are joined by the fact that they all have to do with cars.  Let me start with self-driving cars.

I last opined on this subject three years ago.  Here's the link:   https://sigma5.blogspot.com/2018/01/robot-cars-now-new-and-improved.html.  For a long time the conventional wisdom was that Autonomous Vehicles, or AVs for short, would arrive in 2020.  Well, have they arrived?  Nope! The "wisdom" referenced in the post consisted of, in part, an article in Science, the premier scientific journal published in the U.S.  (It is #2 in the world behind the British journal, Nature).

A December, 2017 article in Science opined that AVs would appear "somewhere over the rainbow".  Elsewhere in the same article the author described widespread use of AVs as "still decades away".  Ouch!  Then, as now, I find that prediction too pessimistic.  So, where do we stand now that we are at the end of 2020?

Well, then and now there are several companies working on the subject.  Waymo, the Google subsidiary, is generally assumed to be in the best shape.  But, like everybody else, it is still running "demonstration" and "pilot" projects.  One problem is that a car that was part of one of these demonstration/pilot projects managed to kill a lady in Arizona.  It was an Uber test vehicle, not one run by Waymo, but the death cast a pall over the whole industry.

Several people have also died while driving Tesla cars in "Autopilot" mode.  A Tesla car operating in Autopilot mode is not a full up AV.  And the drivers were allowing Autopilot to drive their cars under conditions where they were supposed to be closely monitoring it.  Instead, they adopted a "hands off" attitude, literally.  But these are technicalities that don't influence the thinking of the general public.  The public is extremely concerned about the safety of AVs.

And that has caused everybody to go slow, everybody, that is, except Elon Musk, the CEO of Tesla.  He says that the new iteration of Autopilot will be capable of autonomous operation.  Details are skimpy so nobody knows quite what he means.  The general consensus is that the Tesla Autopilot feature lacks many of the capabilities necessary for true autonomous operation.  So, most people are in "let's see what he actually delivers" mode.

And, there is a great deal of confusion.  The SAE (Society of Automotive Engineers) has defined various "levels" from fully manual to fully autonomous.  Their "level 5" is fully autonomous.  And everybody agrees with the criteria that they have laid out.  That's not the source of the confusion.

The source of confusion is that people expect a level 5 vehicle to be able to operate completely autonomously in all conditions.  They expect it to operate in daylight and at night.  They expect it to operate in good weather conditions and bad.  They expect it to work on city streets, on freeways, on country roads, and even off road.  An AV capable of that level of autonomy truly is decades away.

But the ability to safely and consistently handle all those conditions is not necessary in order for large numbers of AVs to be on the road and operating successfully.  There is general agreement that off road is the hardest to manage.  So, it will be the last to appear.  On the other hand, there is some disagreement as to whether city driving or freeway driving is the easiest to manage.

One's first impression is that freeway driving is the easiest.  And that's true in most conditions.  But there is a famous example of an AV test car being unable to exit a Phoenix freeway.  The other drivers on the road ganged up on the test car.  They repeatedly blocked it from finding a slot it could use to move into the exit lane.

Trouble exiting, or trouble changing lanes when you need to, is a common occurrence.  If there's a lot of traffic it is hard to find an opening without some cooperation from other drivers.  The solution to a lack of cooperation is to "barge" and force an opening.  But that can be dangerous.

With a little jockeying, and possibly some hurt feelings, it can almost always be pulled off.  But it may involve a game of "chicken" and that's not "safe and sane" driving.  It often involves judging the psychology of the other drivers.  You need to pick on someone who will back off rather than remaining assertive.

It is possible to "program" that kind of behavior (all but the psychological part) into the AV system.  But companies don't want to do that.  Besides being risky, it is bad for public relations.  Self-driving cars adhere to speed limits and follow all the other rules of the road.  That makes them far more timid than the average driver.  But it is also the posture that is most reassuring to the public.

I don't know if the AV companies have solved the "Exiting" problem.  The incident I heard about happened a couple of years ago so they have had time to work on it.  The solution may have been as simple as going with unmarked vehicles.

But many autonomous designs feature various pieces of distinctive hardware poking out of the roof of the car.  A car like that is easy to identify whether or not it is marked.  A car with a "taxi" bubble on the roof looks like a taxi regardless of whether it says "TAXI" on the side or not.

So, urban areas may turn out to be easier.  I'm sure the AV companies have gathered reams of data on this subject.  I expect them to first introduce AVs into whatever environment they think will be the easiest to get the cars to work in.

There have been some "fully autonomous" licenses issued by states, etc.  This allows companies to put AVs on the road in some places without having to have a test driver onboard.  But nothing of this sort has been rolled out on a scale large enough to attract the attention of the press yet.  (And it may be that COVID has been a major reason for the delay.)

I truly think that we will see some AVs in operation in limited areas by 2022.  The obvious choice is Uber/Lyft.  These companies know exactly where passengers are departing from and exactly where they want to go.  They can also give customers the option to opt in or opt out of a trip in an AV.  That allows them to keep things tightly controlled.

Uber and Lyft are very interested in AVs as they very much want to eliminate the cost of the driver.  And they have the technology to handle whatever constraints AV operations throw at them.  They can work within a limited geographic area.

They can handle time-of-day or weather constraints.  They can handle passenger consent issues.  And they can handle changes in any or all of these constraints.

My thinking about AVs and Uber/Lyft mirrors almost everybody else's.  But so far, neither Uber nor Lyft has started so much as a pilot project to dispatch AVs for use by the general public in a limited area and under limited conditions.  I'm sticking to my "by 2022" prediction, but I have no special insight into this.

And that's how I expect the AV market to evolve.  It will start out only being used in small, limited ways.  If things go well then AV use will expand into more and different environments.  Eventually, the coverage will be broad enough to encompass a large number of trips but not all trips.  That's good enough.  I do think that it will be a long time before we see off-road AVs.  But that's okay.

Tesla's experience with Autopilot tells us a lot about how fast the psychology of the public can change.  A small set of drivers pushed it far beyond its actual limits.  That got several of them killed.  In spite of this, Autopilot is very popular.  It gets a lot of use, mostly in a responsible way.

But some people still push its use beyond what is safe, in spite of the well documented risks.  With familiarity comes comfort.  And it doesn't take very long.  People quickly came to trust Autopilot, in many cases more than was wise.  Right now, almost no one has any actual experience riding in an AV.  But, once they do, one trip will be enough for most people to become comfortable with the experience.

Next, I want to introduce a closely related subject.  Cruise Control got a lot more capable during the Obama Administration.  In older cars I have owned, all Cruise Control was capable of was maintaining a constant speed as the car went up hill and down dale.  The last car I bought was capable of a lot more.  And the Tesla Autopilot I discussed above is capable of still more.  All this is part of the path to a true AV.

But, as the capability of Cruise Control systems increased by leaps and bounds, lots of people figured out that it would be a good idea for these automated systems to communicate with each other.  Maybe the car in front could see an obstacle your car couldn't see.  And it is useful to know that a close-by car is going to change lanes or speed up or slow down or whatever.

The natural progression is for each car company to independently develop its own system and then expect the other car companies to come to them.  (I saw this play out dozens of times in the computer business.)  Except that companies were never going to adopt standards developed by a competitor.  (It almost never happened in the computer business.)

The solution is to have a "neutral" standard that everybody would use.  That way all suitably equipped cars could talk to other suitably equipped cars regardless of the make.  The specification is called "V2V" (vehicle to vehicle) communication.

The Obama Administration quickly put together an umbrella group to help this along.  The Government wouldn't set the standard.  They would just facilitate the industry coming together to set the standard.  That way it would be an "industry" standard not a "government" standard.  Better for everybody that way.

All this was moving along nicely when we had an Administration change.  As an Obama initiative, the Trump people wanted nothing to do with it.  Also, in spite of the fact that the government wasn't setting the standard, it smelled like "government regulation" and they were averse to that sort of thing.  So they shut it down.

Caught in the same net was the V2X initiative.  The idea was to extend the V2V specification so that it didn't just cover vehicle to vehicle communication.  It would instead be "vehicle to everything".  A smart stoplight would be able to tell oncoming vehicles when the signal was going to change.  Or it could inform the vehicle that a pedestrian had requested a "walk" cycle.

Anything that might improve things could be included.  Weather alerts could be sent out.  Construction areas could signal cars what their extent was.  Warnings about ice on the road could be communicated.  Use your imagination.  This program was also effectively shut down.

We could be a lot further along at this point if standards had been adopted.  I'm sure the initiative will eventually come into being.  But the auto industry manufactures about fifteen million vehicles per year.  None of those vehicles are V2V or V2X capable at this point.  The time when most vehicles include V2V and V2X  capabilities will take just that much longer to arrive.  And the benefit will, therefore, take just that much longer to arrive.  So sad.  So unnecessary.

Okay.  On to my next topic, electric cars.  It turns out that I have never dedicated a blog post to this subject.  I'm pretty sure I have peripherally mentioned the subject.  But, in perusing the titles of all of the posts I have made, I don't see anything that is likely to have addressed the subject in any depth.  So, here goes.

In theory, electric cars are a great idea.  DC electric motors (all you really need to know is that this is the kind used in electric vehicles) are capable of providing 100% torque at 0 RPM.  To translate that into English, torque is a measurement of how hard the motor is pushing in an attempt to get the car to go faster.  0 RPM is the situation where you are stopped and you want to start going.  In short, DC electric motors are great if you love jackrabbit starts.
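To put some numbers on that, here is a back-of-the-envelope Python sketch of how torque at a standstill turns into acceleration.  All the figures are made-up but plausible; they are not the specs of any real car:

```python
# Rough sketch: force at the tire = torque * gear ratio / wheel radius,
# and acceleration = force / mass.  (Ignores traction limits and losses.)

def launch_acceleration(motor_torque_nm, gear_ratio, wheel_radius_m, mass_kg):
    """Acceleration from a standstill, in m/s^2."""
    force_n = motor_torque_nm * gear_ratio / wheel_radius_m
    return force_n / mass_kg

# Hypothetical EV: 400 N*m at 0 RPM, 9:1 reduction, 0.35 m wheels, 2000 kg.
a = launch_acceleration(400, 9.0, 0.35, 2000)
print(f"{a:.1f} m/s^2")  # about 5.1 m/s^2, available the instant you press the pedal
```

The point of the sketch is the "at 0 RPM" part: a gas engine has to rev up before it makes real torque, while the electric motor delivers all of it from a dead stop.  Hence the jackrabbit starts.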

And being able to accelerate quickly (i.e. a jackrabbit start) is what people (mostly guys) use to decide whether a car is "powerful" or not.  Elon Musk made sure that Tesla cars accelerate quickly.  It was a great decision from a marketing point of view.

And there are lots of very powerful electric motors out there.  Diesel train locomotives aren't really "Diesel".  They are actually "Diesel-Electric".  The thing that is turning the wheels is actually an electric motor.  Nuclear Aircraft Carriers, the largest and heaviest moving objects in the world, use electric motors to turn their propellers.  So, if it is possible to build a powerful electric car, what's the problem?

The problem is the battery.  Engineers put wimpy electric motors into cars because a powerful electric motor can quickly drain the battery.  There is a direct tradeoff between a vehicle that feels powerful and responsive and a vehicle that goes a long way between recharges.  (You can try to get the driver to back off but drivers never do.)

If batteries worked well then there would be no problem using powerful electric motors in electric vehicles.  Car makers put powerful gas motors and big gas tanks into cars all the time.  That gives "gas" cars lots of power and lots of range.  Unfortunately, this approach is not possible when it comes to electric cars using current battery technology.

And it's not like we haven't known how to make electric cars until recently.  The "Baker Electric" was a popular car in the early 1900s.  But its top speed was about 12 miles per hour, and it didn't go far between charges.  Why?  Because it used "lead acid" batteries.  This is the type of battery still found in "gas" and diesel cars.  But it is both large and heavy relative to how much energy it can store.

Modern hybrid and all-electric cars use "lithium" batteries.  They are similar to the battery in your mobile electronic device.  They are a big improvement over a lead acid battery.  But they still suck.  Lithium batteries are also very expensive.  The "gas tank" in a car costs a few dollars.  The battery pack in a hybrid or all-electric car costs many thousands of dollars.
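To see why the battery is the problem, compare rough energy densities.  The figures below are round numbers of my choosing, not the specs of any particular cell, but they are in the right ballpark:

```python
# Ballpark energy densities in watt-hours per kilogram; real cells vary widely.
LEAD_ACID_WH_PER_KG = 35
LITHIUM_ION_WH_PER_KG = 200
GASOLINE_WH_PER_KG = 12000  # chemical energy; an engine only recovers a fraction

def pack_mass_kg(capacity_kwh, wh_per_kg):
    """Battery mass needed to store a given capacity."""
    return capacity_kwh * 1000 / wh_per_kg

# A 75 kWh pack, a plausible size for a long-range EV:
print(pack_mass_kg(75, LEAD_ACID_WH_PER_KG))    # ~2140 kg -- heavier than the car itself
print(pack_mass_kg(75, LITHIUM_ION_WH_PER_KG))  # 375 kg -- heavy, but workable
```

That is the whole story of the Baker Electric versus the Tesla in two lines of output.  And it shows how far even lithium still has to go before it rivals a tank of gas.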

The battery pack is much larger than a gas tank.  It is also much heavier than even a full gas tank.  And it can't propel a vehicle nearly as far as a tank full of gas can.  The manufacturers of hybrid and all-electric vehicles are forced to make trade-offs.  And none of their options are good.

A smaller battery pack is cheaper, takes up less space, and weighs less.  The lower weight improves the electric vehicle version of fuel economy.  Performance is not determined by the size of the battery pack.  Instead, the most important factor is the size of the electric motors.  This all sounds good so what's the problem?

The problem is range.  In the same way a small gas tank restricts range, a small battery pack restricts range.  Powerful electric motors also restrict range.  They drain the battery pack more quickly than small motors do.  Of course, small motors translate to wimpy performance.

So vehicle makers go with the smallest battery pack (and often the smallest motors) that they think they can get away with.  Tesla cars come with a relatively large battery pack.  But that makes them expensive.  And, since Musk emphasized performance, if you drive a Tesla in "high performance" mode, it doesn't go very far before it needs a recharge.

You can get a lot of mileage between recharges with a Tesla.  But you have to put the vehicle into "economy" mode.  Then the vehicle delivers anemic performance.  And a Tesla is an expensive vehicle.  If you want to keep the cost down you have to go with a small battery pack.  That guarantees anemic performance.

And that has tilted the market towards hybrids.  These have a small gas engine and a very small battery pack.  The engine is used to recharge the battery pack on the fly.  If both the engine and battery pack are delivering power to the wheels, a hybrid can deliver so-so power.  On the other hand, if the battery pack has run down and only the power from the engine is available, then the car can barely get out of its own way.

But the combination delivers a lot of range.  Buyers have shown a marked preference for the extended range hybrids deliver over reasonably priced all-electric vehicles.  This trade off is forced by the current state of the art in battery technology.

Lithium batteries are far superior to lead-acid batteries.  But what is really needed is a battery that is as superior to a lithium battery as the lithium battery is superior to a lead-acid battery.  Unfortunately, scientists currently have no clue as to how to create such a battery.

What I find surprising is that there is a role that all-electric vehicles can fill right now.  That role is with respect to delivery vehicles and the like.  As noted above, all-electric vehicles have a limited range.  But it is more than sufficient to satisfy the needs of these vehicles.  So no technological or other improvement is needed.  Why we don't see all-electric delivery vehicles all over the place is a mystery to me.

Drivers are concerned, perhaps excessively so, with the battery running down at an inconvenient time.  But delivery vehicles are used for short trips.  They don't rack up that many miles in a single day.  If the place where they are parked at night is equipped with fast chargers (chargers that run at 220 volts rather than the 110 volts that most household plug sockets deliver) then they can be recharged overnight.

As long as they have enough range to handle the number of miles these vehicles put in each day, an electric vehicle should work fine.  Fleet operators know all the statistics concerning how many miles per day their vehicles rack up.  And they rack up a lot of miles in a year so fuel costs are very important.  To me, it sounds like a perfect fit.
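The fleet-operator arithmetic is simple enough to sketch.  All the numbers below are hypothetical, not figures from any actual fleet:

```python
# Can a depot charger replace a day's driving while the van is parked overnight?

def overnight_charge_ok(daily_miles, kwh_per_mile, charger_kw, hours_parked):
    """True if the depot charger can deliver at least a day's worth of energy."""
    energy_needed_kwh = daily_miles * kwh_per_mile
    energy_delivered_kwh = charger_kw * hours_parked
    return energy_delivered_kwh >= energy_needed_kwh

# Hypothetical delivery van: 80 miles/day at 0.8 kWh/mile,
# parked 10 hours at a 7 kW "220 volt" depot charger.
print(overnight_charge_ok(80, 0.8, 7.0, 10))  # True: 70 kWh available vs 64 needed
```

A fleet operator can plug its own mileage statistics into exactly this kind of calculation, which is why the fit seems so obvious to me.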

And there is an Uber/Lyft equivalent operating in this space.  It's Amazon.  Amazon has a large delivery fleet.  It is tasked with getting packages "the last mile" from the fulfillment center to the customer's door.  Amazon is said to be working busily on an all-electric delivery vehicle.  But so far, it's all talk and no actual vehicles on the road.

I find that quite surprising.  Tesla is working on an "18-wheeler" long haul truck.  I find it difficult to believe that is practical given the current state of the art when it comes to batteries.  But it looks completely practical to do a delivery van.  Or, for that matter, any commercial vehicle that "comes home" every night, and that racks up a modest number of miles per day.

There are currently gas stations everywhere.  It takes five to ten minutes to "gas up".  More and more charging stations are being installed.  But all-electric vehicles don't recharge in five or ten minutes.  You are lucky if it can be pulled off in five hours.  That means that a successful all-electric vehicle needs to be based somewhere and not wander too far from base.

New houses now often feature a 220 volt circuit in the garage.  This is easy to do.  The circuitry is the same as that used to support electric stoves, dryers, and water heaters.  A house may come with the natural gas versions of these appliances.  But any commercial electrician knows how to string the necessary wiring.  It is easy to include in a new house.  It is harder, but usually not that much harder, to retrofit such a circuit into an existing house.

Almost all all-electric vehicles are now sold to consumers who can afford to own multiple vehicles.  They can use their all-electric vehicle for their short trips, and most trips are short trips.  When they occasionally need to go a long way they can use one of their other vehicles.

That is not, and will never be, a large part of the consumer vehicle market.  All-electric vehicles need to sell in large numbers if the cost is to be driven down.  Short haul delivery vehicles should enlarge the market substantially.  And that's good.

There are some other market segments that all-electric vehicles should eventually be successful in.  But their success will be limited until the main problem with electric vehicles is solved, the creation of a much better battery.

Finally, I want to talk about supercars.  These have been around for something like twenty years.  They certainly didn't exist when I was a kid.  The situation in the '60s was typical of what came before and what continued for some time after.

When I was a kid the fancy car was the Cadillac.  Sure, a few Hollywood Moguls and the like drove (or were driven in) a Rolls Royce.  But that was more of a fantasy than a reality.  And here's the thing.  A Cadillac didn't cost that much more than a regular car.

My dad bought a Plymouth in the '60s.  The Dodge was a step up from the Plymouth.  The Chrysler was a step up from the Dodge.  The Chrysler was supposed to be comparable to a Cadillac.  But there is only one Cadillac.

The Cadillac was the "top of the line" in its time.  But Cadillacs didn't cost all that much more than the Chevrolet, the "economy" entry in the General Motors product line at the time.

My father's Plymouth cost a little under three thousand dollars.  A Chevrolet of the period cost about two thousand dollars.  But the Cadillac only ran four to five thousand dollars.  The "top of the line" car of the time cost between two and three times what an "economy" car cost.

If we update the numbers to what they now are then an "economy" car runs between twenty and thirty thousand dollars.  Picking the high number would mean that a "top of the line" car now costs between sixty and ninety thousand dollars. And you can drive a brand new Cadillac off the lot for sixty to ninety thousand dollars.  So that part tracks.
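For the record, here is that arithmetic spelled out, using the prices from above:

```python
# 1960s prices from the text: Chevrolet ~$2,000, Cadillac $4,000-$5,000.
chevy_1960s, cadillac_low, cadillac_high = 2000, 4000, 5000
print(cadillac_low / chevy_1960s, cadillac_high / chevy_1960s)  # 2.0 2.5

# The text rounds that spread up to "two to three times" an economy car,
# then applies it to the high end of today's $20k-$30k economy range:
economy_high = 30_000
print(economy_high * 2, economy_high * 3)  # 60000 90000
```

So the "sixty to ninety thousand dollars" figure is just the old ratio applied to today's economy-car prices, and a new Cadillac does indeed land in that band.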

But ninety thousand dollars doesn't even get you in the door when it comes to super-premium cars now.  Those run hundreds of thousands of dollars.  And that doesn't even get you close to the top of the market.  Ultra-premium cars now run between a million and ten million dollars.

And it's now been a long time since Rolls Royce was as expensive as it got.  You can drive the Rolls Royce of your choice off the lot for only a few hundred thousand dollars.  It's still a super-premium car but it's not even close to being an ultra-premium car.

What happened, of course, is that the rich got a whole lot richer and these people needed something to spend their money on.  There are now lots of people who have fleets consisting of dozens of ultra-premium cars.

I don't know what they do with them.  Traffic speeds have stayed the same since the '60s.  You can easily rack up a very expensive speeding ticket in an economy car.  So, there's literally no place to take advantage of what justifies the price of these vehicles.

And they are not "lap of luxury" vehicles.  The "go to" luxury limousine is a stretched version of the Chevrolet Suburban (or any large SUV like the Ford Expedition).  These vehicles feature roomy interiors with lots of headroom.  The interiors can be customized to make them quite luxurious.  But, no matter how tricked out the interior is, you are still only talking hundreds of thousands of dollars.

All the ultra-premium vehicles, the ones that cost a million or more, are set up as sports or performance vehicles.  The interiors may be well appointed.  But they are cramped.  Few feature more than minimal storage space.  But they all look like they go super-fast.  And most of them actually do.

In the late '60s the goal of "hot car" guys was to have a car that could go two hundred miles per hour.  Few of the vehicles of the period, even custom ones like the Shelby Cobra, could touch two hundred even briefly.  And none of the "street legal" ones could sustain two hundred for any period of time.  That has changed.

The "production" versions of many of these cars are street legal and they can easily maintain a sustained speed of two hundred miles per hour.  And when lots of cars from lots of manufacturers can all do the same thing, it's time to up the ante.  This car can do two twenty.  No, that car can do two thirty.  And so on.  The goal recently became three hundred miles per hour.

And Bugatti was the first car maker to pull it off.  They were able to get a "prototype" version of one of their production cars to clock in at just over 300 MPH while conforming to all the conditions necessary to be officially credited with the record.

The car was slightly modified "for safety reasons".  I don't know if Bugatti will even sell a car set up the same way the "300 MPH" car was.  The actual production (i.e. non-prototype) version of the car comes equipped with a "limiter" that makes the car top out at 261 MPH.  And plan on laying out $3 million or more for the "limiter" version.  Who knows what the non-limiter version would cost.

Bugatti said 300 MPH was fast enough.  I don't understand that.  Because 500 KPH is only 311 MPH.  Surely, they could have gotten the car to go a measly 11 MPH faster.  And 500 KPH is a much more satisfying number.  This is especially true since most of the world runs on KPH rather than on MPH.  If it was me, I would have gone for 500 KPH.
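The conversion is easy to check.  A mile is exactly 1.609344 kilometers by international definition:

```python
KM_PER_MILE = 1.609344  # exact, by definition

def kph_to_mph(kph):
    return kph / KM_PER_MILE

def mph_to_kph(mph):
    return mph * KM_PER_MILE

print(round(kph_to_mph(500), 1))  # 310.7 -- so 500 KPH is just shy of 311 MPH
print(round(mph_to_kph(300), 1))  # 482.8 -- Bugatti's 300 MPH goal, in metric
```

So hitting 500 KPH really would have required only about 11 MPH more than the 300 MPH goal.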

And, apparently I'm not the only one that thinks that way.  An outfit I had never heard of before, even though it is located in Washington, the state I live in, decided to go for it.  The company name is SSC, short for Shelby Supercar.  (And this Shelby is apparently no relation to Carroll Shelby of Shelby Cobra fame.)  They make a street legal car called the Tuatara that they thought would go 500 KPH.

They were right.  They too made a run under the appropriate conditions.  Their official speed was 508.73 KPH (316 MPH).  They claimed that the car was unmodified.  It did use "race" fuel, a type of fuel that is commonly found at drag strips and race tracks.  But that was it.

And there have since been claims of "discrepancies".  This has caused the company to promise to redo the run under the most stringent of conditions.  The entire production run is already sold out.  But, if you could order one, your very own Tuatara would only set you back a measly $1.6 million.  A bargain, wouldn't you say?