Monday, April 6, 2020

Financial Panics

Long ago in a Galaxy far, far away . . .  I'm not going to go that far but you know I like my historical context.  I am going to talk about the current economic situation, call it the COVID-19 Panic.  But first, as advertised, some background.

In the Middle Ages most people lived on farms within the domain of the local feudal lord.  There was no money to speak of, and what passed for government was provided by the lord.  He provided for the common defense, maintained the roads, and provided a few other basic services.  In return he levied a tax that consisted of some portion, traditionally a tenth, of whatever the farmer produced.  The important thing was that all economic activity was local.

In the few larger cities and towns, goldsmiths and silversmiths started providing a service to the rich and powerful.  They would store their valuables in a safe or strong room.  Using this shared resource cut down on costs.  And the smarter of them observed that only a few people needed their valuables at any one time.  So they could make a tidy buck on the side by loaning out some of the stored valuables at interest.  This was the beginning of the banking system.

And it eventually occurred to the rich and powerful, and to those they did business with, that there was a better way to do business than handing over the valuables themselves, the plate silver, coins, etc.  Instead, a piece of paper called a "draft" could be used.  The draft promised that "valuables in an amount equal to whatever" would be handed over to the bearer "on demand".

These "demand drafts" were was the foundation of currency.  And this new and improved way of doing business quickly became very popular.  As a result, some "banks" (places where you "banked" your valuables) got so big and successful that they were eventually able to finance the governments of whole countries.

A country like France would issue a certain amount of "coin of the realm".  But for really big transactions, ones involving more than a bag or two of coins, they came to depend on demand drafts drawn against banks run by the Rothschild family, for instance.

The Rothschilds were among the first to go international.  Various family members set up banks in various large cities in various countries.  Money in large amounts could be moved from place to place by using a demand draft drawn on a Rothschild bank at one location.  It would be honored by Rothschild banks operating at other locations, perhaps even in another country.  This became common practice in Europe.

The US ended up with something similar but less well developed.  Various government sponsored "national" banks were created.  But politics did them in.  So, for a big chunk of the history of the US, there were only small banks that confined their operations to relatively small geographic areas.

And for the most part that worked fine.  Most people traveled no great distance.  And the advantages of paper money were so obvious that for a long time it was the custom of each bank to issue its own currency.  It was not only the done thing but it was completely legal at the time.

At this time the Federal government was tiny.  Like the feudal lord of yore it provided only modest services.  It provided for the national defense, maintained some roads, ran a postal service over those roads, and that was pretty much it.  Almost all revenue came from tariffs assessed on imported goods.  The Federal government had little or no daily impact on most people most of the time.

But then railroads and the Civil War came along.  Railroads were the first large business that operated on a national scale in the US.  They needed a banking system that was national in scope and big enough to serve their needs, which dwarfed anything that had come before.  Sadly, there was nothing in place to meet these needs.

The Civil War was an enterprise whose scale dwarfed even the largest railroads.  Soon after the start of the War the Union government was raising and spending unbelievable sums of money.  The Confederate government was smaller but it too was soon operating on a hitherto unheard of scale.

The War cost more than a million dollars a day to prosecute.  That was a sum far beyond the capabilities of the then extant banking system.  As a result, the US government, which had previously confined itself to minting coin, got into the business of printing paper money, called "greenbacks" for the color of ink used then and now.

All this was grafted on top of a wholly inadequate system.  And banks were still issuing currency backed only by their good name and reputation.  Back in the time before the War and before the railroads, this business of each bank issuing its own currency worked, sort of.

Local people knew their local banker.  If they spotted something hinky, the thinking went, they could pull their money out before the bank went under.  Only suckers (people who lived some distance away) would get hurt when that bank's currency became worthless.

But after the Civil War got under way that became impractical.  And soon there were fantastic amounts of money sloshing around as the railroads, then the oil industry, the steel industry, etc. became giants.  As a result, we had a "panic" (that's what they were called) about every ten years.

If a small bank here or a small bank there all of a sudden went under it was hard on the locals but had no wider impact.  But after the War ended various developments like the railroads operating on a national scale knitted banks together.  So you now had the possibility of a serious ripple effect.

One bank going down could cause a run on another bank and the situation could now spiral into an event that had a broad impact.  It didn't happen every time.  But it happened often enough, about once every ten years, on average.

Finally, the Federal Reserve was chartered by the Federal Reserve Act of 1913.  The thing that allowed it to survive the political challenges that had taken down its predecessors was its "economic stability" mission.  It was supposed to stop these very destructive panics from happening.

It was not completely successful.  Banks could still go under and people could still lose their money.  The possibility of a panic was enough to cause people to start a "run on the banks".  Panics happened often enough that people were all too familiar with them.

Bankers of the time had the same problem that their Rothschild era predecessors had.  Everything was fine as long as everybody didn't try to pull their money out at the same time.  (And assuming the banker wasn't an outright crook or completely incompetent.)

A run didn't require everybody to demand their money.  It only took a significant portion of depositors demanding their money, all at the same time.  That was enough to turn a sound bank into an unsound bank.

What finally fixed the problem was the introduction of a government agency called the FDIC in the '30s.  It guaranteed that people would get their money even if the bank went under.  Once the Fed and the FDIC were in place we went a long time without anything that looked like a panic.  People and companies could go about their business without worrying about whether or not the bank they used was in danger of going under.

But this Fed/FDIC solution eventually became a victim of its own success.  The fact that nothing had gone wrong for a long time caused people to forget what a panic looked like.  That amnesia gave political factions the opportunity to weaken the system, so they did.  We went back to a point where panics became possible again.  And with that, I would like to look at some panics, starting with the big one, the Great Depression.

We all know that the Stock Market crashed in 1929.  But at the time the Market was not that big of a thing.  Only a few rich people owned stocks.  And the fact that a company's share price had tanked had little or no effect on the day to day operation of that company.  But important people were harmed by the Crash and they forced the Federal Government to take action.

It was not the crash itself but the response to the crash that did most of the harm.  Under Hoover the Federal Government's response was to do what everybody thought at the time was the right thing to do.  That was to clamp down on the banking system.  Bank reserves must be increased.  Loans must only be made to blue chip clients.  Unlike with the crash itself, these changes had a widespread impact.

Many businesses both large and small needed access to credit.  All of a sudden only the people who didn't need credit could get credit.  That caused businesses large and small to pull back.  They started cutting back on production and laying people off.  That meant business dried up for suppliers who responded by laying people off.  Laid off people (there was no unemployment insurance at the time) were forced to cut back on their expenses so they bought less.  And the spiral continued.

In the meantime, banks had less money to loan because they had to hold more back to meet the higher reserve requirements.  And at the hint of any weakness on the part of any bank, customers started a run on that bank.  If the bank had been sound before, it soon wasn't.  That spawned even more runs as people and businesses got more and more concerned about losing money they couldn't afford to lose by keeping it in a bank that might fail.

The Hoover Administration kept tightening and tightening.  And things kept getting worse and worse.  That ushered in the Roosevelt Administration.  They put in the FDIC and mandated a "bank holiday", temporarily closing all banks.  During the holiday every single bank still in existence was audited.  Some banks were found to be unsound and they were closed.  But most banks were found to be sound.

Once the holiday was over and a bank opened back up its customers were told that it was "backed by the full faith and credit of the Federal Government".  People believed the promise, which was kept then, and is kept now.  Bank runs became a thing of the past.  That didn't fix the economy but it did fix the banking system.

The next significant problem in the banking system took place roughly fifty years later.  It is usually called the S&L Scandal.  But I'm going to call it the S&L Panic.  It turns out that "there are banks and then there are banks".  (Unfortunately, I will need to return to this later.)

The FDIC promise that a "bank" was backed by the full faith and credit of the Federal Government applied only to "Federally chartered commercial banks".  What?  If a bank has the phrase "National Bank" in its name then it's covered by the FDIC.  But there are other kinds of what most people think of as a "bank".  The two biggest groups of these other types of "banks" are "Savings and Loans" and "Mutual Savings Banks".

There are technical differences between the two.  But for our purposes we can ignore these differences and lump them together.  Both were considered to be "community banks".  They dealt with consumers by offering car loans, mortgage loans, and the like.  They also took deposits.  But they couldn't provide standard checking accounts.  Nor could they make loans to businesses.

The limits on the types of activities they could engage in were supposed to make them smaller, less risky operations.  So they didn't require the intrusive business standards and audit requirements that the FDIC imposed on Federally chartered commercial banks.  (It turns out that there are also state chartered banks but I am going to ignore them.)

They were not part of the FDIC insurance and regulatory system.  Instead, they had their own insurance system and regulatory agency.  But neither was anywhere near the industrial strength operation the FDIC was.  The thinking was that they didn't need to be.

Anyhow, the people who ran the S&Ls and MSBs agitated to give these institutions more bank-like capabilities.  In the deregulatory era of the time, they got their wish.  They could provide checking accounts, their ability to make loans was greatly expanded, and so on.  They could have been folded into the FDIC system.  But they liked the looser regulatory environment and lower cost of insurance they were used to.  They managed to keep it.

And the hotdogs that had been behind the change behaved like hotdogs.  And promptly got into trouble.  There were lots of crooks who did crooked things.  There were lots of incompetents who did incompetent things.  As a result lots of these institutions got into lots of trouble.

And the FDIC-lite insurance system they were using wasn't up to the task of covering their losses.  Nor was the FDIC-lite regulatory agency they reported to able to keep them in line.  So naturally they applied to the Federal Government to bail them out.  And they got their wish.

This cost taxpayers tens of billions of dollars.  In the end the remaining institutions were theoretically on a footing as sound as that of the FDIC insured national banks.  The bad news was that it wasn't really true.  The good news is that a lot of people went to jail.  Not enough, but some is better than none.

This "put them on a sound footing" business was true to some extent.  But a lot of it was a sham.  They still had their own FDIC-lite regulatory agency.  It was beefed up but still that wasn't nearly as stringent as the FDIC.  And the insurance requirements were improved but remained FDIC-lite too.  But people pretended that the problem was solved and moved on.  And for a long time nothing happened to contradict this happy picture.

Next in line is what I call the Dot Com Panic of 2001.  I have a low opinion of the movers and shakers on Wall Street.  So, when things go wrong, my first instinct is to blame them.  But this is one of those rare examples where it was not their fault.  The general public did this mostly on its own.  They got little or no help from the usual cast of Wall Street slime balls.

The IBM PC was introduced in 1981.  It was not the first PC but it made a very big splash when it arrived on the scene.  And the fact that it came from IBM, then a giant and well respected company, legitimized the whole thing.

And it turned out that it didn't take much time for smaller, smarter, and more nimble companies to figure out how to cash in.  Microsoft was one of the first.  They got in bed with IBM.  That's usually a recipe for getting swallowed whole.  But Bill Gates outsmarted IBM and emerged on top.

Compaq Computer, then (and unfortunately also now) a company nobody had ever heard of, pioneered the process of legally cloning the IBM PC.  The result was a "compatible" computer that would run all the IBM software but which cost significantly less.

Compaq (and the other clone companies that followed in its footsteps) made a ton of money.  With it now established that "there's money in them there computers" lots of companies started piling in.  And for a long time personal computers (and later the internet) looked like a license to print money.

Both Microsoft and Compaq were financially sound companies.  But then we started seeing companies who said "trust me -- we'll eventually make a ton of money".  The public bought the argument and bought the stock.  So the share price of these stocks went up and up and up.

At first Wall Street steered clear.  They just couldn't figure out how these companies could make enough money to justify their share price.  In many cases, Wall Street couldn't figure out how many of these companies could ever make any money at all.   But the public ignored the advice of Wall Street and bought the stock anyhow.  People made a whole lot of money in a very short amount of time.

Wall Street did eventually climb on the band wagon.  But they never went all in.  And they certainly weren't driving this particular train.  And it turned out that Wall Street was right.  Many of these companies never made any money.  Others made some money but not nearly enough to justify their share price.

And eventually the public caught up with Wall Street and started selling.  And prices went from the stratosphere to the cellar almost instantly.  But by this time these "dot com" companies made up a big chunk of the Market.  So when they went down the market as a whole went down.  People who invested aggressively in these stocks lost 90% of their money.  Conservative investors like myself lost 20-30%.

This shook up Wall Street.  And individuals who had invested heavily lost heavily.  The Dot Com Panic depressed the Market in particular and the economy as a whole in general.  By this time most people had some money in the Market.  Often it was in a company sponsored 401(k).  So the pain was widespread.  But for most people (and for Wall Street) the pain was modest and short lived.  I got all my money back and more within a couple of years.

And that leads me to the "Panic of '08".  This was a typical panic in that Wall Street had its greasy fingers all over it.  In spite of the fact that taxpayers had ended up ponying up tens of billions of dollars as a result of the S&L Panic, the pressure for deregulation continued.  In fact, it probably increased.

In the run up to the Panic of '08 the patchwork of regulatory agencies, audit requirements, reserve requirements, etc. that had built up around "banks" had not been fixed.  Bank-like institutions were allowed to pick their regulator.  Not surprisingly, they tended to pick the one that regulated the least.

And it turns out (as I warned you above) that there is yet another kind of "bank".  Nationally chartered and FDIC insured banks are called "Commercial Banks".  They do business with ordinary people and all kinds of companies.

But there is another kind of bank called an "Investment Bank".  They are children of Wall Street and, the story goes, they only deal with sophisticated customers who have considerable expertise in investments, banking, risk, etc.  So Investment Banks neither need nor want to be subjected to FDIC regulation.  This argument worked.  They were not subject to FDIC oversight.

And for a long time this seemed appropriate.  If an Investment Bank got into trouble then the only people who would suffer a loss were sophisticated Wall Street types.  There was even a law in place to force this.  But Glass-Steagall, as the law was commonly called, was repealed in 1999.  It had said that a bank can be a Commercial Bank or it can be an Investment Bank but it can't be both.

For a long time Investment Banks had been very profitable, far more profitable than Commercial Banks.  But as banking laws changed over time it eventually became possible for Commercial Banks to grow quite large and to diversify into many lines of business.  They saw owning an Investment Bank as a wonderful business opportunity and as the next obvious diversification step.

And by this time some "little" Mutual Savings Banks (the cousin of a Savings and Loan) had gotten quite big.  At one point Washington Mutual, a Mutual Savings Bank that started in my home town and was still headquartered there, was the fourth largest "bank" in the US.  WaMu, as it was called locally, had aggressively shopped for a regulator that was completely unprepared to handle an institution of its size and complexity.  That made it easy to fly under the radar.

And lots of companies were flying under the radar.  They structured their operations so as to avoid all but the most minimal regulatory and audit requirements.  Then they started writing mortgages that were highly unsound.  I'm going to skip over the details.  (If you want more on the subject, it can be found here:  http://sigma5.blogspot.com/2013/04/speculative-bubbles-part-2-of-2.html).  But wait, there's more.

Wall Street thought these unsound mortgages represented a great opportunity.  They packaged them up in such a way that they looked like "AAA" super-safe investments and started peddling them to one and all.  Who wouldn't want a super-safe investment that paid a high rate of return?  So they had no trouble selling all they could manufacture.

Eventually reality caught up with the unsound mortgages.  A lot of regular people lost their homes. And, after a short delay, these so called super-safe investments started going bad.  And they took the people who had invested in them down with them.

And the people who had been front and center on all of this had been the Investment Banks.  So they started going down.  Investment Banks going down was only supposed to hurt sophisticated Wall Street types.  But by now it was all tied together.  So the Commercial Banks were soon in big trouble.

The result was the TARP, the Troubled Asset Relief Program.  TARP bailed out Wall Street and the banking system.  Since we all use the banking system and most of us have at least some money in Wall Street this was good.

But none of the people who lost their homes or were laid off as business took a dive were bailed out.  And nobody went to jail even though there was widespread lawbreaking.  WaMu went from being the fourth largest bank in the country to being the largest US bank ever to go under.

Wall Street came back.  And it didn't take all that long.  I suppose that, given the amount of money the Federal Government poured into them, that's not very surprising.  And, after a long hard slog lasting roughly a decade, Main Street had finally pretty much recovered when COVID-19 showed up.

To be fair, the COVID-19 Panic we are now in is another of those "not Wall Street's fault" panics.  Wall Street is having just as much trouble as everybody else in trying to cope with it.  We've all been forced to become COVID-19 experts.  Social distancing, stay-at-home orders, and closing down "non-essential" businesses strike at the heart of the economy.  And a lot of people are getting very sick.  This too sucks up a tremendous amount of resources.

Given all this, what is the economic outlook?  The short term answer is easy:  It will be devastating.  As I write this, ten million people have filed for unemployment in just two weeks.  A lot of the country, including the part I live in, has been locked down for weeks.  The rest of the country will soon be locked down.  While that's going on the economy will head straight into the toilet.  And it will stay there as long as the lockdown continues.

Sharp economic shocks, if they are of short duration, can produce swift rebounds, a so called "V" curve.  The Pollyannas among us are hoping this is what's going to happen with the COVID-19 Panic.  But from an economic perspective (and from many other perspectives too) this is an unprecedented event.  It is not like any of the other Panics I have discussed.  So they don't really provide much in the way of guidance.

The event it most resembles is one I haven't discussed.  A "Spanish Flu" pandemic swept the world in 1918 and 1919.  Broadly speaking, COVID-19 behaves like the flu did:  both are highly contagious respiratory diseases.  The Spanish Flu got its name because it was first widely reported in Spain.  It didn't start there, but by the time scientists figured that out, the name had stuck.

The Spanish Flu panic is inextricably connected to World War I.  It started just as that War was winding down.  And the devastation and unsanitary conditions that were part and parcel of the War helped get it firmly established.  Like COVID-19, the Spanish Flu was almost impossible to stop once it got established.  Like COVID-19 it swept across the world.  It would pop up here then pop up there.

There are differences in behavior between COVID-19 and the Spanish Flu.  But they are unimportant with respect to our discussion.  A lot of people got sick.  A lot of people died.  Medical infrastructure got overwhelmed.

Of course, back then the medical infrastructure was not as sophisticated as it is now.  For instance, the ventilator had not even been invented yet.  But what they did have was the ability to manufacture cheapo masks in large quantities.  So they did.

There are pictures of people out in the streets of Seattle at the time.  Everybody was wearing masks.  It was the law.  Of course, back then you could get masks in Seattle and pretty much anyplace else in the world.

With all of our modern sophistication, cheapo masks are now a single-source item.  They all now come from China.  China was hit hard and hit first by COVID-19.  This put a dent in their ability to manufacture things.  But they are now in the process of restarting their manufacturing sector.  And they made zillions of cheapo masks for domestic use as part of their strategy for combatting COVID-19.

The problem we in the US have is that there is a trade war going on between us and China.  This makes it hard to import the tens of millions of cheapo masks we would need to make a mandatory "mask while you are in public" order practical.  So such an order has not been issued.

People have stepped in with homemade cloth masks.  They are better than nothing but they are not even as good as the standard cheapo mask.  And it takes a much fancier N-95 mask to provide a serious level of protection.

The Spanish Flu definitely depressed the economy.  How much?  We don't really know.  The end of World War I also depressed the economy.  How much should be attributed to one versus the other is something nobody knows.  But it looks like even if we attribute all of the economic damage to the Spanish Flu and none of it to World War I, the economic impact of the COVID-19 Panic will be much larger.

How much larger?  We don't know.  The US response has been disorganized and confused.  And this applies to both the medical side and the economic side.  There are things, both medical and economic, that can still be done to reduce the damage and to bring the COVID-19 Panic to an end more quickly.  So far this is not being done.  And we don't know when that will change.

People have been studying epidemics and pandemics as a medical problem for a long time.  If you can give the experts a few key numbers they can tell you how severe things will get and how the medical part will play out over time.

We have been measuring these numbers.  The best you can say so far is that results are mixed.  Some places are taking the appropriate actions.  But lots of other places aren't.  It is barely possible to close international borders so we aren't harmed by bad behavior elsewhere.  But that can't be done in the case of state and regional borders.  Besides, judging by current statistics, we are one of the places engaging in bad behavior.

The obvious thing to do is to institute national measures to mandate good behavior.  That hasn't happened.  Unfortunately, it may never happen.  What's stopping us is not medical or economic considerations.  It's political considerations.  And that makes COVID-19 a political problem before anything else.  I wish it were otherwise.

Saturday, March 28, 2020

60 Years of Science - Part 17

This post is the next in a series that dates back several years.  In fact, it's been going on so long that I ended up upgrading the title from "50 Years of Science" to "60 Years of Science".  And, ignoring the change, this is the seventeenth entry in the series.  You can go to sigma5.blogspot.com/2017/04/50-years-of-science-links.html for a post that contains links to all the posts in the series.  I will update that post to include a link to this entry as soon as I have posted it.

I take Isaac Asimov's book "The Intelligent Man's Guide to the Physical Sciences" as my baseline for the state of science when he wrote the book (1959-60).  In this post I am reviewing what he reported, then noting what has changed since.  For this post I will be reviewing two sections:  "Heat" and "Mass to Energy".  Both are from the chapter he titled "The Waves".

As he points out, many discussions of "light" are accompanied by a discussion of "heat".  A candle, for instance, gives off both light and heat.  The quantitative (assigning numbers to things) as opposed to qualitative (generalities such as "it's hot out today") understanding of this subject was completely missing until what he describes as "modern times".  If you can't measure it, you can't study it quantitatively.

The observation that kicked off the change from qualitative study to quantitative study was that warming many materials up causes them to expand.  Galileo kicked things off in 1603 by plunging a tube of heated air into room temperature water.  The water cooled the air, which contracted and drew water up into the tube.  He called the device a "thermometer".  Unfortunately, the height of the water in the tube could be changed not only by changes in room temperature but also by changes in air pressure (a phenomenon not well understood at the time).

In 1654 the Grand Duke of Tuscany came up with a better design.  He sealed the tube.  That fixed the problem caused by changes in air pressure.  He also switched to a liquid.  To magnify the change he placed a large bulb full of the liquid at the bottom of the tube then forced the liquid to expand up a narrow tube.  If that sounds familiar, it's because all thermometers were designed that way until the electronic thermometer took over.  And that change happened long after Asimov's book came out.

The Duke's design was good enough to permit some serious science to happen.  Boyle figured out that human body temperature is (relatively) constant and substantially higher than any ambient temperature people find comfortable.  Amontons led the switch from water to mercury as the liquid in the tube.  "Mercury thermometers" were ubiquitous until a decade or so ago.  They are still pretty easy to find.  But, given that electronic thermometers are now cheap and contain no dangerous mercury, I don't expect that to last much longer.

Fahrenheit added a scale.  Much of the world, including the US, still lives by his scale.  On his scale water freezes at 32 degrees and boils at 212 degrees.  On the Celsius scale (invented by a Swedish astronomer named Celsius in 1742) that is part of the Metric System that the rest of the world uses, these numbers are 0 degrees and 100 degrees.

Originally called Centigrade, in 1948 various tiny technical changes were made and the name was changed to Celsius.  Given the rampant hostility to science that various groups have succeeded in fomenting, it is unlikely that the US will switch from Fahrenheit to Celsius any time soon.

Temperature is a measure of intensity, not quantity.  In 1760 Black started measuring how much heat it took to change the temperature of various materials by a degree.  It turns out that this quantity varies a lot.  A further source of confusion came from the fact that under certain circumstances you can insert heat and the temperature doesn't change.  If you add heat to ice the temperature stops changing when it reaches 0 degrees Celsius (see how much handier the Celsius scale is).  Instead, some of the ice melts.  There's nothing simple about heat.

What really sent the study of heat into high gear was the invention of the steam engine.  The people who bought them didn't just care that they worked.  They also cared how much they cost to run.  If the same job could be done with less fuel (and fewer people to feed the fuel into the engine) then that was a good thing.  But to understand how to make steam engines more efficient scientists had to understand how heat worked.

The first "theory of heat" was that it consisted of something called "caloric".  Various materials contained various amounts of caloric, which could flow from here to there, presumably according to a set of rules.  But no matter what rules they came up with, one obstacle or another inevitably popped up.

Scientists hunted for alternatives and eventually came up with the idea that heat was the manifestation of some kind of vibration. Thompson studied the way cannon barrels were bored, a process that produces tremendous quantities of heat.  He decided that the mechanical friction of hard metal scraping against hard metal was causing some kind of vibration.

Davy then caused two pieces of ice to be rubbed together in a way that produced no caloric and observed that the ice melted.  Caloric couldn't explain the result.  But again mechanical friction could produce some kind of vibration that could.

Several scientists, most notably Carnot, studied how heat flowed.  This later led scientists to crown Carnot as the founder of "thermodynamics", the study of heat and heat flow.  Carnot developed a theory that explained how steam engines worked.  The theory told scientists and engineers what to do to make them more efficient.  It also allowed them to calculate exactly how efficient a steam engine could possibly be made.  No actual steam engine is anywhere near as efficient as theory says it can be.  But they are now way more efficient than early designs were.

Another pioneer was Joule.  He spent 35 years studying how heat behaved in various situations.  He developed Joule's Law:  A given amount of "work" always produces the same amount of heat.  And that meant that heat was just another form of energy.  This led to the idea of "conservation of energy".  Unfortunately for Joule, it was Helmholtz who formally proposed the idea in 1847.  Conservation of Energy means that you can convert energy back and forth from one form to another. But you can neither create nor destroy it.

At roughly the same time it was observed that, with one exception, the conversion of one form of energy to another was never 100% efficient.  Every time you did a conversion you got some heat whether you wanted to or not.  So the only 100% efficient conversion is from any other form of energy into heat.

A study of the opposite, turning heat into other forms of energy, resulted in the introduction of the concept of "Absolute Zero".  (Asimov doesn't talk about it here but that doesn't stop me from talking about it below.)  The process of converting heat into other forms of energy involves two reservoirs; a hot reservoir and a cold reservoir.  Heat can be turned into other forms of energy by taking some of the contents of the hot reservoir and reducing its temperature to that of the cold reservoir.

It turns out that's what a steam engine does.  You heat water and turn it into steam.  That's the hot reservoir.  The general environment is the cold reservoir.  If you process the steam cleverly its temperature is reduced to that of the cold reservoir and energy is available to turn a flywheel.  But the temperature of the cold reservoir imposes a limit on how much of the energy present in the hot steam is available to be converted into the energy of mechanical motion.

The laws of thermodynamics don't let you cool the steam to a temperature below that of the cold reservoir.  But it's worse than that.  It turns out that, as a practical matter, some of the heat goes to warm up the machinery and to other non-productive purposes.  The first law of thermodynamics is waggishly stated as "you can't win".  The second law is waggishly stated as "you can't even break even".  And a similar rendering of the third law yields "you can't even get out of the game".

The part of the theoretically available energy that is actually available is called "free energy".  The part that is inevitably lost eventually became associated with the term "entropy".  Entropy always goes up.  Clausius developed the underlying concept around 1850 and coined the term "entropy" in 1865.

At this point scientists knew in general terms how things worked.  But they had no idea how the underlying mechanism worked.  In 1870 Maxwell and Boltzmann developed the "kinetic theory of gases".  Heat came from the microscopic vibration of each molecule of gas.  It turns out that molecules in a liquid can vibrate.  Even molecules in a solid can (and do) vibrate.  That's where the energy that heat represented was hiding.  There was no caloric fluid.

Vibrating molecules can pass on their vibrations to other molecules.  The energy contained in a certain rate of vibration depends, among other things, on the weight of the vibrating molecule.  The details are complicated.  But the bottom line is that scientists figured out how to make this vibration approach explain all the details of how thermodynamics worked.

The energy involved in melting ice (or turning liquid water into steam) could be explained by the energy necessary to break (thaw, boil) the bonds that make a solid a solid (or a liquid a liquid).  The same was true of the freezing (and condensation) processes.  Making the bonds freed energy.

And in 1870 Gibbs extended the idea to chemical bonds.  Chemical processes could be explained by attributing a certain amount of energy to a chemical bond.  The energy was released when the bond formed and absorbed when the bond was broken.  That brought chemistry into the thermodynamic fold.

That brings us to the section titled "Mass to Energy".  Radioactivity, discovered in 1896, initially presented a challenge.  The energies involved were gigantic.  Where could that much energy come from?  Einstein supplied the answer with his famous "E" equals "M" times the square of the speed of light.  "E" is energy.  "M" is mass.  It turns out that a tiny amount of mass contains a gigantic amount of energy.  The square of the speed of light is just a truly enormous constant number.  But all it does is tell you exactly how much energy you get when you annihilate a tiny amount of mass.

Doing away with a tiny amount of matter is all it takes to produce the enormous amounts of energy we see coming from radioactive decay.  And, of course, if you annihilate a "large" amount of mass, say a pinch of salt's worth, you get enough energy to level a city.  Atomic (fission) and Hydrogen (fusion) bombs are just machines for the annihilation of what would otherwise be considered small amounts of mass.

A side effect of all this was the loss of Lavoisier's "conservation of mass" law.  It was replaced by the "conservation of mass-energy" law.  And Einstein's idea soon transitioned from the theoretical to the practical.

Aston was able to experimentally confirm that Einstein's equation was correct.  He was able to make measurements involving radioactive decay that were delicate enough to measure the mass loss in some situations.  It was the right amount to match the amount of energy that was produced when, of course, you used Einstein's conversion factor.

Before I finish, I want to cover one subject that Asimov didn't.  Scientists were able to do complex sets of experiments and calculations to determine how much energy was released by reducing the temperature of water from, say, a hundred degrees to zero degrees.  (We are using the Celsius scale here, where water boils at 100 and freezes at zero.)  The results behaved as if the "real" temperatures involved were 373 and 273, in a ratio of 373 to 273.  Other experiments produced a similar result if you just added 273 to all the temperatures.

That got these scientists to ask themselves "is there such a thing as the lowest possible temperature, an absolute zero?"  If there was, it appeared to be -273 degrees.  (It's actually 273 and a fraction but I am going to ignore the fraction in order to keep things simple.)  That led to the development of the Kelvin temperature scale.

A Kelvin degree is exactly the same as a Celsius degree.  But 0 degrees Celsius is the same as 273 degrees Kelvin.  100 degrees Celsius becomes 373 degrees Kelvin.  If you convert all temperatures to Kelvin, then one minus the ratio of the cold reservoir's temperature to the hot reservoir's temperature answers the question of how much heat energy can be converted to free energy.

Since the environment tends to be at roughly 300 degrees Kelvin (80 degrees Fahrenheit) you have to operate your hot reservoir at 600 degrees Kelvin to have access to even 50% of what's theoretically available.  That's about 620 degrees Fahrenheit.  This kind of analysis explains why engineers are always trying to increase the "operating temperatures" of things like jet engines.

An operating temperature of 1,100 degrees Celsius, something that the most efficient jet engines can now do, translates to a Kelvin temperature (rounding up a little) of 1,400 degrees.  That means that over 70% of the heat energy can theoretically be turned into free energy.

Cars, whose engines don't run anywhere near that hot, are doomed by thermodynamics to have a very low percentage of the heat energy the fuel produces translated into the free energy that can be used to "make the wheels go round and round".

Bottom line:  Very little has changed in these areas.  These subjects are foundational.  And, for the most part, the foundations in these two areas were laid well before Asimov wrote his book.  Science has since built on these areas.  But for these specific areas, the foundations themselves have seen no changes.  And little of a foundational nature has been added in the interval since the book came out.

Friday, March 20, 2020

Epidemic Explainer

COVID-19 is all anyone is talking about.  There is a lot of misinformation out there.  It is easier to sort the wheat from the chaff if you understand the basics.  And an important aspect of "the basics" is mathematics.  A lot of the people trying to talk about the subject are either poor at mathematics or assume it's something they should stay away from.  I am going to take the opposite tack.

The mathematics of how epidemics evolve is the same mathematics as interest (the kind you are charged on your credit card balance) and radioactivity.  The generic term for it is "exponential growth".  If you understand exponential growth you are a long way toward understanding the process as a whole.

We are used to and intuitively understand additive sequences.   If you start with zero and keep adding two to it the result keeps getting bigger.  This is an example of "linear growth".  If we start with zero and add two every week then after a week our total will be two.  After another week it will be four.  After still another week it will increase to six.  In fact, we can just multiply the number of weeks by two and we will get the result.  After 52 weeks, roughly a year, our result will be 104.

Linear sequences grow slowly even if they continue indefinitely.  If you wait long enough you can reach any number you want.  It's just that if you need to reach a large number it will take a long time.  If we have a disease that infects a thousand people per week then it will take seven million weeks to infect everyone in the world.  That's over a hundred thousand years.  It's also something no one would be concerned with even though a thousand cases per week sounds like a lot.

Now let's change things up.  Let's start with a series whose first two elements are one and one.  But now let's create the next number in the series by adding the two most recent numbers in the sequence together.  You might think that things will go pretty much the same way as the "add two each time" rule.  But they don't.

The third number in the series is two (one plus one).  The fourth number is three (two (one back) plus one (two back)).  This still doesn't sound like it will get out of hand.  But 3+2=5 then 5+3=8 then 8+5=13, then 21, then 34, then 55, and so on.  And that's only the first ten numbers in the sequence.

And this sequence has a name.  It's called the Fibonacci sequence.  It doesn't take that long to start generating enormous numbers.  The fifty-first number in the sequence is 20,365,011,074.  That's more than all the people currently living on earth today.  The Fibonacci sequence is an example of exponential growth.

A fundamental attribute of exponential growth sequences is that they can reach truly gigantic numbers even if the starting value is very small.  There are also exponential "decay" sequences.  A classic example of this is radioactive decay.

Radioactive materials have a characteristic called a "half life".  This is the amount of time it takes for half the original material to "decay" into something else.  Let's say we have a made up element called "Madeup", which is radioactive.  And let's assume its half life is one week.  And let's say we have exactly 33,554,432 atoms of pure Madeup at the beginning of our observation period.

Then exactly one week later we will have 16,777,216 atoms of Madeup.  (We'll also have the same number of atoms of whatever Madeup decays into as a result of the radioactive decay process.)  In two weeks we will have 8,388,608 atoms of Madeup left.  And so on.  In exactly 25 weeks we will be down to only one atom of Madeup.  All the rest will now be something else.  That's because 33,554,432 is exactly two raised to the twenty-fifth power.  It's the reason I picked it as my starting number.  It made the math simple.

So what's this all got to do with COVID-19 and epidemics?  Plenty.  There are a lot of numbers floating around about COVID-19.  At this point it is hard to figure out what the correct numbers are.  But we don't need to know the exact numbers to understand the pattern.  Let me show you what I mean.

There is no exact number for the number of days it takes for COVID-19 to be transmitted from one person to another.  But it seems to be around a week.  If it is more than a week then the disease will spread more slowly.  If it takes less than a week then the disease will spread more quickly.

The same thing is true of the number of people an infected person turns around and infects.  The range most people quote is "two to three people".  Again, if the number is higher then the disease will spread more quickly.  If it is lower then it will spread more slowly.

But suppose we start with one infected person.  And we assume the transmission time is a week.  And we assume that each infected person infects two others, so the number of infected people at least doubles every week.  Well, then thirty-three weeks later everyone on earth will be infected.  The US Census Bureau estimates that the current population of the world is 7.6 billion.  That's less than the 8,589,934,592 we get by doubling one 33 times.

The first COVID-19 infection happened months ago, and nothing like the whole world has been infected since.  So my simple "one week results in two new infections" rule is wrong.  Let's see if we can get more real.

The number of people a specific infected person infects varies from zero to a whole lot.  A completely correct mathematical model would be very complicated.  It would involve statistics and calculus.  But there is a "typical" number of infections that gets you to roughly the same place that the super-precise analysis does.  And the important thing is that the fewer "average number of new infections" we see, the slower the infection spreads.

The same is true of the transmission time.  The actual situation is very complicated.  But a single, simple "average transmission time" will get you to pretty much the same place the complex analysis does.  And again, the bottom line is the same.  If the average transmission time is short then the infection spreads quickly.  If the average transmission time is long then the infection spreads much more slowly.

And that means we can characterize various changes in terms of their impact on the average number of infections and/or the average transmission time.  And, as far as I can tell, nothing much affects the average transmission time.  So the whole game is in reducing the average number of new infections.

A few places have taken drastic measures.  They have tested lots of people (in some cases everyone).  As soon as someone tests positive then they are immediately isolated.  This has the effect of driving the average new infection rate to zero (or a small number close to zero).

If we can get the average new infection rate down to 0.5 (two infected people together only infect one new person, on average) then we are talking radioactive decay mathematics.  If everybody in the world is infected (this example makes no sense in the real world, but stick with me, anyhow) then after 33 weeks we would be down to one new infection per week.  More realistically, if we assume that a million people are currently infected then it would take 20 weeks to drive the worldwide new infection rate to 1 per week.

And things go more quickly if we can do better.  If we assume that we can drive the average new infection rate to 0.1 (only a 10% chance of an infected person passing the disease on to another person) then a million current infections per week goes to one new infection per week in only six weeks.

That's what health officials are trying to do, drive the average new infection rate as low as possible.  If the average new infection rate is greater than one then the number of new cases grows.  If it is less than one, then the number of new cases declines.  It appears that the average new infection rate is currently less than two but well above one.  And, if the best we can do is get it down to just below one, then the disease will linger for a very long time.

It seems to me that an obvious measure to take is to require people to wear face masks while in public.  The standard cheapo facemask does NOT completely stop the disease from spreading.  The virus can pass through the mask.  So an infected person wearing a mask can infect an uninfected person who is also wearing a mask.  But face masks drastically reduce the "viral load".

An infected person throws off a lot of virus particles.  But a single virus particle does not always succeed in transmitting the virus.  In fact, a single particle is almost always unsuccessful.  It's a numbers game.  If a lot of particles are transmitted from the infected person to the uninfected person then the chances of the uninfected person catching the disease goes way up.  Anything that reduces the viral load substantially increases the chances of a transmission failure.

The Chinese found that masks alone were not effective.  But I have to believe that they would help.  And let's say we adopted an "everybody has to wear a mask while in public" rule (and here I'm talking about the cheapo masks that are generally available).  I think it would help.

And let's say that the average person went through ten masks per month.  The world would then need to manufacture eighty billion masks per month indefinitely.  That's doable.  But nobody is talking about doing that, at least not in the US.  So it hasn't happened here.  But, even with the low current rate of use of cheapo masks in the US, there is a severe shortage of them.

And there is a mask that is pretty effective.  It is something called an N-95 mask.  They are more expensive (and harder to use as they must fit tightly to work properly).  At this time the thinking is that their use should be restricted to health care workers and others in high risk categories.  There is a shortage of them too.

The strategy that is being rolled out instead is "social distancing".  The most efficient way to spread the disease is for an infected person to exhale a swarm of virus particles.  They then travel a short distance before being inhaled by an uninfected person.  COVID-19 is a lung-based disease, so getting a lot of fresh virus particles (a large viral load) into an uninfected person's lungs is what will most effectively transmit the disease.

If the uninfected person is six feet or more away from the infected person the chances of any particular virus particle ending up in that person's lungs is much reduced.  Again, it's a numbers game.  The more hoops you make the virus particles jump through, the fewer make it into the lungs of the uninfected person.  Fewer particles (a smaller viral load) means a higher probability of the infection failing to take hold.

The other thing people are doing is cleaning everything.  Virus particles can survive in the open air for a while.  They can also land on a surface and stick there.  Survival time depends on the specifics of the surface.  But a wipe with any kind of disinfectant drives the survival rate to infinitesimal.  As does the UV component of sunlight.  Sunlight hitting free floating virus particles or those lodged on a surface will quickly destroy them.

There is similar thinking behind the "wash your hands", "don't touch your face", and "don't shake hands" admonitions.  The idea is that virus particles can end up on your hands.  Then you transmit them to your face.  Then somehow they get to your lungs from there.

All of these behavioral changes can be helpful.  But I think the amount of help they represent is modest.  I suspect that abandoning all these practices and replacing them with a "mandatory facemask while in public" rule would be more effective.  But then what do I know?  And doing both could help and can't hurt.

A very grave concern of many experts is the effect the disease can have on hospitals and the medical system.  Italy is seeing large numbers of deaths.  Part of the reason is that in a large part of the country the health care system got completely overwhelmed and became unable to care for people.

Like ours, the Italian health care system was sized to handle the normal amount of business, or maybe a little more.  What it got was a whole lot more.  At that point they ran out of everything:  doctors, nurses, medical supplies, intensive care beds, regular care beds, supplies of all kinds, support staff of all kinds, the works.

It is important to react strongly and to react before the crisis hits.  Capacity must be "surged" before it is needed.  By the time it is needed it's too late.  China reacted by building two large hospitals in less than two weeks.  Were they full up hospitals?  No!  But they were capable of handling the patients that weren't acutely sick so that the regular hospitals could focus on the acutely sick.

The US is in the same situation.  We currently have enough hospital capacity to handle the need.  But we have little "surge" capacity.  You need to ramp up in a big way at least three weeks before the wave hits.  Remember, if the case load is doubling each week, you will have eight times as many patients to deal with in three weeks.

Let's say that the number of people with the disease who are sick enough to require hospitalization is currently absorbing a quarter or more of your surge capacity.  If so, if you don't act immediately, you are guaranteed to be in a world of trouble in three weeks time.

People have been studying epidemics for a long time now.  The classic epidemic is the "Spanish Flu" epidemic of 1918-19.  It demonstrated all of the characteristics of the current COVID-19 one.  Oh, the average transmission time and the average new infection rate were different then than they are now.

But, from a mathematical perspective, they are identical.  To model COVID-19 you just plug the new average transmission time and the average new infection rate into the exact same model and the timeline for what is going to happen when pops right out.

And it is important to point out that the actions that are currently being taken are all about "flattening the curve".  We know we can slow the rate at which the epidemic grows.  This is important.  Slower growth gives us more time to prepare.  But we need to use that time to actually prepare.  So far the time has mostly been squandered.

But let's say we get our act together.  Then what?  We currently have no way to prevent the disease and only modest ways to treat it.  Current treatment consists of supporting the body's natural ability to fight off the disease.  Reducing the amount of virus the body has to deal with makes it easier for the body to triumph.  It appears that the younger you are the better the body is at fending this particular disease off.  That's unusual.

Most of the people who have died so far, at least in the US, have been elderly people and people who have compromised immune systems.  I live in King County in Washington State.  It has been the epicenter of the early evolution of the disease in the US.

Today the local paper published a graph that broke out the age of the various people who died in my state.  59% of those who died were 80 or older.  A further 26% were 70-79.  A further 16% were 40-69.  No one under 40 has died.  (You can find the graph in the March 20, 2020 edition of The Seattle Times.)

But that's all beside the point.  Slowing things down does not ultimately reduce the number of people who eventually get the disease.  The only current sure-fire protection is to get the disease and live to tell the tale.  Once you've had the disease, you can't get it again.  Or at least, that's what everybody currently believes.  If we wait long enough and let enough people catch the disease then eventually "herd immunity" will kick in.

If almost everyone, say 90%, has already had the disease then the disease will have a hard time finding an uninfected person.  And that drives the new infection rate close to zero.  The current population of the US is 329 million, according to the US Census bureau.  To get to 90% just under 300 million people would have to get the disease.

Our current medical system is in no way ready or able to cope with that.  If just 10% of people who get infected require hospitalization, that's 30 million people.  If we flatten the curve enough to spread the epidemic over ten years that's still three million hospitalizations per year.  That smaller number is still large enough to overwhelm our current medical system.

Right now, the percentage of infected people expected to need hospitalization is higher than 10%.  But we are still undertesting so it is likely that we are undercounting the number of people who are infected.  So maybe the actual percentage of infected people who need hospitalization is 10% or even less.  It would have to be a whole lot less to get the expected number of hospitalizations down to what our system can handle.

Currently Washington State has 1,376 known infections and 74 known fatalities.  That's a fatality rate of over 5%.  But there are special circumstances that lead many experts to suggest that that number is too high.  The early number out of China was 2%.  The latest number is 1.4%.  Let's say that even that number is too high and the real number is 1%.  That means that to get to the point where 90% of the US population has had the disease we can expect 3 million fatalities.  That's sobering.
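
Pulling the back-of-the-envelope numbers from the last few paragraphs together (the rates are the assumptions from the text, not measured constants):

population = 329e6             # US Census estimate
herd_threshold = 0.90          # assumed herd immunity threshold
hospitalization_rate = 0.10    # assumed; the true value is uncertain
fatality_rate = 0.01           # deliberately optimistic assumption

infections = population * herd_threshold
print(f"infections needed: {infections / 1e6:.0f} million")       # ~296
print(f"hospitalizations: {infections * hospitalization_rate / 1e6:.0f} million")
print(f"fatalities: {infections * fatality_rate / 1e6:.1f} million")

# Washington State's raw numbers at the time:
print(f"WA raw case fatality rate: {74 / 1376:.1%}")              # ~5.4%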

And one hopes that it doesn't come to just having to depend on herd immunity.  The two alternatives to be hoped for are an immunization or an effective treatment (or both).  If people can get a shot (or take a pill, or whatever) and become instantly immune without having to get the actual disease, that would be ideal.  It would mean that we wouldn't have to worry about overwhelming our medical system and a lot of people wouldn't have to die.

But it is important to know that coronaviruses, the group of viruses that COVID-19 is a member of, also contains some of the viruses that cause the common cold.  (The "seasonal" flu is caused by influenza viruses, a different family, but the story there is instructive too.)  Pharmaceutical companies know that an effective cure for the common cold would be a giant gold mine.  So they have vigorously pursued a cure for decades with no luck.

They have had more luck with seasonal flu.  They have been successful at creating a flu shot that protects against specific strains of flu.  But the flu just mutates into something that the flu shot is ineffective against, necessitating the creation of a new and improved flu shot.  So pretty much every year I get the flu even though I always get that year's flu shot. Companies are trying to come up with a "universal" flu shot but have yet to see any success.

There has been more success on the treatment front.  Almost nobody dies of the flu or of a cold.  Very few people need hospitalization.  It looks like it will be easier to come up with something that works on the treatment side.  But expect success to be hard to come by.  That's what dealing with colds and flus tells us.

So the best thing we can do is to pour resources into medical research.  I'm talking doubling (or more) the amount we are currently spending in these areas.  That should speed up the process.  And, while we are looking for something that is effective against COVID-19 we might get lucky and find something effective for dealing with colds and/or the flu along the way.

But it is critical to be realistic about how fast some kind of "fix" can be delivered at scale.  Here are the steps involved.  Step one is for a scientist to find something interesting in the lab.  Step two is to do a "safety" test on humans.  Step three is to do a small scale "effectiveness" test.  Step four is to do a large scale effectiveness test.  Step five is regulatory approval.  Step six is large scale manufacturing.  Step seven is large scale implementation.

In normal situations one or more of these individual steps can easily take a year or more.  And it doesn't pay to skimp.  A drug just failed step three when applied to COVID-19.  Other drugs for other diseases have spectacularly failed step two.  Killing ten million people in an attempt to save one million people is a bad idea.

No one knows how long step one will take.  If a drug that is already approved for use on humans to treat another medical condition turns out to be effective against COVID-19 then some of the above steps can be implemented at warp speed.  But it will likely take something brand new.  And even with a lot of luck and super-fast-tracking everything, the entire process is likely to take 12-18 months at a minimum.

If you want to know why a lot of knowledgeable people are currently shouting PANIC at the top of their lungs, now you know.  And it's not even "just science".  Instead, it's "just math".

Thursday, February 20, 2020

A Proper Argument

Very recently it was vigorously brought to my attention just how far out of the mainstream I am.  I have views on how to properly conduct an argument that are at variance with those of a lot of people.  That is perhaps not surprising.  But it turns out my views are at variance with those of pretty much everybody.  Someone whose thinking I had believed was not wildly different from mine turned out to in fact think very differently.  That was both disappointing and deeply distressing.

I have put more than fifty years of thought and effort into my thinking on the subject of how to determine what's true and what isn't.  I have tried very hard to figure out what works and what doesn't in the context of this pursuit.  I think everybody should value the truth and am disappointed when I come across people who don't.  But it turns out my focus was too narrow.

A lot of my blog posts over the years have been attempts to correct the record.  If there is something floating out there that I think is wrong and others are doing a good job of getting the correct information out there, I leave it to them.  I try to stick with situations where there is either insufficient effort being made to correct the record or where everybody seems to be missing something important.

And I claim no monopoly on the truth.  I screw up all the time.  But I figure that if I have gotten something wrong then the only way someone can set me straight is for me to say what I think.  That way people know where I am off base and, therefore, where I need correction.  I take it as a plus when somebody sets me straight about something I have gotten wrong.

This seems to me to be a reasonable way to approach life.  And I know all about the white lie designed to soothe someone's feelings.  I know that a well placed white lie can smooth out many a social situation.  I am just bad at it.  I know this often holds me back in social situations.  I would dearly love to be much better at it.  I have just never been able to find a way that I can consistently pull off.

But there are social situations and there are social situations.  I try to not be abrasive in purely social situations.  But what about a discourse on the issues of the day?  Is disagreement permitted in these situations?  I would have said the answer to this question would be "yes".  Apparently, I am wrong.

There is a lot of discussion of "echo chambers" and "people talking past one another".  This is universally decried as being a bad thing.  I agree.  But what's the remedy?  Before going into that, at the risk of coming off pedantic, let me restate the problem.  The problem is that disagreement is not allowed.  Beyond that, no one directly engages with the other side's arguments.

The "fix" now becomes obvious.  People should stop engaging in the problematic behavior.  People should be allowed to disagree not only with what the other side says but with what their side says.  Further, both sides should understand and engage with the other side's argument.  And all disagreements should be with the argument, not the person making the argument..

I don't think there is much disagreement with anything I said in the previous paragraph.  (I will go into why there is not across the board agreement below.)  I have now outlined exactly how I proceed.  And I am in trouble for doing so.  Before continuing I am going to make a digression.

Lots and lots of people have outlined roughly the same "fix" as the one I outlined above.  But far too often they add something.  And this is most common when politics is being discussed.  They say some variation on "both sides do it".  This is misleading.

It is true that to some extent both sides do this.  But one side does it a lot more than the other side.  This "both sides do it" argument provides cover for the side that is doing it the most.  They can say "we are only doing it because they are doing it".  I don't think that's true, but as a tactic for getting off the hook, it works great.

Now I could be wrong.  When engaging with this "both sides do it" claim I say "here's why I believe they do it a lot more than we do it".  All you need to do to destroy my argument is to provide evidence that my claim is false.  But nobody ever does that.  Instead they get mad at me.  In other words, they treat me as being on the other side and then they fail to engage with my argument.  I have a blind spot.  I am always surprised when this happens.

I think having the argument is critically important.  So there need to be "rules of engagement" for how to conduct a proper argument.  The rules I try to follow are:
  • Understand your argument and the evidence that goes along with it.
  • Understand the other person's argument and the evidence that goes along with it.
  • Engage with the evidence, the data and analysis.  Do it for both sides' arguments.
  • Do not confuse the argument with the person who is making it.
Stated this way, I think most people would agree.  But I find that often people don't behave that way.  I find the last item critically important.  I never confuse the argument with the person making it.  But this concept is honored in the breach far more often than I thought it was.  I wasn't expecting that.

I very carefully separate the argument from the person making it.  Just because I disagree with an argument someone has made doesn't mean I think they are a bad person or stupid or ignorant.  I just think that in this specific case they have gotten it wrong.

On the other hand, maybe I have gotten it wrong.  If you point out the error of my ways then I am better off for it and that's how I see it.  I am well aware that not everybody operates the way I do.  But I still think it's the best way to operate and I am disappointed when someone who I thought operated that way doesn't.

And I know a big source of my divergence from the norm.  I spent a lot of time interacting with computers.  To state the obvious, computers are not people.  I find that I get along much better with computers than I do with people.  Computers play by rules I am comfortable with.  People often don't.

Computers are good at giving you instant and unambiguous feedback.  I will write and run a computer program.  It will either behave the way I want it to or it won't.  Here's the thing.  Computers don't hold grudges.  If I run a program and it messes up badly the computer, in effect, says "here's the story".  I look at it and try to figure out what I did wrong.  Then I fix it and try again.  The computer doesn't remember what happened last time.  It just notes what happens this time.

I have gone through this "try - fix" cycle so many times I long since lost count.  In each case I soldiered along until the program did what I wanted it to.  And the computer is a neutral arbiter.  It just follows the instructions my program contains and lets me know what happens.  It does not denigrate my looks or ancestry.  It just does what I tell it to.  If I told it right then the right thing happens.  If I told it wrong then the wrong thing happens.  But the computer doesn't even venture an opinion with respect to the right or wrong of what it was told to do.

I flourished in that atmosphere.  I never took it personally when the computer told me I got it wrong.  I just dug in and tried to do better next time.  I was also okay with not receiving praise from the computer when the program worked.  In short, there was an implicit "nothing personal" about how the computer behaved.

So, what's all this have to do with a proper argument?  Just this.  Computers taught me to get comfortable with criticism of my argument/program and to not take it personally.  In the real world, there is the argument and the person that is making the argument.  They are two different things.  Even though most people don't have the computer background I have I thought thoughtful people knew that.  Apparently I got that wrong.  Silly me.

I have no problem separating the person from the argument they are making.  Maybe it's my computer experience.  Maybe I am just wired that way.  But it seems so obvious to me that I rarely bother to say so explicitly.  I think objecting to an argument is NOT objecting to the person making the argument.  But apparently way more people than I thought always see objecting to an argument they are making as some kind of personal attack on them.

It would be nice if this didn't matter but it does.  The Greeks made a distinction between "logic", an effort to determine what is right and true, and "rhetoric", the best tactics to use if you want to win an argument.  Their study of rhetoric focused on what was effective.  But along the way they identified both fair and foul ways to be effective.

If "winning is the only thing" then, by all means, use whatever works.  (These are the people who would not go along with the list of principles I outlined above.)  But we should all be able to identify when someone is using one of those foul means to advance their position.

One of the most common foul means is called the "ad hominem" argument.  "X" and "Y" have a difference of opinion.  "X" says "I'm right because of blah, blah, blah".  "Y" says "X is a bad person so you don't have to pay any attention to what he said".  If a person quickly resorts to ad hominem arguments I assume they are in the wrong unless I see substantial evidence to the contrary.

But, since nothing is ever as simple as I would prefer, sometimes an ad hominem argument is justified.  If a person says "I'm right because I'm an expert and I have studied the situation carefully" but an opponent presents evidence that the person is not an expert and has not studied the situation carefully, then it is appropriate to take the characteristics of the person making the argument into account.  This all assumes, of course, that the opposition provided credible evidence to back their claim up.

Ad hominem attacks are deployed in order to avoid engaging with the meat of a person's argument.  Unsubstantiated or easily disproven ad hominem attacks are the worst.  They should be routinely denounced.  But this almost never happens.  Instead, we are subjected to ad hominem attacks all over the place.  I try my best to make things better, not worse.

The problem is that in the present environment, bad behavior works.  The most generic version of this sort of thing is called "going negative".  When someone runs for public office they should advocate for their positions and qualifications.  If they instead say "my opponent is a bad person", that's going negative.  And this kind of attack is often extended to "my opponent and all of his supporters are bad people".

A couple of generations ago "going negative" was widely derided.  But it worked and it kept working.  It turns out that voters are happy to support a candidate who goes negative.  When it became apparent that going negative was effective everybody started doing it.  I never liked going negative but that is an argument I lost a long time ago.  How long ago?

An early proponent of going negative was Richard Nixon.  He used it successfully to get himself elected to the US House of Representatives.  He later used it to a lesser extent to win a Senate seat and then a slot as Vice President on a winning ticket.  In 1960 he decided to run for President.

He also decided to run a positive campaign.  He was obviously more qualified than his opponent, a relatively inexperienced Senator named Kennedy.  So why not win fair and square?  He lost.  If you look at the debates they engaged in, you will find that their positions were little different.  And conventional wisdom had it that Nixon won the debates if you talked to people who heard them on the radio.  But TV viewers gave the nod to Kennedy.  He looked handsome and confident.  Nixon looked swarthy and untrustworthy.

People didn't decide based on the quality of the candidate.  They decided based on likability and personality.  Nixon also didn't go negative when he ran against a far less well qualified candidate to become Governor of California in '62.  He lost again.  In '68 he went back to his "tricky dick" tactics and won the Presidency.  He won big in his reelection campaign in '72 by using even less savory tactics.

It is hard to fault Nixon for reverting to type.  Playing fair was not a successful strategy for him.  It was the voters who decided what worked and what didn't.  He just decided to go with what worked.  For a while the thought was that Nixon was an outlier.  But then more and more candidates went negative and won as a result.  Voters decided that going negative was okay.  If they had decided otherwise we would now be in a far different place.

I have known for a long time that going negative works when it comes to elections.  But that hasn't meant that I liked it.  And it has not worked when it comes to my vote.  But I do confess to being typical in that I make my decisions based on many factors.  I don't just go with the candidate that is the most honest or the most competent.  I do, however, accord those factors a lot of weight.  But elections are not the only thing we argue about.

This is not my first run at this subject.  Back in 2014 I wrote a blog post called "Faith Based Conflict Resolution".  Of all my posts, it is the one I am most proud of.  Here's a link to it:  http://sigma5.blogspot.com/2014/12/faith-based-conflict-resolution.html.  Looking back at it I find that I was too optimistic.  I just assumed the whole business about separating the argument from the person making the argument was commonly accepted and focused on the mechanics of the argument.  Before moving on, here's the meat of the argument:
Ultimately the only tactic that is effective in this environment [a "faith based" environment] is the power tactic.  And do we really want to decide all conflicts by a test of power?
A little later I partially answered that question.  I pointed out that my preferred approach, the scientific one, frequently leads to embarrassment.  Then I said:
Well, there's the whole "inconvenient" thing.  In the world of science it is frequently true that everybody is wrong.  An outcome where everybody is wrong is the only one that is worse on our egos than an outcome where we are wrong.
I knew that this approach would not appeal to everyone.  After all, some people are more interested in being on the winning side than in getting the facts or the tactics right.  But I truly believed that there were lots of people who shared my "facts first" attitude.

But the whole "how should conflicts be resolved" issue presupposes that it is possible to go about the business of disagreeing without it instantly and inevitably turning personal.  Lots of people are comfortable engaging in ad hominem attacks and with turning every disagreement into something personal.  Apparently more people are comfortable with that than I thought.  That's bad.  I still think it is important to be able to disagree without it getting disagreeable.

So is all lost?  Actually, no.  I take hope from the most unlikely of sources, sports.

People take their sports and their favorite teams very seriously.  And you don't have to look far to find examples of fans getting totally out of control.  But mostly the opposite is true.  Sports bars are everywhere.  And they serve alcohol, which usually makes things worse rather than better.  But things getting out of hand is actually the exception rather than the rule.

On any day in any city you can find lots of sports bars full of rowdy fans.  And many of these bars are populated by heterogeneous groups.  One group consists of fans for one team or athlete.  Another group consists of fans of another team or athlete.  And they are often very vocal when it comes to their opinion.  And large quantities of alcoholic beverages are consumed.  But at the end of the day almost all of these rowdy fans go home peacefully and quietly.

This actually happened to me.  Many years ago my then girlfriend and I visited the "Cheers" bar in Boston.  Locals take the Sox very seriously and there was a game on between the Sox and the Seattle team when we arrived.  When patrons found out that we hailed from the land of the enemy they derided our team and exalted theirs.  But then the Sox lost quite unexpectedly.  Things could have gone south at that point but they didn't.  Instead, all sides were good sports about it.

So what's going on?  I'm not much of a sports fan.  But I do routinely skim the sports section of the paper.  You know what it's full of?  Facts and data.  Sure, there are opinion pieces.  But page after page is full of box scores, statistical breakdowns, and all kinds of detail about teams and players.  And ask the typical fan in the typical sports bar.  They can reel off statistics and figures until your eyes cross.

Sports fans are deeply knowledgeable about their passion.  Couple that with an unambiguous result.  This team or player won or lost.  The score was whatever.  Modern day sports coverage is deeply analytical.  And that means that sports fans are intolerant of BS.  Even the opinion columns have to back up their opinions with facts and data.  Fans get into arguments with other fans all the time.  But "'cause I say so" just doesn't cut it.

And, while a lot of trash talk goes back and forth, no one gets upset by it.  At the end of the day it's mostly "no hard feelings" and "see you at the next game" rather than "I now hate you from the bottom of my heart".  Sports fans, even drunk ones, have mastered the art of separating the argument from the person making the argument.  That makes them role models of a kind we badly need.

And I think the fact that sports and sports coverage are now so data driven is a major contributing factor.  Michael Lewis wrote a book called "Moneyball" way back in 2003.  The book discussed something called "sabermetrics", an effort to replace emotion with data when it came to evaluating baseball players.

Baseball fans will no longer tolerate a team that doesn't adopt a sabermetric approach.  And many other sports have since adopted similar approaches.  Fans now demand no less.  A team that now tries to take a "seat of the pants" approach can count on such a decision being greeted with scorn and derision from their fans and from the press.  So sports and sports fans have adopted a scientific approach to their fandom.

Sports is definitely the better for it.  And sports betting is about to become ubiquitous.  It will soon be easy for a fan to lose a lot of money by betting from the heart rather than from the head.  And this will provide additional inducement for fans to behave responsibly.

Sports is supposed to be less important than politics.  But more people invest more time and effort in sports than they do on politics.  Unfortunately, it shows.  Politics would be better off if it adopted the kind of data driven approach that is now common in sports.  Where's the call for a "sabermetrics of politics"?

And people who are not that into sports need to behave more like sports fans do.  Remember!  You heard it here first.

Friday, February 14, 2020

60 Years of Science - Part 16

This post is the next in a series that dates back several years.  In fact, it's been going on so long that I finally decided to bite the bullet and update the title from "50 Years of Science" to "60 Years of Science".  Same series, just an updated title.  And, ignoring the title change, this is the 16th entry in the series.  You can go to http://sigma5.blogspot.com/2017/04/50-years-of-science-links.html for a post that contains links to all the posts in the series.  I will update that post to include a link to this entry as soon as I have posted it.

I take Isaac Asimov's book "The Intelligent Man's Guide to the Physical Sciences" as my baseline for the state of science when he wrote the book (1959 - 60).  With the new year it is now fully sixty years since the book came out.  In these posts I am reviewing what he reported and then noting what has changed since.  For this post I am moving on to a chapter he called "The Waves".  I will be reviewing two sections:  "Light", and "Relativity".

Light is fundamental.  As Asimov notes, "let there be light" is among the first words in the Bible.  But for a long time the nature of light was a complete mystery.  Two early ideas were that it was emitted by objects and that it was emitted by the eye.

CGI, Computer Generated Imagery, now a staple of the movie making business, was not a thing in Asimov's time.  A single CGI shop now has more computer power than existed in the entire world until some time in the '80s.  But one of the techniques employed by CGI is called "ray tracing".  And one way to do ray tracing is to start with the eye of the viewer and trace light paths back to the virtual objects in the CGI image.  So the latter idea is not as nutty as it now sounds.

Little was known about light.  It traveled in straight lines, hence ray tracing.  When it was reflected, say off of a mirror, the angle of the reflected light was equal but opposite to the angle of the incident light.  The transition between materials, say from air to water, caused light to bend or "refract".  That was pretty much it until Newton came along.

Newton published the results of his experiments on light in a book called Optics.  Unlike Principia, Optics is easily understood by regular people.  The experiments he performed and analyzed are clearly described and elegantly analyzed.  This is the complete opposite of the situation that I found when I dived into Principia.  In Optics, it is easy to follow along with him and nothing he has to say is hard to understand.

Newton investigated light's characteristics by completely covering the windows in a room.  Then he poked a small hole in the covering, thus letting a narrow beam of sunlight enter his now darkened room.  He then placed objects, primarily lenses and prisms, into the beam to see what happened.  Using this simple and easy to understand (and reproduce) approach, he was able to determine many of the properties of light.

Both the lenses and the prisms bent light.  And, in the prism's case, it broke light up into a spectrum of colors.  Water droplets in the air do the same thing.  The result is a rainbow.  Lenses curve light so that it either converges to a point or diverges to a band much wider than the original sunbeam.

Newton proved that sunlight is actually composed of a mix of a whole lot of different colors.  He was even able to break light apart into its component colors and then put the colors back together again.  He did this by first guiding the sunbeam into a prism, which broke the light into colors.  He then guided the output of the first prism into a second prism that had been turned the other way.  This reassembled the colors back into white light.  He also observed that the degree to which a lens bent light depended on the color of the light.

All these and many others (I am just skimming the surface of what he so clearly lays out in Optics) make light sound like it is made up of waves.  Nevertheless, Newton concluded that it was actually composed of tiny particles he called "corpuscles" that traveled at very high speed.  (He decided that refraction was caused by a speed change in light corpuscles as they transitioned from one medium, say air, to another, say water.)  This "corpuscular" idea set off a battle over whether light was made up of particles or waves.  That battle took hundreds of years to resolve.  Moving on, . . .

Huygens was an early proponent of the "wave theory".  Waves have a "wavelength", the distance from one peak to the next.  If various colors of light have different wavelengths then many of the attributes of light can be explained.  Refraction, the bending of light, and the color dependence of refraction (light of different wavelengths is bent more or less, depending on its wavelength) could be explained this way.  But particles don't have a wavelength, or so everybody thought.

But the wave theory of light had problems, which I am not going to go into.  The wave people could knock holes in the particle people's analysis.  And the particle people could knock holes in the wave people's analysis.  Both sides believed that the holes in their theory could somehow be patched up but the holes in the other side's theory were fatal.  So the battle continued until new ideas were introduced.

One experiment that tilted thinking toward the wave theory was the "double slit" experiment pioneered by Young.  Light is passed through two narrow slits.  After that it strikes a screen forming a pattern.  It is easy to do the analogous experiment with waves in a water tank, or with guns and a target.  One shows the pattern expected if light is waves.  The other shows the pattern expected if light is particles.

The "two slit" pattern shouted "waves".  The experiment was easy to do.  So lots of people tried various adjustments.  The variations allowed the computation of the wavelength of various colors of light.  The numbers turned out to be extremely small.

Fresnel was the first to show that if an object was about the same size as the wavelength of light (bacteria turn out to be too big) then a "diffraction" pattern results.  (His ideas also resulted in the creation of "Fresnel lenses".)  So the particle theory of light is dead, right?  Not so fast.  But first a digression (by scientists, not me).

Then there was the question of the speed of light.  Galileo was the first to try to measure it.  Flashing lamps from the tops of hills, even hills that were miles apart, didn't work.  What did work was carefully studying when various moons of Jupiter got eclipsed.

Newton had provided a way to very precisely calculate orbits so the expected eclipse times could be very accurately calculated.  Careful observation by Roemer, looking for moons eclipsing earlier or later than the calculations said they should, provided a number, 192,000 miles per second, that is not far off the true number.

Now that they knew what they were up against others were able to bring things down to earth.  If you shine a light between the teeth of a disk that is spinning very fast you can detect extremely small time differences.  Fizeau did just that in 1849.
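
Fizeau's arithmetic is easy to reconstruct.  The light blinks out when the wheel turns half a tooth spacing in the time the light takes to make the round trip to a distant mirror and back.  The numbers below are in the ballpark of his actual setup, but treat them as illustrative.

# Fizeau's toothed-wheel measurement, reconstructed with approximate
# numbers: a 720-tooth wheel and a mirror about 8.6 km away.
teeth = 720
spin_rate = 12.6           # revolutions per second at first blackout
distance = 8_633           # meters from wheel to mirror

# Time for the wheel to advance half a tooth spacing:
round_trip_time = 1 / (2 * teeth * spin_rate)       # seconds
speed_of_light = 2 * distance / round_trip_time     # meters per second
print(f"c ~ {speed_of_light / 1000:,.0f} km/s")     # ~313,000 km/s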

Foucault introduced some clever modifications that allowed him to come up with a speed of 187,000 miles per second.  His technique was precise enough that he was able to get different results if light traveled through different materials (water versus air, for instance).

Michelson added more improvements and measured the speed of light in a vacuum as 186,282 miles per second.  In Asimov's time "atomic clocks" and "masers" (the predecessor to lasers) were available.  This degree of accuracy permitted light to be used to measure distances.  In Asimov's time this trick could only be used to measure astronomical distances, millions and billions of miles.

Today we can use it to measure "down to earth" distances.  The speed of light is roughly one foot per nanosecond (billionth of a second).  It is now easy to count nanoseconds and, depending on how much money you have and how much effort you want to put into it, much shorter time durations.  So measuring distances of a few feet using light delay is now easy.  That's how GPS works.  And smartphones can easily do GPS.
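
The conversion from a time delay to a distance is one line of arithmetic.  The delay below is a hypothetical measurement, just to show the scale.

C = 299_792_458           # speed of light in a vacuum, meters per second

round_trip_ns = 40        # hypothetical measured delay, in nanoseconds
distance = C * (round_trip_ns * 1e-9) / 2   # halve it for the round trip
print(f"target is ~{distance:.1f} meters away")          # ~6.0 meters

# Sanity check on the "one foot per nanosecond" rule of thumb:
print(f"{C * 1e-9 / 0.3048:.2f} feet per nanosecond")    # ~0.98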

If light is a wave the question becomes what's waving?  Sound waves cause air to move.  What's moving in the case of light?  How about something called the "luminiferous aether"?  (This was often short-handed to "ether".)  Let's say some kind of "ether" permeates everything and light works by vibrating it?  This sounded reasonable so scientists went looking for it.  The stuff turned out to be quite elusive.

But its fundamental property was that light propagated through it.  So it should be possible to detect it by carefully measuring the speed of light in multiple directions.  (You can calculate what direction and speed air is moving in by very accurately measuring the speed of sound in multiple directions.)

The thinking of the time was that the ether was fixed in space and the earth moved through it.  The speed of the earth was tiny when compared to the speed of light.  But Michelson had refined his procedures to the point where it should be detectable.  And everybody knew that the earth moved.

He teamed up with Morley and started making measurements using something called an "interferometer".  The problem is that no matter how hard they looked the speed of light turned out to be the same no matter what direction you measured it in.  If the earth was moving through the ether this was impossible.  Oops.

Newton had developed the idea of a "preferred frame of reference" in Principia.  The idea was that in some sense the universe did not move.  He showed how to translate measurements in one frame of reference to another frame of reference in simple situations.  But he always assumed that there was such a thing as a fixed frame of reference that wasn't moving.  It was very hard to square the Michelson/Morley results with the idea of a fixed, preferred frame of reference.

The ether was supposed to provide the proof that such a frame of reference existed.  But the experiment that was supposed to once and for all demonstrate the existence of the luminiferous aether failed completely.  In Asimov's time the same experiment could be performed with a much higher degree of accuracy.  The results were the same.  We can now do it far more accurately than was possible in Asimov's time.  It still fails.  And that failure led to the subject of Asimov's next section, "Relativity".

The first step in moving beyond what we now call "Classical" or "Newtonian" mechanics was taken in 1889 by Fitzgerald.  He posited that matter "contracted" in the direction of motion.  This process became known as "Fitzgerald contraction".  Mathematically, the idea was a great success.  It used a simple mechanism to exactly match experimental results.  And since measuring instruments contracted right along with everything else, there was no experiment that could detect it.  That was unsettling.

A side effect of Fitzgerald's work was that, if what he was saying was true, then the speed of light in a vacuum was a universal speed limit.  Nothing could go faster.  That was perhaps even more unsettling.  And Lorentz extended Fitzgerald's work by saying the mass of a particle traveling at near the speed of light would increase.  In fact, it would go to infinity if the particle actually reached the speed of light.

This provided a mechanism for enforcing the speed limit.  F=MA, Newton's old formula, was how you "accelerated" particles.  If the Mass went to infinity then the amount of Force necessary to provide that last scintilla of Acceleration would also go to infinity.  Since infinite Force was not available, acceleration all the way to the speed of light was impossible.
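
The whole story is carried by one quantity, the Lorentz factor:  one divided by the square root of one minus v squared over c squared.  It blows up as v approaches c.  A quick sketch:

import math

C = 299_792_458   # speed of light in a vacuum, meters per second

def lorentz_factor(v):
    # 1 / sqrt(1 - v^2/c^2); blows up as v approaches c.
    return 1 / math.sqrt(1 - (v / C) ** 2)

for fraction in (0.5, 0.9, 0.99, 0.9999):
    gamma = lorentz_factor(fraction * C)
    print(f"v = {fraction} c  ->  effective mass multiplied by {gamma:,.1f}")

# 0.5 c    ->  x 1.2
# 0.9 c    ->  x 2.3
# 0.99 c   ->  x 7.1
# 0.9999 c ->  x 70.7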

All this seemed totally nuts at the time.  (It still does.)  But results like these made it harder and harder to argue that what were now called the "Lorentz-Fitzgerald equations" were not only nuts but wrong.  And then there was the annoying fact that all the sensible ideas had been conclusively proved wrong by this point.  It was a good thing that experimental results came to the rescue just when they were needed most.

As noted above, there was no experiment that could detect the Fitzgerald contraction.  However, there were experiments that could be done to detect the absence or presence of the Lorentz effect.  Electrons could be accelerated to very high speed.  And the mass of a fast moving Electron could be measured.  Kaufmann did the experiment around 1900.  The Lorentz effect was real.

The "real world" that Newton had explored looked sensible.  It looked "natural".  This world that scientists were now uncovering looked truly weird and very unnatural.  If what was "natural" was that which conformed to the experience and intuition of people going about their every day lives, then scientists' understanding of how the "natural world" worked was diverging more and more as the twentieth century unfolded.

If the results of the Michelson/Morley experiment had been all that scientists were coping with, that would have been one thing.  Unfortunately for fans of the old understanding of "natural", there was more, much more.  Another problem cropped up almost immediately in what seemed to be an entirely unrelated place.

We are all familiar with the fact that when you heat something up it often glows.  And you can roughly estimate its temperature if you know what the material is and what color the glow is.  For good but obscure reasons to be explained below, scientists call this the "black body problem".

The color/temperature problem can be divided into two parts:  the type of material and the color. We can assign a magic number to the type of material.  If we back this number out of the calculation then the rest of the problem always looks exactly the same.

So scientists picked a mythical "black body" as their name for the "always the same" part.  They then developed tables of magic numbers for specific materials.  They could then back this number out and consult their "black body" calculations for the rest.  That greatly simplified the search for a theory to explain the behavior of their mythical black body.

Black body theory came together quickly after that.  If heat was vibration then they had a formula for translating that vibration into color.  Temperature X should produce color Y.   And it worked, mostly.  But the actual situation was more complex.  Materials did not all vibrate at the same frequency.  Instead there was a frequency distribution.  That resulted in a color distribution.  But there was a reference temperature and a reference color so everything could be tied together.

And the main part worked.  Experiment tied a reference temperature to a reference color just like it was supposed to.  The problem was with the distribution.  It wasn't right.  The details are complex so I am going to skip them.  Instead I am going to cut to the chase.  A man named Max Planck came up with an idea that fixed the distribution problem.  It's just that his solution was one of those "worse than the problem" solutions.

He decided that the energy involved was "quantized".  It was natural to think that energy could be emitted and absorbed in any amount, however small.  Planck said "no".  At any given frequency, energy could only be exchanged in discrete chunks of a fixed size.  If you did the calculations based on this idea then everything came out exactly right.

The problem was that scientists could think of no reason why energy should come only in these discrete chunks.  This was another step away from natural and toward weird.  Trust me, if scientists could have thought up something that worked and was natural, Planck's ideas would have immediately been discarded.  But they couldn't.

Planck's idea was extremely simple.  He said there was a fundamental unit of energy he called a "quantum".  Everything had to be done in exact multiples of a quantum.  It turns out that Planck's quantum is extremely small.  So color or temperature can take on a lot of values.  As a result things look like a smooth or continuous variation is present.

It's only if you look hard that you see that things are actually not smooth.  And the fact that the effect of the quantization of black body radiation is only apparent when you look very closely is why it was not initially apparent.
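
How small?  For light, the energy of one quantum is Planck's constant times the frequency:  E = h times f.  For green light:

H = 6.626e-34        # Planck's constant, joule-seconds
C = 299_792_458      # speed of light, meters per second

wavelength = 550e-9                  # green light, meters
frequency = C / wavelength           # cycles per second
energy_per_quantum = H * frequency   # E = h * f
print(f"one quantum of green light: {energy_per_quantum:.2e} joules")

# ~3.6e-19 joules.  A 10 watt green lamp would emit roughly 3e19
# quanta per second, which is why the graininess is invisible in
# everyday life.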

But that didn't make quantization any less necessary in order for the math to work.  And if it had only been this one small corner of physics that got the quantum treatment then we wouldn't be talking about it.

But this quantum business turned out to be ubiquitous in the world of the small, the world of atoms and subatomic particles.  (That's why the field is now called "quantum mechanics".)  It's now almost impossible to get away from it.  And that means that every part of the world of the small is weird -- really, massively, seriously, weird.

Planck's quantum theory was announced in 1900.  At first it didn't make waves.  Nobody liked it.  Everybody wanted it to go away.  But after Einstein published several papers in 1905 it was too late.  Einstein attacked a couple of seemingly different problems.

One is called the "photoelectric effect".  If you shine a light on the surface of a metal you expect it to kick electrons loose.  That happened but it didn't happen the way people thought it should.  It was another distribution problem.

The details aren't that hard to understand but it would take too long.  So, I am again going to cut to the chase.  Einstein, in one of his 1905 papers, applied "quantum theory" to the problem and out popped a solution that exactly matched the experimental results.  (He later got a Nobel prize for this paper, not for Relativity.)  All of a sudden, this "quantum" business was a lot harder to ignore.

Speaking of Relativity, in another paper published in 1905 he introduced what we now call "Einstein's theory of Special Relativity".  It was actually in the photoelectric effect paper that he introduced the concept of the "Photon".  A Photon sounds like a particle and under some circumstances Photons act like particles.

But a Photon also has a wavelength so Photons act like waves in other circumstances.  In reality, a Photon is neither pure particle nor pure wave.  It has some attributes of either and some attributes of neither.  It's just its own thing.  And, by the way, photons are quantized.

This reformulation of how light worked into this entirely new thing, the photon, allowed Einstein to provide a single coherent explanation for all that was then known about light.  Since everybody -- well, all the scientists working in the area -- had been tearing their hair out because everywhere they looked, they saw problems, it was hard to ignore what Einstein had come up with.

As part of Special Relativity Einstein turned something inside out in an unheard of way.  As I noted above, if you do the Lorentz-Fitzgerald thing you come up with a reason why things can't go faster than the speed of light in a vacuum.  But this seemed true "purely as a practical matter".

Einstein turned this inside out.  He said it was a fundamental characteristic of the universe that nothing could move faster than the speed of light in a vacuum.  From that principle he showed how you could derive the Lorentz-Fitzgerald equations.  They followed from the absolute speed of light limit.  It was not the other way around.

This inversion might have seemed unimportant.  But Einstein used his view of how things worked to show how a bunch of other things followed from it.  One of these things was the effect on time.  Until Einstein everybody assumed that there was something called "absolute time".

Time worked the same everywhere, right?  It might be hard to synchronize clocks in two places but that was just a practical matter.  If you got it right you would see that all clocks in all places could be used to calculate the time and the time would be the same everywhere.

Einstein said the fact that time didn't always flow at the same speed meant that a properly functioning clock didn't always run at the same speed.  He expanded the Lorentz-Fitzgerald equations to include time as well as space and mass.  He then showed how to translate from one frame of reference to another.  There was no such thing as an "absolute frame of reference".

Inherent in the idea of an absolute frame of reference was the idea of absolute time.  But if time could be sped up or slowed down then there was no such thing as absolute time.  And that meant that there was no such thing as an absolute frame of reference.  All frames of reference were always relative.

Einstein's Special Relativity equations showed how to translate between any two frames of reference as long as neither of them was accelerating. In Asimov's time there was still carping in the scientific community about this whole business of time speeding up and slowing down.  It just seemed so unnatural.  There was some evidence that the speed up - slow down was true at that time, but only some.
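
The time part of the expanded equations is the same Lorentz factor applied to clocks:  a clock moving at speed v ticks slower by the factor square root of one minus v squared over c squared.  A sketch, with the speeds picked for illustration:

import math

C = 299_792_458   # meters per second

def moving_clock_rate(v):
    # Seconds ticked by the moving clock per second of stationary time.
    return math.sqrt(1 - (v / C) ** 2)

# Illustrative speeds: a jet airliner and a GPS satellite's orbit.
for name, v in [("airliner", 250.0), ("GPS satellite", 3_874.0)]:
    lag = (1 - moving_clock_rate(v)) * 86_400   # seconds lost per day
    print(f"{name}: falls behind by {lag * 1e9:,.0f} nanoseconds per day")

# airliner:      ~30 nanoseconds per day
# GPS satellite: ~7,200 nanoseconds (7.2 microseconds) per day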

Now we can measure time much more accurately.  This pertains to both long and short periods of time.  As a result we can easily measure time with enough accuracy to confirm that it behaves exactly as Einstein predicted.  The most obvious example is GPS.

GPS satellites include code to adjust for the fact that they move around the earth at a relatively high speed.  Ignoring this "relativistic effect" would quickly cause the GPS system to get the time wrong.  And that would produce easily detectable location errors.
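
The standard back-of-the-envelope version of that correction goes like this.  The satellite's orbital speed slows its clock by about 7 microseconds per day (Special Relativity), while the weaker gravity at its altitude speeds its clock up by about 45 microseconds per day (that part is General Relativity, which is coming up next).  The net is a clock that runs roughly 38 microseconds per day fast.  Multiply that drift by the speed of light and you get kilometers of range error per day.  A sketch with approximate orbital numbers:

import math

C = 299_792_458       # meters per second
G = 6.674e-11         # gravitational constant
M_EARTH = 5.972e24    # kilograms
R_EARTH = 6.371e6     # meters
R_ORBIT = 2.657e7     # meters, approximate GPS orbital radius

v = math.sqrt(G * M_EARTH / R_ORBIT)         # orbital speed, ~3.9 km/s

# Special Relativity: the moving clock runs slow by about v^2 / 2c^2.
sr_loss = v ** 2 / (2 * C ** 2) * 86_400     # seconds per day

# General Relativity: the higher clock runs fast by the difference
# in gravitational potential divided by c^2.
gr_gain = G * M_EARTH / C ** 2 * (1 / R_EARTH - 1 / R_ORBIT) * 86_400

net = gr_gain - sr_loss
print(f"SR: -{sr_loss * 1e6:.1f} us/day, GR: +{gr_gain * 1e6:.1f} us/day")
print(f"net: +{net * 1e6:.1f} us/day -> ~{net * C / 1000:.0f} km of range error per day")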

Moving from the practical to the esoteric, scientists now have access to clocks that are so accurate that raising one of them a single additional foot above the ground is enough to make a measurable change in how fast time flows.  Proof of the veracity of Special Relativity is now unavoidable.

Ten years later in 1915 Einstein came up with General Relativity.  All you need to do Special Relativity is High School Algebra.  That is well within the capabilities of many millions of Americans.  The mathematics of General Relativity is beyond the abilities of all but the most capable mathematicians.  I freely admit it is beyond me.  So we are not going to go there.  But some of the key ideas of General Relativity are easily understood.  They are just super weird.

Remember when Einstein did that inversion and said the constancy of the speed of light was not the effect but the cause?  Well, he did the same thing with Gravity.  Newton said that absent some kind of kick (rocket motor) or drag (friction) things went on at a constant speed in a straight line.  Einstein said that was completely true.  So why do planets like Earth circle the Sun rather than going in a straight line?  Because space is curved in such a way that a "straight line" causes the Earth to orbit the Sun.

This is again one of those things where looking at things this way gets you to the right answer.  But it sounds like a trick or shortcut, rather than how the world really works.  But over time evidence has built up that this actually is the way the world really works.

Special Relativity showed how to translate from one frame of reference to another as long as acceleration was not at play.  General Relativity shows how to translate from one frame of reference to another when acceleration is at play.  Not surprisingly, the math gets hella complicated.

And scientists would have run from General Relativity except that Einstein was able to make predictions.  (He had this fantastic track record but still, the theory was beyond weird and the math was obscenely difficult.)  I am only going to cover two of those proofs.

Newton had shown how to calculate the orbits of planets.  But predictions based on Newton's equations yielded the wrong answer when it came to Mercury.  The difference between prediction and reality was small.  But astronomers had been tracking it for decades as it got larger and larger.  Einstein was able to apply his equations to get the answer that exactly matched observation.

The problem with Mercury's orbit was a well known one.  Maybe he cooked the books knowing the answer that needed to pop out at the end.  But what if he made a prediction about something that no one had imagined was even possible?  He predicted that in a certain situation something would be a certain amount.  If asked, anyone else would have predicted that nothing would happen.  The answer would, in effect, be zero.

Einstein predicted that if a photon from a star passed very close to the Sun on its path to Earth the path would bend by a certain specific amount.  This would cause the star to appear to be out of place for a short period of time.  A star was found and a handy eclipse allowed the confirming observation to be made.

Asimov doesn't even mention Black Holes.  In his time they were considered a quite speculative possible consequence of General Relativity.  But at that time there was no solid evidence that they actually existed.  Gravity Waves were another possible consequence of General Relativity.  But there was no solid evidence for their existence back then either.

A few years later a celestial body called Cygnus X-1 was investigated.  Many astronomers concluded that it contained a black hole at its center.  But for a long time this conclusion was controversial.  But we keep getting better and better at hunting for and finding Black Holes.

We now believe that many, perhaps all, large galaxies contain a supermassive Black Hole at their center.  Our Milky Way contains one that is several million times the mass of our Sun.  Andromeda, a neighboring galaxy, is thought to have one on the order of a hundred million times the mass of our Sun.

And we have recently been able to detect gravity waves.  The first detection involved the merger of two large Black Holes into one.  Since then dozens of Gravity Wave events have been detected.  But there is an even more interesting post-Asimov development in the General Relativity area.

Einstein applied General Relativity to the fate of the universe.  In his time the universe was assumed by most cosmologists (the people who studied this question) to be in a "steady state".  But evidence piled up that it was evolving from a Big Bang (now estimated to have been about 13.8 billion years ago) through several stages to its present state.

Einstein couldn't get a steady state to come out of his equations until he added a "Cosmological Constant".  He later thought this was an idiotic idea.  It soon became clear that the universe was expanding.  (That was one reason Einstein thought that the Cosmological Constant was a bad idea.)

But, if the Cosmological constant is set to "just right", the expansion of the universe will stop, but only after an infinite amount of time.  Another value causes the universe to expand indefinitely.  Still another, causes it to expand for a while, then collapse back to a "Big Crunch".

For a long time it looked like the universe was expanding at that "just right" speed that would cause it to expand forever.  Now it looks like it expanded slowly for a while but is now expanding faster and faster.  All this can be modeled by fiddling with Einstein's much maligned Cosmological "Constant", which may not even be constant.

Needless to say, scientists now take Relativity, both Special and General, as givens and try to expand on them in various ways.